National Library of Energy BETA

Sample records for develop baseline computational

  1. Baseline Glass Development for Combined Fission Products Waste Streams

    SciTech Connect (OSTI)

    Crum, Jarrod V.; Billings, Amanda Y.; Lang, Jesse B.; Marra, James C.; Rodriguez, Carmen P.; Ryan, Joseph V.; Vienna, John D.

    2009-06-29

    Borosilicate glass was selected as the baseline technology for immobilization of the Cs/Sr/Ba/Rb (Cs), lanthanide (Ln), and transition metal fission product (TM) waste streams as part of a cost-benefit analysis study.[1] Vitrification of the combined waste streams has several advantages: it minimizes the number of waste forms, relies on a proven technology, and produces waste forms similar to those currently accepted for repository disposal. A joint study was undertaken by Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to develop acceptable glasses for the combined Cs + Ln + TM waste streams (Option 1) and the combined Cs + Ln waste streams (Option 2) generated by the AFCI UREX+ set of processes. This study aims to develop baseline glasses for both combined waste stream options and to identify key waste components and their impact on waste loading. The elemental compositions of the four-corners study were used along with the available separations data to determine the effect of burnup, decay, and separations variability on estimated waste stream compositions.[2-5] Two different components/scenarios were identified that could limit waste loading of the combined Cs + Ln + TM waste streams, whereas the combined Cs + Ln waste stream has no single component that is perceived to limit waste loading. The combined Cs + Ln waste stream in a glass waste form will most likely be limited by heat due to the high activity of Cs and Sr isotopes.

  2. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    SciTech Connect (OSTI)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A., E-mail: russo@strw.leidenuniv.nl

    2013-12-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed (and, when available, non-peer-reviewed) research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Data System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in "astronomical development" with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.
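    The counting metric described above reduces to grouping article records by country affiliation and publication year for countries under the income cutoff. A minimal sketch is given below; the record fields, GNI lookup table, and helper name are illustrative assumptions, not the authors' actual pipeline.

        from collections import Counter

        def publication_counts(records, gni_2010_usd, gni_threshold=14365):
            """Count articles per (country, year) for countries under the GNI cutoff.

            records: iterable of dicts such as {"country": "Kenya", "year": 1995}
                     (hypothetical field names for ADS-derived records).
            gni_2010_usd: dict mapping country name to 2010 GNI in USD (assumed input).
            """
            counts = Counter()
            for rec in records:
                country, year = rec["country"], rec["year"]
                if gni_2010_usd.get(country, float("inf")) < gni_threshold and 1950 <= year <= 2011:
                    counts[(country, year)] += 1
            return counts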

  3. developing-compute-efficient

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Developing Compute-efficient, Quality Models with LS-PrePost 3 on the TRACC Cluster Oct. ... with an emphasis on applying these capabilities to build computationally efficient models. ...

  4. Development Of Regional Climate Mitigation Baseline For A Dominant Agro-Ecological Zone Of Karnataka, India

    SciTech Connect (OSTI)

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baseline. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baseline. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to regional baseline is 1.02, and that of fallow lands in the project to regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  5. Development of computer graphics

    SciTech Connect (OSTI)

    Nuttall, H.E.

    1989-07-01

    The purpose of this project was to screen and evaluate three graphics packages as to their suitability for displaying concentration contour graphs. The information to be displayed is from computer code simulations describing airborne contaminant transport. The three packages evaluated were MONGO (John Tonry, MIT, Cambridge, MA, 02139), Mathematica (Wolfram Research Inc.), and NCSA Image (National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign). After a preliminary investigation of each package, NCSA Image appeared to be significantly superior for generating the desired concentration contour graphs. Hence, subsequent work, which this report describes, covered the implementation and testing of NCSA Image on both an Apple Mac II and a Sun 4 computer. NCSA Image includes several utilities (Layout, DataScope, HDF, and PalEdit) which were used in this study and installed on Dr. Ted Yamada's Mac II computer. Dr. Yamada provided two sets of air pollution plume data which were displayed using NCSA Image. Both sets were animated into a sequential expanding plume series.

  6. Tools for Closure Project and Contract Management: Development of the Rocky Flats Integrated Closure Project Baseline

    SciTech Connect (OSTI)

    Gelles, C. M.; Sheppard, F. R.

    2002-02-26

    This paper details the development of the Rocky Flats Integrated Closure Project Baseline - an innovative project management effort undertaken to ensure proactive management of the Rocky Flats Closure Contract in support of the Department's goal of achieving the safe closure of the Rocky Flats Environmental Technology Site (RFETS) in December 2006. The accelerated closure of RFETS is one of the most prominent projects within the Department of Energy (DOE) Environmental Management program. As the first major former weapons plant to be remediated and closed, it is a first-of-a-kind effort requiring the resolution of multiple complex technical and institutional challenges. Most significantly, the closure of RFETS is dependent upon the shipment of all special nuclear material and wastes to other DOE sites. The Department is actively working to strengthen project management across programs, and there is increasing external interest in this progress. The development of the Rocky Flats Integrated Closure Project Baseline represents a groundbreaking and cooperative effort to formalize the management of such a complex project across multiple sites and organizations. It is original in both scope and process; however, it provides a useful precedent for the other ongoing project management efforts within the Environmental Management program.

  7. Development of baseline water quality stormwater detention pond model for Chesapeake Bay catchments

    SciTech Connect (OSTI)

    Musico, W.J.; Yoon, J.

    1999-07-01

    An environmental impact assessment is required for every proposed development in the Commonwealth of Virginia to help identify areas of potential concern. The purpose of the Chesapeake Bay Local Assistance Department (CBLAD) Guidance Calculation Procedures is to ensure that development of previously constructed areas does not further exacerbate current problems of stormwater-induced eutrophication and downstream flooding. The methodology is based on post-development conditions that will not generate greater peak flows and will result in a 10% overall reduction of total phosphorus. Currently, several well-known models can develop hydrographs and pollutographs that accurately model the real response of a given watershed to any given rainfall event. However, the conventional method of achieving the desired peak flow reduction and pollutant removal is not a deterministic procedure and is inherently a trial-and-error process. A method of quickly and accurately determining the required size of stormwater easements was developed to evaluate the effectiveness of alternative stormwater collection and treatment systems. In this method, predevelopment conditions were modeled first to estimate the peak flows and subsequent pollutant generation that can be used as a baseline for the post-development plan. Resulting stormwater easement estimates facilitate decision-making processes during the planning and development phase of a project. The design can be optimized for the minimum cost or the smallest possible pond size required for peak flow reduction and detention time, given the most basic data such as the inflow hydrograph and maximum allowable pond depth.
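    As a rough illustration of how such a deterministic sizing procedure can replace trial and error, the sketch below searches for the smallest pond volume whose routed peak outflow no longer exceeds the pre-development peak. The routing function, parameter names, and units are hypothetical placeholders, not part of the reported model.

        def size_detention_pond(route_peak_outflow, predev_peak_cfs, post_hydrograph,
                                max_depth_ft, v_low=0.0, v_high=50.0, tol_acre_ft=0.01):
            """Bisection on pond volume (acre-ft) until the routed peak <= pre-development peak.

            route_peak_outflow(hydrograph, volume_acre_ft, max_depth_ft) is an assumed
            reservoir-routing routine returning the post-development peak outflow in cfs.
            """
            while v_high - v_low > tol_acre_ft:
                v_mid = 0.5 * (v_low + v_high)
                peak = route_peak_outflow(post_hydrograph, volume_acre_ft=v_mid,
                                          max_depth_ft=max_depth_ft)
                if peak > predev_peak_cfs:
                    v_low = v_mid    # pond still too small; peak exceeds the baseline
                else:
                    v_high = v_mid   # requirement met; try a smaller pond
            return v_high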

  8. Baseline data for the residential sector and development of a residential forecasting database

    SciTech Connect (OSTI)

    Hanford, J.W.; Koomey, J.G.; Stewart, L.E.; Lecar, M.E.; Brown, R.E.; Johnson, F.X.; Hwang, R.J.; Price, L.K.

    1994-05-01

    This report describes the Lawrence Berkeley Laboratory (LBL) residential forecasting database. It provides a description of the methodology used to develop the database and describes the data used for heating and cooling end-uses as well as for typical household appliances. This report provides information on end-use unit energy consumption (UEC) values of appliances and equipment, historical and current appliance and equipment market shares, appliance and equipment efficiency and sales trends, cost vs. efficiency data for appliances and equipment, product lifetime estimates, thermal shell characteristics of buildings, heating and cooling loads, shell measure cost data for new and retrofit buildings, baseline housing stocks, forecasts of housing starts, and forecasts of energy prices and other economic drivers. Model inputs and outputs, as well as all other information in the database, are fully documented with the source and an explanation of how they were derived.

  9. High energy neutron Computed Tomography developed

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    High energy neutron Computed Tomography developed LANSCE now has a high-energy neutron imaging capability that can be deployed on WNR flight paths for unclassified and classified objects. May 9, 2014 Neutron tomography horizontal "slice" of a tungsten and polyethylene test object containing tungsten carbide BBs.

  10. Development and Application of a Statistical Methodology to Evaluate the Predictive Accuracy of Building Energy Baseline Models

    SciTech Connect (OSTI)

    Granderson, Jessica; Price, Phillip N

    2014-02-21

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
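    The comparison of model predictions to meter data can be summarized with goodness-of-fit statistics such as those discussed in ASHRAE Guideline 14. The sketch below computes two common ones, NMBE and CV(RMSE), as an indication of the kind of metrics involved; it is not the paper's exact metric set.

        import numpy as np

        def nmbe_percent(measured, predicted):
            """Normalized mean bias error (%) over the prediction period."""
            m, p = np.asarray(measured, float), np.asarray(predicted, float)
            return 100.0 * (m - p).sum() / (m.size * m.mean())

        def cv_rmse_percent(measured, predicted):
            """Coefficient of variation of the root-mean-square error (%)."""
            m, p = np.asarray(measured, float), np.asarray(predicted, float)
            return 100.0 * np.sqrt(np.mean((m - p) ** 2)) / m.mean()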

  11. Development And Implementation Of A Strategic Technical Baseline Approach For Nuclear Decommissioning And Clean Up Programmes In The UK

    SciTech Connect (OSTI)

    Brownridge, M.; Ensor, B.

    2008-07-01

    The NDA mission, as set out within the Energy Act 2004 and stated in the NDA strategy, is clear: 'to deliver a world class programme of safe, cost-effective, accelerated and environmentally responsible decommissioning of the UK's civil nuclear legacy in an open and transparent manner and with due regard to the socio-economic impacts on our communities.' Critical to achieving the NDA main objective and overall mission is to accelerate and deliver clean-up programmes through the application of appropriate and innovative technology. The NDA remit also requires us to secure good practice by contractors and to carry out and promote research into matters relating to the decommissioning and clean up of nuclear installations and sites. NDA have defined a strategic approach for the underpinning of operational and decommissioning activities where each nuclear site is required to write within the Life Time Plans (LTP) the proposed technical baseline for those activities. This enables the robustness of the activities to be assessed, the gaps and opportunities and accompanying Research and Development (R&D) requirements to be highlighted, and investment to be targeted at key technical issues. NDA also supports the development of a commercial framework where innovation is encouraged and improvements can be demonstrated against the technical baseline. In this paper we present NDA's overall strategic approach, the benefits already realised, and the areas for continued development. In conclusion: the development and implementation of a strategic approach to robustly underpin the technical components of the lifetime plans for operational and decommissioning activities on NDA sites has been extremely successful. As well as showing how mature technology assumptions are and where the key gaps and risks lie, it has also provided a method for highlighting opportunities to improve on that baseline. The use of a common template across all NDA LTPs has enabled direct comparison of issues, identification of inter-site dependencies and scheduling hold points, and a platform for sharing success and good practice. This deployment of TRLs and TBURDs across all 20 UK NDA sites is unique. The strategic approach is linked into wider areas of NDA's mission, including the NDA skills strategy and NDA's Direct Research Portfolio. NDA plan to ensure the strategic approach continues to evolve through the following activities: benchmarking the success of the TBURD and consistency of TRL evaluation to develop to a world class model; development of quantitative as well as qualitative methods of measuring the benefits realised; further support for the development of technical interest groups where good practice is rewarded, shared, deployed and the benefit realised is measured; and seeking to reward innovation through the technical baselines while continuing to develop the commercial framework. The progress described to date has summarised the work to develop the approach and indicated how it has been embedded and applied within three LTP cycles. NDA will further develop the TBURD through international benchmarking to create a process that is world class and clearly linked to delivery and innovation in order to reduce cost and schedule. (authors)

  12. Scalable Computational Chemistry: New Developments and Applications

    SciTech Connect (OSTI)

    Yuri Alexeev

    2002-12-31

    The computational part of the thesis is the investigation of titanium(II) chloride as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the reaction to be studied at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic, barrierless reaction. The TiCl{sub 2} catalyst removes a 50 kcal/mol activation energy barrier required for the reaction without the catalyst. The first step is interaction of TiCl{sub 2} with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants. This is the driving force for the entire reaction. Dynamic correlation plays a significant role because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. The author concludes that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, parallelization of different quantum chemistry methods is presented. Parallelization of code is becoming an important aspect of quantum chemistry code development. Two trends contribute to this: the overall desire to study large chemical systems and the desire to employ highly correlated methods, which are usually computationally and memory expensive. In the presented distributed data algorithms, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered. The SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is also often used to avoid MO integral computation. The presented distributed data SCF increases the size of chemical systems that can be calculated using RHF and DFT. An important ab initio method for studying bond formation and breaking, as well as excited molecules, is CASSCF. The presented distributed data CASSCF algorithm can significantly decrease computational time and memory requirements per node. Therefore, large CASSCF computations can be performed. The most time-consuming operation in studying potential energy surfaces of reactions and chemical systems is the Hessian calculation. The distributed data parallelization of CPHF will allow scientists to carry out large analytic Hessian calculations.

  13. Baseline design/economics for advanced Fischer-Tropsch technology

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  14. Development of Computer-Aided Design Tools for Automotive Batteries...

    Broader source: Energy.gov (indexed) [DOE]

    Progress of Computer-Aided Engineering of Batteries (CAEBAT) Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries ...

  15. Computational Tools to Accelerate Commercial Development

    SciTech Connect (OSTI)

    Miller, David C.

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  16. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect (OSTI)

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
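    In outline, the probabilistic modules wrap a deterministic dose calculation in Monte Carlo sampling of the input-parameter distributions. The sketch below shows that general pattern; the parameter names, distributions, and dose_model function are illustrative stand-ins under assumed values, not the actual RESRAD interface or defaults.

        import numpy as np

        def probabilistic_dose(dose_model, n_samples=1000, seed=0):
            """Sample uncertain inputs and propagate them through a deterministic model."""
            rng = np.random.default_rng(seed)
            doses = np.empty(n_samples)
            for i in range(n_samples):
                params = {
                    "soil_density_g_cm3": rng.triangular(1.4, 1.6, 1.9),       # assumed distribution
                    "erosion_rate_m_per_yr": rng.lognormal(mean=-7.0, sigma=0.5),
                    "indoor_time_fraction": rng.uniform(0.45, 0.70),
                }
                doses[i] = dose_model(**params)   # one deterministic run per sampled parameter set
            return {"mean": float(doses.mean()), "p95": float(np.percentile(doses, 95))}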

  17. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  18. NASA technical baseline

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Tag: NASA technical baseline - Curiosity's multi-mission ...

  19. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in experiments, special experimental methods were devised to create similar boundary conditions in the iron films. Preliminary MFM studies conducted on single and polycrystalline iron films with small sub-areas created with focused ion beam have correlated quite well qualitatively with phase-field simulations. However, phase-field model dimensions are still small relative to experiments thus far. We are in the process of increasing the size of the models and decreasing specimen size so both have identical dimensions. Ongoing research is focused on validation of the phase-field model. Validation is being accomplished through comparison with experimentally obtained MFM images (in progress), and planned measurements of major hysteresis loops and first order reversal curves. Extrapolation of simulation sizes to represent a more stochastic bulk-like system will require sampling of various simulations (i.e., with single non-magnetic defect, single magnetic defect, single grain boundary, single dislocation, etc.) with distributions of input parameters. These outputs can then be compared to laboratory magnetic measurements and ultimately to simulate magnetic Barkhausen noise signals.
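    For reference, the Landau-Lifshitz-Gilbert equation underlying these phase-field simulations can be written in the standard Gilbert form (notation assumed here: gamma is the gyromagnetic ratio, alpha the damping parameter, M_s the saturation magnetization, and H_eff the effective field):

        \frac{\partial \mathbf{M}}{\partial t}
          = -\gamma\, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
          + \frac{\alpha}{M_s}\, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}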

  20. Development of Computer-Aided Design Tools for Automotive Batteries |

    Broader source: Energy.gov (indexed) [DOE]

    Department of Energy 9_han_2012_o.pdf More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

  1. NREL Supports Industry to Develop Computer-Aided Engineering...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NREL Supports Industry to Develop Computer-Aided Engineering Tools for Car Batteries July ... tools to help produce the next generation of electric drive vehicle (EDV) batteries. ...

  2. TWRS baseline system description

    SciTech Connect (OSTI)

    Lee, A.K.

    1995-03-28

    This document provides a description of the baseline system conceptualized for remediating the tank waste stored within the Hanford Site. Remediation of the tank waste will be performed by the Tank Waste Remediation System (TWRS). This baseline system description (BSD) document has been prepared to describe the current planning basis for the TWRS for accomplishing the tank waste remediation functions. The BSD document is not intended to prescribe firm program management strategies for implementing the TWRS. The scope of the TWRS Program includes managing existing facilities; developing technology for new systems; building, testing and operating new facilities; and maintaining the system. The TWRS Program will manage the system used for receiving, safely storing, maintaining, treating, and disposing onsite, or packaging for offsite disposal, all tank waste. The scope of the TWRS Program encompasses existing facilities such as waste storage tanks, evaporators, pipelines, and low-level radioactive waste treatment and disposal facilities. It includes support facilities that comprise the total TWRS infrastructure, including upgrades to existing facilities or equipment and the addition of new facilities.

  3. Synthesis and Comparison of Baseline Avian and Bat Use, Raptor Nesting and Mortality Information from Proposed and Existing Wind Developments: Final Report.

    SciTech Connect (OSTI)

    Erickson, Wallace P.

    2002-12-01

    Primarily due to concerns generated from observed raptor mortality at the Altamont Pass (CA) wind plant, one of the first commercial electricity generating wind plants in the U.S., new proposed wind projects both within and outside of California have received a great deal of scrutiny and environmental review. A large amount of baseline and operational monitoring data have been collected at proposed and existing U.S. wind plants. The primary use of the avian baseline data collected at wind developments has been to estimate the overall project impacts (e.g., very low, low, moderate, and high relative mortality) on birds, especially raptors and sensitive species (e.g., state and federally listed species). In a few cases, these data have also been used for guiding placement of turbines within a project boundary. This new information has strengthened our ability to accurately predict and mitigate impacts from new projects. This report should assist various stakeholders in the interpretation and use of this large information source in evaluating new projects. This report also suggests that the level of baseline data (e.g., avian use data) required to adequately assess expected impacts of some projects may be reduced. This report provides an evaluation of the ability to predict direct impacts on avian resources (primarily raptors and waterfowl/waterbirds) using less than an entire year of baseline avian use data (one season, two seasons, etc.). This evaluation is important because pre-construction wildlife surveys can be one of the most time-consuming aspects of permitting wind power projects. For baseline data, this study focuses primarily on standardized avian use data usually collected using point count survey methodology and raptor nest survey data. In addition to avian use and raptor nest survey data, other baseline data are usually collected at a proposed project to further quantify potential impacts. These surveys often include vegetation mapping and state or federal sensitive-status wildlife and plant surveys if there is a likelihood of these species occurring in the vicinity of the project area. This report does not address these types of surveys; however, it is assumed in this document that those surveys are conducted when appropriate to help further quantify potential impacts. The amount and extent of ecological baseline data to collect at a wind project should be determined on a case-by-case basis. The decision should use information gained from this report, recent information from new projects (e.g., Stateline OR/WA), existing project site data from agencies and other knowledgeable groups/individuals, public scoping, and results of vegetation and habitat mapping. Other factors that should also be considered include the likelihood of the presence of sensitive species at the site and expected impacts to those species, project size and project layout.

  4. Hazard baseline documentation

    SciTech Connect (OSTI)

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  5. Hazard Baseline Documentation

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1995-12-04

    This standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and non-radiological hazards for all EM facilities.

  6. Develop baseline computational model for proactive welding stress management to suppress helium induced cracking during weld repair

    Broader source: Energy.gov [DOE]

    There are over 100 nuclear power plants operating in the U.S., which generate approximately 20% of the nation’s electricity. These plants range from 15 to 40 years old. Extending the service lives...

  7. Development of Computer-Aided Design Tools for Automotive Batteries |

    Broader source: Energy.gov (indexed) [DOE]

    Department of Energy 8_hartridge_2012_o.pdf More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries Review of A123s HEV and PHEV USABC Programs

  8. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David; Sahinidis, N.V.; Cozad, A.; Lee, A.; Kim, H.; Morinelly, J.; Eslick, J.; Yuan, Z.

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools support the development of an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  9. The role of customized computational tools in product development.

    SciTech Connect (OSTI)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  10. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1991

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  11. HEV America Baseline Test Sequence

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    HEV America Baseline Test Sequence, Revision 1, September 1, 2006. Prepared by Electric Transportation ... HEV PERFORMANCE TEST ...

  12. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this Report provides a summary overview of DOE’s projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS) which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  13. New DOE Office of Science support for CAMERA to develop computational...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    New DOE Office of Science support for CAMERA to develop computational mathematics for experimental facilities research ...

  14. Baseline LAW Glass Formulation Testing

    SciTech Connect (OSTI)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  15. Accelerating technology development through integrated computation and experimentation

    SciTech Connect (OSTI)

    Shekhawat, Dushyant; Srivastava, Rameshwar

    2013-01-01

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference "Accelerating Technology Development through Integrated Computation and Experimentation", sponsored and organized by the United States Department of Energy's National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO{sub 2}, H{sub 2}, and O{sub 2} production), (2) CO{sub 2} utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H{sub 2} production for fuel cells).

  16. Development of a Very Dense Liquid Cooled Compute Platform

    SciTech Connect (OSTI)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.
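    For context, DCIE (data center infrastructure efficiency) is conventionally defined as the ratio of IT equipment power to total facility power, i.e. the reciprocal of PUE, so the reported value implies that only a few percent of facility power went to cooling and power conversion:

        \mathrm{DCIE} = \frac{P_{\mathrm{IT}}}{P_{\mathrm{facility}}} = \frac{1}{\mathrm{PUE}},
        \qquad \mathrm{DCIE} = 0.93 \;\Rightarrow\; \mathrm{PUE} \approx 1.08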

  17. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1992

    SciTech Connect (OSTI)

    Not Available

    1992-12-31

    Bechtel, with Amoco as the main subcontractor, initiated a study on September 26, 1991, for the US Department of Energy's (DOE's) Pittsburgh Energy Technology Center (PETC) to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology. This 24-month study, with an approved budget of $2.3 million, is being performed under DOE Contract Number AC22-91PC90027. (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  18. Annual Technology Baseline

    Broader source: Energy.gov [DOE]

    The National Renewable Energy Laboratory is conducting a study sponsored by the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs) and a diverse set of potential futures (standard scenarios), initially for electric sector analysis. The primary product of the Annual Technology Baseline (ATB) project includes detailed cost and performance data (both current and projected) for both renewable and conventional technologies. These data are presented in MS Excel.

  19. FED baseline engineering studies report

    SciTech Connect (OSTI)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  20. Development of Computer-Aided Design Tools for Automotive Batteries...

    Broader source: Energy.gov (indexed) [DOE]

    More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) Vehicle ...

  1. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  2. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  3. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  4. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation (PFS) model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use. Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary. Task 1--Baseline Design and Alternatives. Task 4--Process Flowsheet Simulation (PFS) Model, and Project Management and Staffing Report.

  5. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case; and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline and alternative economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; and Task 7, perform project management, technical coordination and other miscellaneous support functions. This report covers work done during the period and consists of four sections: Introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; and project management and staffing report.

  6. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1994

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of the study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks. Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use, and Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary. Task 1--Baseline Design and Alternatives. Task 4--Process Flowsheet Simulation Model. Project Management and Staffing Report.

  7. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1994

    SciTech Connect (OSTI)

    1994-01-01

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case; and develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. During the reporting period, work progressed on Tasks 1, 4, 5, 6 and 7. This report covers work done during the period and consists of six sections: introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use, and project management and staffing report.

  9. Baseline Test Specimen Machining Report

    SciTech Connect (OSTI)

    Mark Carroll

    2009-08-01

    The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as “nuclear grade,” an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et. al, 2007). One of the key components in the evaluation of these graphite types will be mechanical testing on specimens drawn from carefully controlled sections of each billet. To this end, this report will discuss the machining of the first set of test specimens that will be evaluated in this program through tensile, compressive, and flexural testing. Validation that the test specimens have been produced to the tolerances required by the applicable ASTM standards, and to the quality control levels required by this program, will demonstrate the viability of sending graphite to selected suppliers that will provide valuable and certifiable data to future data sets that are integral to the NGNP program and beyond.

  10. Single ion implantation for solid state quantum computer development

    SciTech Connect (OSTI)

    Schenkel, Thomas; Meijers, Jan; Persaud, Arun; McDonald, Joseph W.; Holder, Joseph P.; Schneider, Dieter H.

    2001-12-18

    Several solid state quantum computer schemes are based on the manipulation of electron and nuclear spins of single donor atoms in a solid matrix. The fabrication of qubit arrays requires the placement of individual atoms with nanometer precision and high efficiency. In this article we describe first results from low dose, low energy implantations and our development of a low energy (<10 keV), single ion implantation scheme for 31P(q+) ions. When 31P(q+) ions impinge on a wafer surface, their potential energy (9.3 keV for P(15+)) is released, and about 20 secondary electrons are emitted. The emission of multiple secondary electrons allows detection of each ion impact with 100% efficiency. The beam spot on target is controlled by beam focusing and collimation. Exactly one ion is implanted into a selected area avoiding a Poissonian distribution of implanted ions.
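
The advantage of counting a secondary-electron burst for every impact can be seen from Poisson statistics: without single-ion detection, a dose tuned to an average of one ion per site still leaves most sites with zero or multiple ions. The short sketch below computes those probabilities; it is a generic statistics illustration, not the authors' detection scheme.

```python
import math

def poisson_pmf(k, mean):
    """Probability of exactly k events for a Poisson process with the given mean."""
    return math.exp(-mean) * mean**k / math.factorial(k)

# With a blind implant tuned to a mean dose of 1 ion per qubit site, only about
# 37% of sites receive exactly one ion; the rest get none or several.
mean_dose = 1.0
p0 = poisson_pmf(0, mean_dose)
p1 = poisson_pmf(1, mean_dose)
print(f"P(exactly 1 ion) = {p1:.3f}")            # ~0.368
print(f"P(no ion)        = {p0:.3f}")            # ~0.368
print(f"P(2 or more)     = {1 - p0 - p1:.3f}")   # ~0.264
```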

  11. Pinellas Plant Environmental Baseline Report

    SciTech Connect (OSTI)

    Not Available

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  12. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1992

    SciTech Connect (OSTI)

    1992-12-31

    The objectives of this study are to: Develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; Task 7, perform project management, technical coordination and other miscellaneous support functions. During the reporting period work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of five sections: Introduction and summary; preliminary design for syngas production (Task 1); preliminary F-T reaction loop design (Task 1); development of a process simulation model (Task 4); and key personnel staffing report (Task 7).

  13. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  14. Energy Intensity Baselining and Tracking Guidance | Department...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Technical Assistance Better Plants Energy Intensity Baselining and Tracking Guidance Energy Intensity Baselining and Tracking Guidance The Energy Intensity Baselining and ...

  15. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  16. Hanford Site technical baseline database

    SciTech Connect (OSTI)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  17. Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios...

    Open Energy Info (EERE)

    Name: Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios: Learning from Experiences in Developing Countries

  18. U.S. Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2008-09-12

    The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

  19. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing: Providing world-class high performance computing capability that enables unsurpassed solutions to complex problems of strategic national interest. Los Alamos National Laboratory sits on top of a once-remote mesa in northern New Mexico with the Jemez mountains as a backdrop to research and innovation covering multi-disciplines from bioscience, sustainable

  20. Baseline Graphite Characterization: First Billet

    SciTech Connect (OSTI)

    Mark C. Carroll; Joe Lords; David Rohrbaugh

    2010-09-01

    The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments that are designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, a limited amount of data can be generated based upon the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that will characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. The thorough understanding of this variability will provide added detail to the irradiated property data, and provide a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the development of the Baseline Graphite Characterization program from a testing and data collection standpoint through the completion of characterization on the first billet of nuclear-grade graphite. This data set is the starting point for all future evaluations and comparisons of material properties.
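
As a hedged illustration of the kind of statistical reduction such a characterization database supports, the sketch below groups hypothetical strength measurements by billet and orientation and reports means, coefficients of variation, and billet-to-billet spread. The property, data values, and grouping scheme are invented for illustration and are not taken from the report.

```python
import statistics as stats

# Hypothetical measurements (arbitrary strength units) keyed by billet and
# orientation (P = parallel, T = transverse to the billet long axis).
measurements = {
    ("billet_1", "P"): [21.3, 22.1, 20.8, 21.7],
    ("billet_1", "T"): [19.9, 20.4, 19.5, 20.1],
    ("billet_2", "P"): [22.0, 21.6, 22.4, 21.9],
    ("billet_2", "T"): [20.6, 20.2, 19.8, 20.9],
}

# Within-group statistics: mean and coefficient of variation
for (billet, orientation), values in sorted(measurements.items()):
    mean = stats.mean(values)
    cov = stats.stdev(values) / mean * 100.0
    print(f"{billet} [{orientation}]: mean = {mean:.1f}, CoV = {cov:.1f}%")

# Billet-to-billet spread for one orientation
p_means = [stats.mean(v) for (b, o), v in measurements.items() if o == "P"]
print(f"Billet-to-billet spread (P): {max(p_means) - min(p_means):.2f}")
```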

  1. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  2. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.

  3. U.S Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-09-23

    This guide identifies key Performance Baseline (PB) elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development.

  4. Mid-Atlantic Baseline Studies Project | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Mid-Atlantic Baseline Studies Project Mid-Atlantic Baseline Studies Project Funded by the Department of Energy, along with a number of partners, the collaborative Mid-Atlantic Baseline Studies Project, led by the Biodiversity Research Institute (BRI), helps improve understanding of species composition and use of the Mid-Atlantic marine environment in order to promote more sustainable offshore wind development. This first-of-its-kind study along the Eastern Seaboard of the United States delivers

  5. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  6. Computer

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    I. INTRODUCTION This paper presents several computational tools required for processing images of a heavy ion beam and estimating the magnetic field within a plasma. The...

  7. computers

    National Nuclear Security Administration (NNSA)

    Retired computers used for cybersecurity research at Sandia National...

  8. Firm develops own EMS built on Apple computer

    SciTech Connect (OSTI)

    Pospisil, R.

    1982-04-05

    Firestone Fibers and Textile Co. programmed a $2000 desktop Apple II computer and special electronic panels designed by the engineering staff to perform process control and other energy-management functions. The system should reduce natural gas consumption 40% and save the company up to $75,000 a year by reducing the amount of hot air exhausted from fabric-treating ovens. The system can be expanded to control lights and space-conditioning equipment. The company is willing to negotiate with other firms to market the panels. The Apple II was chosen because it has a high capacity for data acquisition and testing and because of the available software. (DCK)

  9. Tank waste remediation system technical baseline summary description

    SciTech Connect (OSTI)

    Raymond, R.E.

    1998-01-08

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations.

  10. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, ...

  11. Annual Technology Baseline (Including Supporting Data); NREL...

    Office of Scientific and Technical Information (OSTI)

    Title: Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory) ...

  12. Baseline Wind Energy Facility | Open Energy Information

    Open Energy Info (EERE)

    Name: Baseline Wind Energy Facility; Facility: Baseline Wind Energy Facility; Sector: Wind energy; Facility Type: Commercial Scale Wind...

  13. Engineering task plan TWRS technical baseline completion

    SciTech Connect (OSTI)

    Moore, T.L

    1996-03-08

    The Tank Waste Remediation System (TWRS) includes many activities required to remediate the radioactive waste stored in underground waste storage tanks. These activities include routine monitoring of the waste, facilities maintenance, upgrades to existing equipment, and installation of new equipment necessary to manage, retrieve, process, and dispose of the waste. In order to ensure that these multiple activities are integrated, cost effective, and necessary, a sound technical baseline is required from which all activities can be traced and measured. The process by which this technical baseline is developed will consist of the identification of functions, requirements, architecture, and test (FRAT) methodology. This process must be completed for TWRS to a level that provides the technical basis for all facility/system/component maintenance, upgrades, or new equipment installation.

  14. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J. Intended for: Public Purpose: This poster was prepared for the June 2013 Individual Permit for Storm Water (IP) public meeting. The purpose of the meeting was to update the public on implementation of the permit as required under Part 1.I (7) of the IP (National Pollutant Discharge Elimination System Permit No.

  15. Open source development experience with a computational gas-solids flow code

    SciTech Connect (OSTI)

    Syamlal, M; O'Brien, T. J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study on the use of open source (OS) software development in chemical engineering research and education is presented here. The multiphase computational fluid dynamics software MFIX is the object of the case study. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow and the dissemination of information to other areas such as geotechnical and volcanology research are demonstrated. It is shown that the advantages of OS development methodology were realized: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.
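
One standard verification step for modern computational software of this kind is checking the observed order of accuracy of a discretization against successively refined grids. The sketch below shows that calculation in generic form; the error values are hypothetical and the code is not part of MFIX.

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of accuracy p from errors on two grids, assuming e ~ C*h^p."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Hypothetical L2 errors against a manufactured solution on grids h, h/2, h/4.
errors = [4.0e-3, 1.1e-3, 2.9e-4]
for e_c, e_f in zip(errors, errors[1:]):
    # A nominally second-order scheme should report values near 2.
    print(f"observed order = {observed_order(e_c, e_f):.2f}")
```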

  16. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy Conversion Efficiency Solar Energy Wind Energy Water Power Supercritical CO2 Geothermal Natural Gas Safety, Security & Resilience of the Energy Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization Battery Testing Nuclear Fuel Cycle Defense Waste Management Programs Advanced Nuclear

  17. Computational and Experimental Development of Novel High Temperature Alloys

    SciTech Connect (OSTI)

    Kramer, M.J.; Ray, P.K.; and Akinc, M.

    2010-06-29

    The work done in this paper is based on our earlier work on developing an extended Miedema model and then using it to downselect potential alloy systems. Our approach is to closely couple the semi-empirical methodologies to more accurate ab initio methods to identify the best candidates for ternary alloying additions. The architectural framework for our material's design is a refractory base metal with a high temperature intermetallic which provides both high temperature creep strength and a source of oxidatively stable elements. Potential refractory base metals are groups IIIA, IVA and VA. For Fossil applications, Ni-Al appears to be the best choice to provide the source of oxidatively stable elements but this system requires a 'boost' in melting temperatures to be a viable candidate in the ultra-high temperature regime (> 1200C). Some late transition metals and noble elements are known to increase the melting temperature of Ni-Al phases. Such an approach suggested that a Mo-Ni-Al system would be a good base alloy system that could be further improved upon by adding Platinum group metals (PGMs). In this paper, we demonstrate the variety of microstructures that can be synthesized for the base alloy system, its oxidation behavior as well as the oxidation behavior of the PGM substituted oxidation resistant B2 NiAl phase.
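
A schematic of the Miedema-style screening idea (ranking candidate binaries by a chemical interaction term) is sketched below. The functional form is the commonly quoted two-parameter expression with an optional hybridization correction; the coefficients and input values are placeholders, not the extended model or parameters developed in the report.

```python
# Schematic Miedema-style screening term (placeholder parameters only); more
# negative values are taken to favor mixing / compound formation.

def miedema_chemical_term(d_phi, d_nws13, p=1.0, q_over_p=9.4, r_over_p=0.0):
    """Return p * (-(d_phi)^2 + (q/p)*(d_nws13)^2 - r/p) in arbitrary units.
    d_phi: work-function parameter difference between the two elements;
    d_nws13: difference in electron density at the Wigner-Seitz cell boundary^(1/3);
    r_over_p: hybridization correction, nonzero for transition-metal /
    polyvalent-metal pairs. All values here are illustrative placeholders."""
    return p * (-(d_phi) ** 2 + q_over_p * d_nws13 ** 2 - r_over_p)

# Hypothetical screening of two candidate pairs (numbers are illustrative only)
print(miedema_chemical_term(d_phi=0.9, d_nws13=0.25, r_over_p=1.9))
print(miedema_chemical_term(d_phi=0.3, d_nws13=0.10, r_over_p=0.0))
```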

  18. Vehicle Technologies Office Merit Review 2015: Development of Computer-Aided Design Tools for Automotive Batteries

    Broader source: Energy.gov [DOE]

    Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

  19. Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

    Broader source: Energy.gov [DOE]

    Presentation given by General Motors at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

  20. Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

    Broader source: Energy.gov [DOE]

    Presentation given by CD-Adapco at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    1. Laboratory Directed Research & Development Page National Energy Research Scientific Computing Center

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      T3E Individual Node Optimization, Michael Stewart, SGI/Cray, 4/9/98. Topics: Introduction; T3E Processor; T3E Local Memory; Cache Structure; Optimizing Codes for Cache Usage; Loop Unrolling; Other Useful Optimization Options; References. Introduction: Primary topic will be single processor

    2. New DOE Office of Science support for CAMERA to develop computational

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      New DOE Office of Science support for CAMERA to develop computational mathematics for experimental facilities research. September 22, 2015. Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov. Experimental science is evolving. With the advent of new technology, scientific facilities are collecting data at

    3. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

      SciTech Connect (OSTI)

      Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

      2012-12-01

      The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.
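
As a rough illustration of the cell-level electro-thermal coupling that frameworks like MSMD treat rigorously, the sketch below steps a single-resistance equivalent circuit with one lumped thermal node through a constant-current discharge. Every parameter value and the linear open-circuit-voltage curve are assumptions for illustration only, not part of the CAEBAT models.

```python
def simulate_discharge(i_app=20.0, capacity_ah=40.0, r_int=0.002,
                       mass=1.0, cp=900.0, h_conv=5.0, area=0.1,
                       t_amb=298.15, dt=1.0, t_end=3600.0):
    """Constant-current discharge of a resistive cell with one lumped thermal node."""
    soc, temp = 1.0, t_amb
    v_cell = 3.0 + 1.2 * soc - i_app * r_int          # toy OCV minus ohmic drop
    for _ in range(int(t_end / dt)):
        soc -= i_app * dt / (capacity_ah * 3600.0)    # coulomb counting
        if soc <= 0.0:
            break
        v_cell = 3.0 + 1.2 * soc - i_app * r_int
        q_gen = i_app**2 * r_int                      # Joule heating (W)
        q_loss = h_conv * area * (temp - t_amb)       # convective cooling (W)
        temp += (q_gen - q_loss) * dt / (mass * cp)   # lumped thermal update
    return soc, v_cell, temp

soc, v, t = simulate_discharge()
print(f"end SOC = {soc:.2f}, V = {v:.3f} V, T = {t - 273.15:.1f} C")
```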

    4. Global Nuclear Energy Partnership Waste Treatment Baseline

      SciTech Connect (OSTI)

      Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

      2008-05-01

      The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

    5. High-Performance Computing for Alloy Development | netl.doe.gov

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      High-Performance Computing for Alloy Development. Tomorrow's fossil-fuel based power plants will achieve higher efficiencies by operating at higher pressures and temperatures and under harsher and more corrosive conditions. Unfortunately, conventional metals simply cannot withstand these extreme environments, so advanced alloys must be designed and fabricated to meet the needs of these advanced systems. The properties of metal alloys, which are mixtures of metallic elements,

    6. NREL: Climate Neutral Research Campuses - Determine Baseline...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Determine Baseline Energy Consumption To create a climate action plan for your research campus, begin by determining current energy consumption and the resulting greenhouse gas ...

    7. Tank waste remediation systems technical baseline database

      SciTech Connect (OSTI)

      Porter, P.E.

      1996-10-16

      This document includes a cassette tape that contains Hanford generated data for the Tank Waste Remediation Systems Technical Baseline Database as of October 09, 1996.

    8. Baselines for Greenhouse Gas Reductions: Problems, Precedents...

      Open Energy Info (EERE)

      Baseline projection, GHG inventory, Pathways analysis. Resource Type: Publications, Lessons learned/best practices. Website: www.p2pays.orgref2221739.pdf. References:...

    9. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE)

    10. LEDSGP/Transportation Toolkit/Key Actions/Create a Baseline ...

      Open Energy Info (EERE)

      LEDSGP Transportation Toolkit: Key Actions for Low-Emission Development in Transportation (Create a Baseline)...

    11. TWRS technical baseline database manager definition document

      SciTech Connect (OSTI)

      Acree, C.D.

      1997-08-13

      This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

    12. Management of the baseline shift using a new and simple method for respiratory-gated radiation therapy: Detectability and effectiveness of a flexible monitoring system

      SciTech Connect (OSTI)

      Tachibana, Hidenobu; Kitamura, Nozomi; Ito, Yasushi; Kawai, Daisuke; Nakajima, Masaru; Tsuda, Akihisa; Shiizuka, Hisao

      2011-07-15

      Purpose: In respiratory-gated radiation therapy, a baseline shift decreases the accuracy of target coverage and organs at risk (OAR) sparing. The effectiveness of audio-feedback and audio-visual feedback in correcting the baseline shift in the breathing pattern of the patient has been demonstrated previously. However, the baseline shift derived from the intrafraction motion of the patient's body cannot be corrected by these methods. In the present study, the authors designed and developed a simple and flexible system. Methods: The system consisted of a web camera and a computer running our in-house software. The in-house software was adapted to template matching and also to no preimage processing. The system was capable of monitoring the baseline shift in the intrafraction motion of the patient's body. Another marker box was used to monitor the baseline shift due to the flexible setups required of a marker box for gated signals. The system accuracy was evaluated by employing a respiratory motion phantom and was found to be within AAPM Task Group 142 tolerance (positional accuracy <2 mm and temporal accuracy <100 ms) for respiratory-gated radiation therapy. Additionally, the effectiveness of this flexible and independent system in gated treatment was investigated in healthy volunteers, in terms of the results from the differences in the baseline shift detectable between the marker positions, which the authors evaluated statistically. Results: The movement of the marker on the sternum [1.599 ± 0.622 mm (1 SD)] was substantially decreased as compared with the abdomen [6.547 ± 0.962 mm (1 SD)]. Additionally, in all of the volunteers, the baseline shifts for the sternum [-0.136 ± 0.868 (2 SD)] were in better agreement with the nominal baseline shifts than was the case for the abdomen [-0.722 ± 1.56 mm (2 SD)]. The baseline shifts could be accurately measured and detected using the monitoring system, which could acquire the movement of the marker on the sternum. The baseline shift-monitoring system with the displacement-based methods for highly accurate respiratory-gated treatments should be used to make most of the displacement-based gating methods. Conclusions: The advent of intensity modulated radiation therapy and volumetric modulated radiation therapy facilitates margin reduction for the planning target volumes and the OARs, but highly accurate irradiation is needed to achieve target coverage and OAR sparing with a small margin. The baseline shifts can affect treatment not only with the respiratory gating system but also without the system. Our system can manage the baseline shift and also enables treatment irradiation to be undertaken with high accuracy.
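
The abstract does not describe the in-house tracking software in code; as a hedged sketch of the template-matching step it mentions, the snippet below locates a marker in a web-camera frame with OpenCV and estimates a baseline from a running median of the marker position. The function names, matching method, file names, and calibration step are assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def locate_marker(frame_gray, template_gray):
    """Return the (x, y) location of the best template match and its score."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val

def baseline_estimate(marker_y_trace, window=30):
    """Baseline = running median of recent vertical marker positions (pixels)."""
    trace = np.asarray(marker_y_trace, dtype=float)
    recent = trace[-window:] if trace.size >= window else trace
    return float(np.median(recent))

# Usage sketch (hypothetical files and calibration): track the marker in each
# frame, convert pixels to millimetres with a measured scale factor, and flag a
# baseline shift when the running median drifts beyond a chosen tolerance.
# frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
# template = cv2.imread("marker_template.png", cv2.IMREAD_GRAYSCALE)
# (x, y), score = locate_marker(frame, template)
```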

    13. Performance Modeling for 3D Visualization in a Heterogeneous Computing

      Office of Scientific and Technical Information (OSTI)

      Performance Modeling for 3D Visualization in a Heterogeneous Computing Environment (Technical Report). The visualization of large, remotely located data sets necessitates the development of a distributed computing pipeline in order to reduce the data, in stages, to a manageable size. The required baseline infrastructure for launching such

    14. Hanford Site technical baseline database. Revision 1

      SciTech Connect (OSTI)

      Porter, P.E.

      1995-01-27

      This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

    15. Computing Videos

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    16. Integrated Baseline System (IBS) Version 1.03: Models guide

      SciTech Connect (OSTI)

      Not Available

      1993-01-01

      The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. This document provides information for the experienced system user, and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

    17. Development of a computer wellbore simulator for coiled-tube operations

      SciTech Connect (OSTI)

      Gu, H.; Walton, I.C.; Dowell, S.

      1994-12-31

      This paper describes a computer wellbore simulator developed for coiled tubing operations of fill cleanout and unloading of oil and gas wells. The simulator models the transient, multiphase fluid flow and mass transport process that occur in these operations. Unique features of the simulator include a sand bed that may form during fill cleanout in deviated and horizontal wells, particle transport with multiphase compressible fluids, and the transient unloading process of oil and gas wells. The requirements for a computer wellbore simulator for coiled tubing operations are discussed and it is demonstrated that the developed simulator is suitable for modeling these operations. The simulator structure and the incorporation of submodules for gas/liquid two-phase flow, reservoir and choke models, and coiled tubing movement are addressed. Simulation examples are presented to show the sand bed formed in cleanout in a deviated well and the transient unloading results of oil and gas wells. The wellbore simulator developed in this work can assist a field engineer with the design of coiled tubing operations. By using the simulator to predict the pressure, flow rates, sand concentration and bed depth, the engineer will be able to select the coiled tubing, fluid and schedule of an optimum design for particular well and reservoir conditions.
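
A tiny illustration of the particle-transport reasoning inside such a simulator: the sketch below computes a Stokes settling velocity and applies a crude bed-formation criterion. The correlation choice, threshold, and numerical values are illustrative assumptions, not the simulator's actual multiphase transport model.

```python
def stokes_settling_velocity(d_p, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in the Stokes regime.
    d_p: particle diameter (m); rho_p, rho_f: particle/fluid density (kg/m^3);
    mu: fluid viscosity (Pa*s)."""
    return g * d_p**2 * (rho_p - rho_f) / (18.0 * mu)

def bed_forms(axial_velocity, settling_velocity, transport_factor=10.0):
    """Crude illustrative criterion: a stationary bed forms when the axial carrier
    velocity falls below a multiple of the particle settling velocity."""
    return axial_velocity < transport_factor * settling_velocity

# 0.1 mm sand in a water-like carrier fluid (all values hypothetical)
v_s = stokes_settling_velocity(d_p=1.0e-4, rho_p=2650.0, rho_f=1000.0, mu=1.0e-3)
print(f"settling velocity ~ {v_s:.4f} m/s; "
      f"bed at 0.05 m/s axial flow: {bed_forms(0.05, v_s)}")
```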

    18. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

      SciTech Connect (OSTI)

      Sottille, Matthew

      2013-09-12

      This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

    19. GridPACK Toolkit for Developing Power Grid Simulations on High Performance Computing Platforms

      SciTech Connect (OSTI)

      Palmer, Bruce J.; Perkins, William A.; Glass, Kevin A.; Chen, Yousu; Jin, Shuangshuang; Callahan, Charles D.

      2013-11-30

      This paper describes the GridPACK™ framework, which is designed to help power grid engineers develop modeling software capable of running on today's high performance computers. The framework contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors, using parallel linear and non-linear solvers to solve algebraic equations, and mapping functionality to create matrices and vectors based on properties of the network. In addition, the framework contains functionality to support IO and to manage errors.
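
GridPACK itself is a C++ framework built on distributed data structures and parallel solvers; as a language-neutral illustration of the bus/branch network plus nonlinear-solver workflow it supports, the sketch below runs a toy two-bus Newton-Raphson power flow in Python. The impedance and load values are assumptions, and the code does not use the GridPACK API.

```python
import numpy as np

# Bus 1: slack at 1.0 p.u., 0 rad. Bus 2: PQ load. One line connects them.
z_line = 0.01 + 0.05j
Y = np.array([[1 / z_line, -1 / z_line],
              [-1 / z_line, 1 / z_line]])
P2_spec, Q2_spec = -0.8, -0.3          # specified p.u. injections at bus 2 (load)

def mismatch(x):
    """Power mismatch [dP2, dQ2] for unknowns x = [theta2, V2]."""
    theta = np.array([0.0, x[0]])
    v = np.array([1.0, x[1]])
    s2 = sum(v[1] * v[k] * np.exp(1j * (theta[1] - theta[k])) * np.conj(Y[1, k])
             for k in range(2))
    return np.array([P2_spec - s2.real, Q2_spec - s2.imag])

x = np.array([0.0, 1.0])               # flat start
for _ in range(10):
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-8:
        break
    J = np.zeros((2, 2))               # finite-difference Jacobian of the mismatch
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = 1e-6
        J[:, j] = (mismatch(x + dx) - f) / 1e-6
    x = x - np.linalg.solve(J, f)      # Newton step

print(f"theta2 = {np.degrees(x[0]):.2f} deg, V2 = {x[1]:.4f} p.u.")
```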

    20. Revised SRC-I project baseline. Volume 1

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

    1. Development of high performance scientific components for interoperability of computing packages

      SciTech Connect (OSTI)

      Gulabani, Teena Pratap

      2008-12-01

      Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard to develop and time consuming to implement; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach the Tuning and Analysis Utility (TAU) has been coupled with NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
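
One common way to combine quantum mechanics (QM) and molecular mechanics (MM) energies is a subtractive scheme; the sketch below shows that arithmetic in isolation. It is a generic illustration with hypothetical energies, not the CCA component interfaces or the package calls used in the thesis.

```python
# Subtractive (ONIOM-style) QM/MM energy combination; the thesis wires real
# packages (NWChem, GAMESS, MPQC) together via CCA components, which this toy
# function does not attempt to reproduce.

def qmmm_energy(e_mm_full, e_qm_region, e_mm_region):
    """E(QM/MM) = E_MM(full system) + E_QM(active region) - E_MM(active region)."""
    return e_mm_full + e_qm_region - e_mm_region

# Hypothetical energies (hartree) for a solvated active site
print(qmmm_energy(e_mm_full=-1.204, e_qm_region=-76.412, e_mm_region=-0.035))
```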

    2. TWRS privatization phase I - site characterization and environmental baseline work plan

      SciTech Connect (OSTI)

      Reidel, S.P.; Hodges, F.N., Westinghouse Hanford

      1996-08-27

      This work plan defines the steps necessary to develop a Site Characterization Plan and Environmental Baseline for the TWRS Privatization Phase I area. The Data Quality Objectives Process will be the primary tool used to develop these plans.

    3. Waste management project technical baseline description

      SciTech Connect (OSTI)

      Sederburg, J.P.

      1997-08-13

      A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

    4. Solid Waste Program technical baseline description

      SciTech Connect (OSTI)

      Carlson, A.B.

      1994-07-01

      The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

    5. Mexico-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Mexico-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government; Partner: Danish...

    6. Indonesia-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: Indonesia-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government...

    7. India-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: India-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government; Partner: Danish...

    8. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

      Open Energy Info (EERE)

      Tool Summary. Name: UNFCCC-Consolidated baseline and monitoring methodology for landfill gas project activities...

    9. China-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: China-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government; Partner: Danish...

    10. EVMS Training Snippet: 4.6 Baseline Control Methods

      Broader source: Energy.gov [DOE]

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM) discusses baseline revisions and the different baseline control vehicles used in DOE.

    11. South Africa-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: South Africa-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government; Partner: Danish Ministry for...

    12. EA-1943: Construction and Operation of the Long Baseline Neutrino...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      EA-1943: Construction and Operation of the Long Baseline Neutrino Facility and Deep Underground ...

    13. Brazil-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Brazil-Danish Government Baseline Workstream; Agency/Company/Organization: Danish Government; Partner: Danish...

    14. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

      SciTech Connect (OSTI)

      Karbach, Carsten; Frings, Wolfgang

      2013-02-20

      This document is the final scientific report of the project DE-SC000120 (A scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to work for high latency. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide a transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as abstraction layer between parallel application developer and compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. E.g. applications are built remotely and performance tools are attached to job submissions and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real-time. The client server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. A set of statistics, a list of running and queued jobs as well as a node display mapping running jobs to their compute resources form the user display of LLview. These monitoring features have to be integrated into the development environment. Besides showing the current status PTP's monitoring also needs to allow for submitting and canceling user jobs. Monitoring peta-scale systems especially deals with presenting the large amount of status data in a useful manner. Users require to select arbitrary levels of detail. The monitoring views have to provide a quick overview of the system state, but also need to allow for zooming into specific parts of the system, into which the user is interested in. At present, the major batch systems running on supercomputers are PBS, TORQUE, ALPS and LoadLeveler, which have to be supported by both the monitoring and the job controlling component. Finally, PTP needs to be designed as generic as possible, so that it can be extended for future batch systems.
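
As a hedged sketch of the status-data exchange pattern described (collect scheduler output on the remote system, reduce it, and ship a compact summary to the client over a high-latency link), the snippet below polls a PBS-style qstat over SSH and aggregates job states into JSON. The parsed column layout is an assumption and varies by scheduler; the real PTP/LLview protocol is different and considerably more capable.

```python
import json
import subprocess
from collections import Counter

def poll_remote_queue(host: str) -> str:
    """Run qstat on a remote host and return a compact JSON status summary."""
    out = subprocess.run(["ssh", host, "qstat"], capture_output=True,
                         text=True, check=True).stdout
    states = Counter()
    for line in out.splitlines()[2:]:          # skip header lines (assumed layout)
        fields = line.split()
        if len(fields) >= 5:
            states[fields[4]] += 1             # job-state column (assumed position)
    return json.dumps({"host": host, "job_states": dict(states)})

# Hypothetical usage; only the small summary string crosses the slow link:
# print(poll_remote_queue("login.example-supercomputer.org"))
```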

    15. TWRS phase I privatization site environmental baseline and characterization plan

      SciTech Connect (OSTI)

      Shade, J.W.

      1997-09-01

      This document provides a plan to characterize and develop an environmental baseline for the TWRS Phase I Privatization Site before construction begins. A site evaluation study selected the former Grout Disposal Area of the Grout Treatment Facility in the 200 East Area as the TWRS Phase I Demonstration Site. The site is generally clean and has not been used for previous activities other than the GTF. A DQO process was used to develop a Sampling and Analysis Plan that would allow comparison of site conditions during operations and after Phase I ends to the presently existing conditions and provide data for the development of a preoperational monitoring plan.

    16. Baseline Microstructural Characterization of Outer 3013 Containers

      SciTech Connect (OSTI)

      Zapp, Phillip E.; Dunn, Kerry A

      2005-07-31

      Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

    17. Module 7 - Integrated Baseline Review and Change Control | Department of

      Energy Savers [EERE]

      This module focuses on integrated baseline reviews (IBR) and change control. This module outlines the objective and responsibility of an integrated baseline review. Additionally, this module will discuss the change control process required for implementing earned value

    18. Document Number Q0029500 Baseline Risk Assessment Update 4.0 Baseline Risk Assessment Update

      Office of Legacy Management (LM)

      4.0 Baseline Risk Assessment Update: This section updates the human health and the ecological risk assessments that were originally presented in the 1998 RI (DOE 1998a). The impacts on the 1998 risk assessments are summarized in Section 2.9. 4.1 Human Health Risk Assessment: Several activities completed since 1998 have contributed to changes in surface water and ground water concentrations. Activities that have impacted, or likely impacted surface water and ground

    19. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Research Division: The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. Scientific Networking

    20. Long-Baseline Neutrino Experiment (LBNE) Conceptual Design Report: The LBNE Water Cherenkov Detector, April 13, 2012

      SciTech Connect (OSTI)

      Kettell S. H.; Bishai, M.; Brown, R.; Chen, H.; Diwan, M.; Dolph, J., Geronimo, G.; Gill, R.; Hackenburg, R.; Hahn, R.; Hans, S.; Isvan, Z.; Jaffe, D.; Junnarkar, S.; Kettell, S.H.; Lanni,F.; Li, Y.; Ling, J.; Littenberg, L.; Makowiecki, D.; Marciano, W.; Morse, W.; Parsa, Z.; Radeka, V.; Rescia, S.; Samios, N.; Sharma, R.; Simos, N.; Sondericker, J.; Stewart, J.; Tanaka, H.; Themann, H.; Thorn, C.; Viren, B., White, S.; Worcester, E.; Yeh, M.; Yu, B.; Zhang, C.

      2012-04-13

      Conceptual Design Report (CDR) developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE)

    1. Integrated Baseline System (IBS) Version 1.03: Utilities guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

      1993-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

    2. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

      1994-03-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document are chiefly data managers but also system managers and some emergency management planners and analysts.

    3. Process Simulation Role in the Development of New Alloys Based on Integrated Computational Material Science and Engineering

      SciTech Connect (OSTI)

      Sabau, Adrian S. [ORNL]; Porter, Wallace D. [ORNL]; Roy, Shibayan [ORNL]; Shyam, Amit [ORNL]

      2014-01-01

      To accelerate the introduction of new materials and components, the development of metal casting processes requires the teaming between different disciplines, as multi-physical phenomena have to be considered simultaneously for the process design and optimization of mechanical properties. The required models for physical phenomena as well as their validation status for metal casting are reviewed. The data on materials properties, model validation, and relevant microstructure for materials properties are highlighted. One vehicle to accelerate the development of new materials is through combined experimental-computational efforts. Integrated computational/experimental practices are reviewed; strengths and weaknesses are identified with respect to metal casting processes. Specifically, the examples are given for the knowledge base established at Oak Ridge National Laboratory and computer models for predicting casting defects and microstructure distribution in aluminum alloy components.

    4. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1993

      SciTech Connect (OSTI)

      1993-12-31

      The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. During the period of this report, a Topical Report summarizing the Baseline Case design was drafted and issued to DOE/PETC for review and release approval. Major effort was spent on the Alternate Upgrading and Refining Case. Its design specifications were finalized, and material and utility balances completed. Initial capital cost estimates were developed. A Topical Report, summarizing the Alternative (ZSM-5) Upgrading and Refining Case design, is being drafted. Under Task 4, some of the individual plant models were expanded and enhanced. An overall ASPEN/SP process simulation model was developed for the Baseline Design Case by combining the individual models of Areas 100, 200 and 300. In addition, a separate model for the simplified product refining area, Area 300, of the Alternate Upgrading and Refining case was developed. Under Task 7, cost and schedule control was the primary activity. A technical paper entitled "Baseline Design/Economics for Advanced Fischer-Tropsch Technology" was presented at the DOE/PETC's Annual Contractors Review Conference, held at Pittsburgh, Pennsylvania, on September 27-29, 1993. A contract amendment was submitted to include the Kerr McGee ROSE unit in the Baseline design case and to convert the PFS models from the ASPEN/SP to ASPEN/Plus software code.

    5. Integrated Baseline System (IBS). Version 1.03, System Management Guide

      SciTech Connect (OSTI)

      Williams, J.R.; Bailey, S.; Bower, J.C.

      1993-01-01

      This IBS System Management Guide explains how to install or upgrade the Integrated Baseline System (IBS) software package. The IBS is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This guide includes detailed instructions for installing the IBS software package on a Digital Equipment Corporation (DEC) VAX computer from the IBS distribution tapes. The installation instructions include procedures for both first-time installations and upgrades to existing IBS installations. To ensure that the system manager has the background necessary for successful installation of the IBS package, this guide also includes information on IBS computer requirements, software organization, and the generation of IBS distribution tapes. When special utility programs are used during IBS installation and setup, this guide refers you to the IBS Utilities Guide for specific instructions. This guide also refers you to the IBS Data Management Guide for detailed descriptions of some IBS data files and structures. Any special requirements for installation are not documented here but should be included in a set of installation notes that come with the distribution tapes.

    6. Grocery 2009 TSD Miami Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Miami Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1-2004; Model Year: 2009; IDF file...

    7. Grocery 2009 TSD Chicago Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Chicago Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1-2004; Model Year: 2009; IDF file...

    8. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

      SciTech Connect (OSTI)

      Deru, M.

      2011-02-01

      This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

    9. System design and algorithmic development for computational steering in distributed environments

      SciTech Connect (OSTI)

      Wu, Qishi; Zhu, Mengxia; Gu, Yi; Rao, Nageswara S

      2010-03-01

      Supporting visualization pipelines over wide-area networks is critical to enabling large-scale scientific applications that require visual feedback to interactively steer online computations. We propose a remote computational steering system that employs analytical models to estimate the cost of computing and communication components and optimizes the overall system performance in distributed environments with heterogeneous resources. We formulate and categorize the visualization pipeline configuration problems for maximum frame rate into three classes according to the constraints on node reuse or resource sharing, namely no, contiguous, and arbitrary reuse. We prove all three problems to be NP-complete and present heuristic approaches based on a dynamic programming strategy. The superior performance of the proposed solution is demonstrated with extensive simulation results in comparison with existing algorithms and is further evidenced by experimental results collected on a prototype implementation deployed over the Internet.
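
      The frame-rate objective described here amounts to minimizing the slowest (bottleneck) stage of a mapped pipeline. Purely as a hypothetical illustration, the sketch below implements a chain-style dynamic program over made-up per-module compute times and per-link transfer costs; it ignores the node-reuse and resource-sharing constraints that make the authors' three problem classes NP-complete.

```python
# Hypothetical sketch: map a linear visualization pipeline onto networked
# nodes so as to minimize the bottleneck stage time (the reciprocal of frame
# rate). Cost models and the chain recursion are simplifying assumptions,
# not the paper's exact formulation.

def map_pipeline(compute, transfer, data):
    """compute[i][v]: time for module i on node v (s).
    transfer[u][v]: seconds per data unit sent from node u to node v.
    data[i]: data volume passed from module i to module i+1."""
    n_mod, n_nodes = len(compute), len(compute[0])
    best = [compute[0][v] for v in range(n_nodes)]   # bottleneck if module 0 runs on v
    back = []
    for i in range(1, n_mod):
        new, choice = [float("inf")] * n_nodes, [0] * n_nodes
        for v in range(n_nodes):
            for u in range(n_nodes):
                cand = max(best[u], transfer[u][v] * data[i - 1], compute[i][v])
                if cand < new[v]:
                    new[v], choice[v] = cand, u
        best = new
        back.append(choice)
    v = min(range(n_nodes), key=lambda k: best[k])
    bottleneck, path = best[v], [v]
    for choice in reversed(back):                    # recover the node assignment
        v = choice[v]
        path.append(v)
    return bottleneck, list(reversed(path))

if __name__ == "__main__":
    compute = [[1.0, 2.0], [3.0, 1.5], [0.5, 2.5]]   # 3 modules, 2 candidate nodes
    transfer = [[0.0, 0.2], [0.2, 0.0]]              # intra-node transfer is free
    data = [4.0, 2.0]
    print(map_pipeline(compute, transfer, data))     # -> (1.5, [0, 1, 0])
```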

    10. Moving baseline for evaluation of advanced coal-extraction systems

      SciTech Connect (OSTI)

      Bickerton, C.R.; Westerfield, M.D.

      1981-04-15

      This document reports results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000. Systems used in this study were selected from contemporary coal mining technology and from conservative conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness. To be more beneficial to the program, the effort should be extended to other seam thicknesses. This document is one of a series which describe systems level requirements for advanced underground coal mining equipment. Five areas of performance are discussed: production cost, miner safety, miner health, environmental impact, and recovery efficiency. The projections for cost and production capability comprise a so-called moving baseline which will be used to assess compliance with the systems requirement for production cost. Separate projections were prepared for room and pillar, longwall, and shortwall technology all operating under comparable sets of mining conditions. This work is part of an effort to define and develop innovative coal extraction systems suitable for the significant resources remaining in the year 2000.

    11. NREL: Energy Analysis - Annual Technology Baseline and Standard Scenarios

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      This section contains earlier versions of NREL's Annual Technology Baseline and Standard Scenarios products. Spring 2015 draft versions: Annual Technology Baseline (ATB) Spreadsheet; ATB Summary Presentation; Standard Scenarios Annual Report.

    12. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-10-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under-prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.

    13. Scope Management Baseline Development (FPM 208), Idaho | Department...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This is a Level 2 core course for certification in the Project Management Career ... credit. Course Format: 3-day instructor-led classroom delivery. Equivalent Courses: None.

    14. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1994

      SciTech Connect (OSTI)

      1994-12-31

      The objectives of the study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. During the reporting period, work progressed on Tasks 1, 2, 4, 6 and 7. This report covers work done during the period and consists of the following sections: Introduction and Summary; Task 1: Baseline Design and Alternatives; Task 2: Evaluate Baseline and Alternative Economics; Task 4: Process Flowsheet Simulation (PFS) Model; Task 6: Document the PFS Model and Develop a DOE Training Session on Its Use; and the Project Management and Staffing Report.

    15. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-01-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines (HATTs). First, an HATT blade was designed using the blade element momentum method in conjunction with a genetic optimization algorithm. Several unstructured computational grids were generated using this blade geometry and steady CFD simulations were used to perform a grid resolution study. Transient simulations were then performed to determine the effect of time-dependent flow phenomena and the size of the computational timestep on the numerical solution. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under-prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.
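
      For readers unfamiliar with the blade element momentum (BEM) calculations that the CFD results were compared against, the following is a minimal single-element BEM iteration. The thin-airfoil lift slope, constant drag coefficient, geometry, and flow conditions are invented placeholders, not the rotor studied in the paper.

```python
import math

# Illustrative BEM iteration for a single blade element of a horizontal-axis
# tidal turbine. The airfoil polars and operating point below are assumed
# example values, not the turbine design or solver used in the paper.

def bem_element(r, chord, twist, V, omega, B=3, rho=1025.0):
    sigma = B * chord / (2 * math.pi * r)          # local solidity
    a, a_prime = 0.3, 0.0                          # induction factor guesses
    for _ in range(200):
        phi = math.atan2((1 - a) * V, (1 + a_prime) * omega * r)
        alpha = phi - twist                        # local angle of attack
        cl = 2 * math.pi * alpha                   # thin-airfoil lift slope (assumed)
        cd = 0.01                                  # constant drag coefficient (assumed)
        cn = cl * math.cos(phi) + cd * math.sin(phi)
        ct = cl * math.sin(phi) - cd * math.cos(phi)
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        if abs(a_new - a) < 1e-8 and abs(ap_new - a_prime) < 1e-8:
            break
        a, a_prime = 0.5 * (a + a_new), 0.5 * (a_prime + ap_new)  # under-relaxation
    W2 = ((1 - a) * V) ** 2 + ((1 + a_prime) * omega * r) ** 2    # relative speed squared
    dT = 0.5 * rho * W2 * chord * cn * B           # thrust per unit span
    dQ = 0.5 * rho * W2 * chord * ct * B * r       # torque per unit span
    return a, a_prime, dT, dQ

if __name__ == "__main__":
    print(bem_element(r=5.0, chord=0.4, twist=math.radians(6), V=2.0, omega=1.2))
```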

    16. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

      SciTech Connect (OSTI)

      M.F. Simpson; K.-R. Kim

      2010-12-01

      In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL), and the Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in trouble-shooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally-light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.

    17. Computer, Computational, and Statistical Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    18. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, Xucheng

      1996-01-01

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.
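
      The patent describes analog circuitry, but the effect of the baseline subtraction can be illustrated numerically. The sketch below, using an arbitrary Gaussian pulse and DC offset, shows how subtracting a value sampled just before the gate removes the offset contribution from the integral; the waveform and numbers are invented for demonstration only.

```python
import numpy as np

# Numerical illustration (not the patented circuit) of gated integration with
# signal baseline subtraction: the sample-and-hold value taken at the start of
# the gate is subtracted from the signal before integrating.

dt = 1e-9                                            # 1 ns steps
t = np.arange(0, 2e-6, dt)
pulse = 5e-3 * np.exp(-((t - 1e-6) / 50e-9) ** 2)    # 5 mV Gaussian pulse (assumed)
offset = 2e-3                                        # 2 mV DC baseline offset (assumed)
signal = pulse + offset

gate = (t > 0.8e-6) & (t < 1.2e-6)                   # integrating window
baseline = signal[t < 0.8e-6][-1]                    # sample-and-hold at gate start

naive = np.sum(signal[gate]) * dt                    # integral including offset error
corrected = np.sum(signal[gate] - baseline) * dt     # offset removed
true_value = np.sum(pulse[gate]) * dt

print(f"naive      = {naive:.3e} V*s")
print(f"corrected  = {corrected:.3e} V*s")
print(f"true pulse = {true_value:.3e} V*s")
```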

    19. LTC vacuum blasting machine (concrete): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the outdoor environment where the testing demonstration took place may have made the results inaccurate; it is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    20. LTC vacuum blasting machine (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    1. Pentek metal coating removal system: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    2. Baseline air quality study at Fermilab

      SciTech Connect (OSTI)

      Dave, M.J.; Charboneau, R.

      1980-10-01

      Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously-monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies are also presented.

    3. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, X.

      1996-12-17

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

    4. MP Salsa: a finite element computer program for reacting flow problems. Part 1--theoretical development

      SciTech Connect (OSTI)

      Shadid, J.N.; Moffat, H.K.; Hutchinson, S.A.; Hennigan, G.L.; Devine, K.D.; Salinger, A.G.

      1996-05-01

      The theoretical background for the finite element computer program, MPSalsa, is presented in detail. MPSalsa is designed to solve laminar, low Mach number, two- or three-dimensional incompressible and variable density reacting fluid flows on massively parallel computers, using a Petrov-Galerkin finite element formulation. The code has the capability to solve coupled fluid flow, heat transport, multicomponent species transport, and finite-rate chemical reactions, and to solve coupled multiple Poisson or advection-diffusion-reaction equations. The program employs the CHEMKIN library to provide a rigorous treatment of multicomponent ideal gas kinetics and transport. Chemical reactions occurring in the gas phase and on surfaces are treated by calls to CHEMKIN and SURFACE CHEMKIN, respectively. The code employs unstructured meshes, using the EXODUS II finite element database suite of programs for its input and output files. MPSalsa solves both transient and steady flows by using fully implicit time integration, an inexact Newton method and iterative solvers based on preconditioned Krylov methods as implemented in the Aztec solver library.
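
      As a conceptual illustration of the solution strategy named here (fully implicit nonlinear solves via an inexact Newton method with Krylov iterative linear solvers), the toy sketch below applies that pattern to a made-up two-equation system. MPSalsa itself assembles finite element Jacobians on parallel unstructured meshes and uses the preconditioned Krylov solvers in the Aztec library; this is only a schematic.

```python
import numpy as np
from scipy.sparse.linalg import gmres

# Toy Newton-Krylov loop: each Newton step solves J * dx = -F approximately
# with GMRES instead of a direct factorization. The two-equation "residual"
# below is an invented stand-in for a discretized flow/transport system.

def residual(x):
    return np.array([x[0] ** 2 + x[1] - 3.0,
                     x[0] + x[1] ** 2 - 5.0])

def jacobian(x, eps=1e-7):
    # Finite-difference Jacobian (real codes assemble it from element matrices).
    n = len(x)
    J = np.zeros((n, n))
    f0 = residual(x)
    for j in range(n):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (residual(xp) - f0) / eps
    return J

x = np.array([1.0, 1.0])
for it in range(20):
    F = residual(x)
    if np.linalg.norm(F) < 1e-10:
        break
    dx, info = gmres(jacobian(x), -F)   # inexact linear solve per Newton step
    x = x + dx
print(it, x, residual(x))               # converges to the root near (1, 2)
```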

    5. Statistical Analysis of Baseline Load Models for Non-Residential Buildings

      SciTech Connect (OSTI)

      Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila

      2008-11-10

      Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
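
      To make the morning-adjustment idea concrete, here is a minimal sketch of an averaging baseline with an additive same-day adjustment. The ten-day window, the 8-11 a.m. adjustment hours, and the synthetic load shapes are illustrative assumptions, not the specific models evaluated in the paper.

```python
import numpy as np

# Minimal demand-response baseline with a morning adjustment: average the
# prior days' hourly loads, then shift the profile by the gap between the
# event-day and baseline morning loads.

def baseline_with_morning_adjustment(history, event_day, n_days=10,
                                     morning_hours=range(8, 11)):
    """history: array of shape (n_prior_days, 24) of hourly loads (kW).
    event_day: length-24 array of actual loads on the event day.
    Returns the adjusted hourly baseline for the event day."""
    avg = history[-n_days:].mean(axis=0)              # simple averaging baseline
    hrs = list(morning_hours)
    adj = event_day[hrs].mean() - avg[hrs].mean()     # additive morning adjustment
    return avg + adj

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    history = 100 + 20 * np.sin(np.linspace(0, np.pi, 24)) + rng.normal(0, 3, (15, 24))
    event_day = 110 + 20 * np.sin(np.linspace(0, np.pi, 24))   # hotter day, higher load
    print(baseline_with_morning_adjustment(history, event_day).round(1))
```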

    6. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

      SciTech Connect (OSTI)

      Catechis, Christopher Spyros

      2013-10-01

      Sandia National Laboratories, Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area comprises several parcels of land within SNL/CA, County of Alameda, California. The survey area is located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to a specific property.

    7. NREL: Energy Analysis - Annual Technology Baseline and Standard Scenarios

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Discussion Draft of NREL 2016 Annual Technology Baseline Now Available for Review: NREL has posted a discussion draft of its 2016 Annual Technology Baseline (ATB) for public comment through April 15, 2016. ATB Spreadsheet; ATB Presentation (an appendix summarizes significant changes from the 2015 ATB). Written public comments are welcome and must adhere to these criteria to be considered: Comments must be submitted in writing to

    8. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project

      Energy Savers [EERE]

      Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE). Chris Mossey, Deputy Lab Director (Fermi) and Project Director for LBNF-DUNE, March 23, 2016. Presentation. Related documents: EA-1943: Final Environmental Assessment; EA-1943: Draft Environmental ...

    9. Baseline Change Proposal (BCP) ESAAB and PMRC Brief Template | Department

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Template for briefing Baseline Change Proposals (BCPs) to the ESAAB and/or PMRC, July 2015. Related documents: DOE Project Management Risk Committee (PMRC) SOP; Critical Decision 2 (CD-2) ESAAB and PMRC Brief Template; External ...

    10. NEW DEVELOPMENTS ON INVERSE POLYGON MAPPING TO CALCULATE GRAVITATIONAL LENSING MAGNIFICATION MAPS: OPTIMIZED COMPUTATIONS

      SciTech Connect (OSTI)

      Mediavilla, E.; Lopez, P.; Gonzalez-Morcillo, C.; Jimenez-Vicente, J.

      2011-11-01

      We derive an exact solution (in the form of a series expansion) to compute gravitational lensing magnification maps. It is based on the backward gravitational lens mapping of a partition of the image plane in polygonal cells (inverse polygon mapping, IPM), not including critical points (except perhaps at the cell boundaries). The zeroth-order term of the series expansion leads to the method described by Mediavilla et al. The first-order term is used to study the error induced by the truncation of the series at zeroth order, explaining the high accuracy of the IPM even at this low order of approximation. Interpreting the Inverse Ray Shooting (IRS) method in terms of IPM, we explain the previously reported N^(-3/4) dependence of the IRS error with the number of collected rays per pixel. Cells intersected by critical curves (critical cells) transform to non-simply connected regions with topological pathologies like auto-overlapping or non-preservation of the boundary under the transformation. To define a non-critical partition, we use a linear approximation of the critical curve to divide each critical cell into two non-critical subcells. The optimal choice of the cell size depends basically on the curvature of the critical curves. For typical applications in which the pixel of the magnification map is a small fraction of the Einstein radius, a one-to-one relationship between the cell and pixel sizes in the absence of lensing guarantees both the consistency of the method and a very high accuracy. This prescription is simple but very conservative. We show that substantially larger cells can be used to obtain magnification maps with huge savings in computation time.
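
      For orientation, the inverse ray shooting (IRS) baseline that IPM improves upon can be sketched in a few lines: rays on a regular image-plane grid are mapped back through the point-mass lens equation and collected in source-plane pixels, whose counts (normalized by the unlensed ray density) approximate the magnification. The star field, map extent, and ray density below are arbitrary example values; IPM replaces individual rays with transformed polygonal cells.

```python
import numpy as np

# Minimal inverse-ray-shooting sketch for a microlensing magnification map,
# in Einstein-radius units, with invented lens parameters.

rng = np.random.default_rng(1)
stars = rng.uniform(-5, 5, size=(20, 2))        # point-mass lens positions (assumed)
masses = np.ones(len(stars))

def source_position(theta):
    """Backward lens mapping: image-plane points -> source-plane points."""
    beta = theta.copy()
    for (xs, ys), m in zip(stars, masses):
        d = theta - np.array([xs, ys])
        beta -= m * d / np.sum(d * d, axis=1, keepdims=True)
    return beta

# Shoot a uniform grid of rays over the image plane.
n_side, half = 1000, 8.0
x = np.linspace(-half, half, n_side)
theta = np.array(np.meshgrid(x, x)).reshape(2, -1).T
beta = source_position(theta)

# Collect rays on a source-plane grid; counts per pixel, normalized by the
# unlensed ray density, approximate the magnification.
npix, span = 200, 4.0
H, _, _ = np.histogram2d(beta[:, 0], beta[:, 1],
                         bins=npix, range=[[-span, span], [-span, span]])
rays_per_pixel_unlensed = (n_side / (2 * half)) ** 2 * (2 * span / npix) ** 2
magnification_map = H / rays_per_pixel_unlensed
print(magnification_map.mean(), magnification_map.max())
```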

    11. Chile-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Kenya, Mexico, South Africa, Thailand and Vietnam), to share practices on setting national greenhouse gas emissions baseline scenarios. The aim of the workstream is to...

    12. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

      Broader source: Energy.gov (indexed) [DOE]

      May 27, 2015 EA-1943: Draft Environmental Assessment Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois and the...

    13. NETL - Bituminous Baseline Performance and Cost Interactive Tool...

      Open Energy Info (EERE)

      from the Cost and Performance Baseline for Fossil Energy Plants - Bituminous Coal and Natural Gas to Electricity report. The tool provides an interactive summary of the full...

    14. Cost and Performance Comparison Baseline for Fossil Energy Power...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective To establish baseline performance and cost estimates for today's...

    15. Updates to the International Linear Collider Damping Rings Baseline...

      Office of Scientific and Technical Information (OSTI)

      Updates to the International Linear Collider Damping Rings Baseline Design.

    16. South Africa - Greenhouse Gas Emission Baselines and Reduction...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme; Sector: Energy; Focus Area: Buildings; Topics: Baseline projection, GHG inventory, Pathways analysis, ...

    17. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme; Sector: Energy; Focus Area: Buildings; Topics: Baseline projection, GHG inventory, Pathways analysis, ...

    18. Sandia Energy - Scaled Wind Farm Technology Facility Baselining...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scaled Wind Farm Technology Facility Baselining Project Accelerates Work.

    19. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1994

      SciTech Connect (OSTI)

      1994-12-31

      This report is Bechtel's twelfth quarterly technical progress report and covers the period of July through September, 1994. All major tasks associated with the contract study have essentially been completed. Effort is under way in preparing various topical reports for publication. The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model; establish the baseline design and alternatives; evaluate baseline and alternative economics; develop engineering design criteria; perform sensitivity studies using the PFS model; document the PFS model and develop a DOE training session on its use; and perform project management, technical coordination and other miscellaneous support functions. Tasks 1, 2, 3 and 5 have essentially been completed. Effort is under way in preparing topical reports for publication. During the current reporting period, work progressed on Tasks 4, 6 and 7. This report covers work done during this period and consists of four sections: Introduction and Summary; Task 4 - Process Flowsheet Simulation (PFS) Model and Conversion to ASPEN PLUS; Task 6 - Document the PFS Model and Develop a DOE Training Session on Its Use; and the Project Management and Staffing Report.

    20. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

      SciTech Connect (OSTI)

      Maranas, Costas D

      2012-05-21

      An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism not just in skeletal pathways and a handful of microorganisms but in truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
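
      Curation and flux elucidation of genome-scale models ultimately feed constraint-based calculations such as flux balance analysis (FBA). The toy linear program below shows that core computation on an invented two-metabolite network; it is not one of the project's tools, whose formulations (gap filling, isotope-based flux elucidation, strain design) are far richer.

```python
import numpy as np
from scipy.optimize import linprog

# Toy FBA sketch: maximize a biomass flux subject to steady-state mass
# balance S v = 0 and flux bounds. The 2-metabolite, 4-reaction network is
# invented purely for illustration.

# Reactions: R1: -> A (uptake), R2: A -> B, R3: B -> biomass sink, R4: B -> export
S = np.array([
    [1, -1,  0,  0],    # metabolite A
    [0,  1, -1, -1],    # metabolite B
])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10 (assumed units)
c = np.zeros(4)
c[2] = -1.0             # maximize flux through R3 (linprog minimizes c.v)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[2], "full flux vector:", res.x)
```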

    1. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

      SciTech Connect (OSTI)

      Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

      2012-07-31

      This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual-loop chemical looping process models and dynamic simulation software tools; development and testing of several advanced control concepts and applications for chemical looping transport control; and investigation of several sensor concepts, with two feasible sensor candidates recommended for further prototype development and controls integration. This summary and conclusions consists of three sections. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

    2. Tank Waste Remediation System (TWRS) Technical Baseline Summary Description

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This revision notes the supersedure of the subject document by the concurrent issuance of HNF-1901, "Technical Baseline Summary Description for the Tank Farm Contractor," Revision 2. Safe storage mission technical baseline information was absorbed by the new revision of HNF-1901.

    3. Long-Baseline Neutrino Experiment (LBNE)Water Cherenkov Detector Basis of Estimate Forms and Backup Documentation LBNE Far Site Internal Review (December 6-9, 2011)

      SciTech Connect (OSTI)

      Stewart J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Basis of Estimate (BOE) forms and backup documentation developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE)

    4. Long-Baseline Neutrino Experiment (LBNE) Water Cherenkov Detector Schedule and Cost Books LBNE Far Site Internal Review(December 6-9,2011)

      SciTech Connect (OSTI)

      Stewart J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Schedule and Cost Books developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE)

    5. Development of computer program ENMASK for prediction of residual environmental masking-noise spectra, from any three independent environmental parameters

      SciTech Connect (OSTI)

      Chang, Y.-S.; Liebich, R. E.; Chun, K. C.

      2000-03-31

      Residual environmental sound can mask intrusive (unwanted) sound. It is a factor that can affect noise impacts and must be considered both in noise-impact studies and in noise-mitigation designs. Models for quantitative prediction of sensation level (audibility) and psychological effects of intrusive noise require an input with 1/3 octave-band spectral resolution of environmental masking noise. However, the majority of published residual environmental masking-noise data are given with either octave-band frequency resolution or only single A-weighted decibel values. A model has been developed that enables estimation of 1/3 octave-band residual environmental masking-noise spectra and relates certain environmental parameters to A-weighted sound level. This model provides a correlation among three environmental conditions: measured residual A-weighted sound-pressure level, proximity to a major roadway, and population density. Cited field-study data were used to compute the most probable 1/3 octave-band sound-pressure spectrum corresponding to any selected one of these three inputs. In turn, such spectra can be used as an input to models for prediction of noise impacts. This paper discusses specific algorithms included in the newly developed computer program ENMASK. In addition, the relative audibility of the environmental masking-noise spectra at different A-weighted sound levels is discussed, which is determined by using the methodology of program ENAUDIBL.
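
      ENMASK's actual algorithms correlate field-measured spectra with the three environmental inputs; purely as an illustration of working with band spectra and A-weighted totals, the sketch below scales an assumed band-level shape so that its A-weighted sum equals a target level. The band centers, shape, and scaling approach are stand-in assumptions, not the program's method.

```python
import math

# Scale an assumed band spectrum to a target A-weighted level. ENMASK works
# at full 1/3 octave-band resolution; octave bands are used here for brevity.

def a_weight_db(f):
    """IEC 61672 A-weighting, in dB, at frequency f (Hz)."""
    ra = (12194.0 ** 2 * f ** 4) / ((f ** 2 + 20.6 ** 2) *
         math.sqrt((f ** 2 + 107.7 ** 2) * (f ** 2 + 737.9 ** 2)) *
         (f ** 2 + 12194.0 ** 2))
    return 20 * math.log10(ra) + 2.0

band_centers = [31.5, 63, 125, 250, 500, 1000, 2000, 4000, 8000]   # Hz
shape_db = [55, 52, 48, 45, 42, 40, 37, 33, 28]                    # assumed unscaled levels

def scale_spectrum_to_a_level(target_dba):
    a_total = 10 * math.log10(sum(10 ** ((L + a_weight_db(f)) / 10)
                                  for f, L in zip(band_centers, shape_db)))
    return [round(L + (target_dba - a_total), 1) for L in shape_db]

print(scale_spectrum_to_a_level(45.0))   # band levels whose A-weighted sum is 45 dBA
```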

    6. Scientific Opportunities with the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Adams, C.; et al.,

      2013-07-28

      In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained 'near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well-developed, world-class experiment whose relevance, importance, and probability of unearthing critical and exciting physics has increased with time.

    7. Notice of Intent to Revise DOE G 413.3-5A, Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2016-02-03

      The proposed revision to this Department of Energy Guide focuses on updating the current guide with the latest terminology and references regarding the Performance Baseline development process. This update also incorporates the latest Secretarial memoranda on project management issued since the last update to DOE O 413.3B, Program and Project Management for the Acquisition of Capital Assets.

    8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

      SciTech Connect (OSTI)

      Not Available

      1992-09-01

      The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbons (C3/C4s) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing, and in coping with the low H2/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

    9. CD-2, Approve Performance Baseline | Department of Energy

      Office of Environmental Management (EM)

      Description: Completion of preliminary design is the first major milestone in the project Execution Phase. The design must be sufficiently mature (refer to DOE O 413.3B, Appendix C, Paragraph 4) at the time of CD-2 approval to provide reasonable assurance that the design will be implementable within the approved PB. The document signed by the CE or PME ...

    10. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      We develop and apply a range of high-performance computational methods and software tools to Earth science projects in support of environmental ...

    11. Sandia National Laboratories: Careers: Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced software research & development; collaborative technologies; computational science and mathematics; high-performance computing; visualization and scientific computing; advanced ...

    12. Cost and Performance Baseline for Fossil Energy Plants Volume...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2), www.netl.doe.gov

    13. Biodiversity Research Institute Mid-Atlantic Baseline Study Webinar

      Broader source: Energy.gov [DOE]

      Carried out by the Biodiversity Research Institute (BRI) and funded by the U.S. Department of Energy Wind and Water Power Technology Office and other partners, the goal of the Mid-Atlantic Baseline...

    14. ENERGY STAR PortfolioManager Baseline Year Instructions

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      using the template you created. Once the report has been generated, download it as an Excel file. Open the downloaded "Baseline Year" report, select all, and copy. In the report spreadsheet, ...

    15. Annual Technology Baseline (Including Supporting Data); NREL (National

      Office of Scientific and Technical Information (OSTI)

      Conference: Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory). Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain

    16. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

      1994-01-01

      This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

    17. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2013-05-15

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience data and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.
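
      The favorability mapping amounts to integrating gridded geoscience layers over the calibration volume. A highly simplified weighted-overlay sketch is shown below; the layer names, weights, random placeholder grids, and 95th-percentile target cutoff are all invented for illustration and are not the project's actual data, weighting scheme, or companion trust mapping.

```python
import numpy as np

# Weighted overlay of normalized geoscience layers into a single
# favorability grid, with the highest-scoring cells flagged as targets.

rng = np.random.default_rng(42)
shape = (50, 50)                                  # placeholder grid over a study area
layers = {
    "temperature_gradient": rng.random(shape),    # stand-in gridded data
    "fault_proximity":      rng.random(shape),
    "stress_alignment":     rng.random(shape),
}
weights = {"temperature_gradient": 0.5, "fault_proximity": 0.3, "stress_alignment": 0.2}

def normalize(a):
    return (a - a.min()) / (a.max() - a.min())

favorability = sum(w * normalize(layers[name]) for name, w in weights.items())
targets = np.argwhere(favorability > np.quantile(favorability, 0.95))
print("top 5% favorability cells:", len(targets))
```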

    18. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

      2015-08-05

      Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting metrics at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These metrics were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
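
      A minimal sketch of the persistence-type baselines mentioned here is shown below. Plain persistence repeats the most recent observation; a clear-sky persistence variant (using an assumed clear-sky profile) persists the clear-sky index instead, which handles the diurnal ramp better. Both are common reference baselines and are not the paper's full benchmark suite.

```python
import numpy as np

# Two simple reference forecasts for PV power; inputs are invented example values.

def persistence(last_obs, horizon):
    """Repeat the last observed power for every step of the horizon."""
    return np.full(horizon, last_obs)

def clearsky_persistence(obs, clear_now, clear_future):
    """Persist the clear-sky index k = obs / clear_now over the horizon."""
    k = obs / clear_now if clear_now > 0 else 0.0
    return k * np.asarray(clear_future)

if __name__ == "__main__":
    clear_future = [650.0, 700.0, 720.0]          # assumed clear-sky PV power (kW)
    print(persistence(480.0, 3))
    print(clearsky_persistence(obs=480.0, clear_now=600.0, clear_future=clear_future))
```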

    19. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodlogy calibration purposes because, in the public domain, it is a highly characterized geothermal systems in the Basin and Range with a considerable amount of geoscience and most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

    20. Baselines For Land-Use Change In The Tropics: Application ToAvoided Deforestation Projects

      SciTech Connect (OSTI)

      Brown, Sandra; Hall, Myrna; Andrasko, Ken; Ruiz, Fernando; Marzoli, Walter; Guerrero, Gabriela; Masera, Omar; Dushku, Aaron; Dejong,Ben; Cornell, Joseph

      2007-06-01

      Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in the applicable pools such as vegetation, detritus, products and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models of the latter category used in the analysis at regional scale are the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Parana State in Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacan, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six) and the LUCS model the least amount of loss (four out of five). Based on simulations of GEOMOD, we found that readily observable physical and biological factors, as well as distance to areas of past disturbance, were each about twice as important as either sociological/demographic or economic/infrastructure factors (less observable) in explaining empirical land-use patterns. We propose, from the lessons learned, that a methodology comprised of three main steps and six tasks can be used to begin developing credible baselines. We also propose that the baselines be projected over a 10-year period because, although projections beyond 10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, an historic land-use change and deforestation estimate is made by determining the analytic domain (size of the region relative to the size of the proposed project), obtaining historic data, analyzing candidate historic baseline drivers, and identifying three to four major drivers. In the second step, a baseline of where deforestation is likely to occur, a potential land-use change (PLUC) map, is produced using a spatial model such as GEOMOD that uses the key drivers from step one. Then rates of deforestation are projected over a 10-year baseline period using any of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. The final step proposes that, at an agreed interval (e.g., +10 years), the baseline assumptions about baseline drivers be re-assessed. This step reviews the viability of the 10-year baseline in light of changes in one or more key baseline drivers (e.g., new roads, new communities, new protected areas, etc.). The potential land-use change map and estimates of rates of deforestation could be redone at the agreed interval, allowing the rates and changes in spatial drivers to be incorporated into a defense of the existing baseline, or derivation of a new baseline projection.
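
      The numerical core of such a baseline is simple even though the spatial modeling is not. As a purely illustrative sketch (not the FAC, LUCS, or GEOMOD algorithms), the following assumes a constant annual deforestation rate taken from historic data and a single aggregate carbon density to project forest area and committed emissions over the 10-year baseline period proposed above; all values are placeholders.

```python
# Minimal sketch of a 10-year deforestation baseline projection.
# Assumes a constant annual loss rate and one aggregate carbon density;
# the spatially explicit models discussed above are far more detailed.

def project_baseline(forest_area_ha, annual_loss_rate, carbon_t_per_ha, years=10):
    """Return per-year projected forest area and cumulative carbon emissions (tC)."""
    projection = []
    cumulative_emissions = 0.0
    area = forest_area_ha
    for year in range(1, years + 1):
        loss = area * annual_loss_rate              # hectares cleared this year
        cumulative_emissions += loss * carbon_t_per_ha
        area -= loss
        projection.append((year, area, cumulative_emissions))
    return projection

# Example: 500,000 ha of forest, 1.2%/yr historic loss rate, 150 tC/ha stocks.
for year, area, emissions in project_baseline(500_000, 0.012, 150.0):
    print(f"year {year:2d}: {area:10.0f} ha remaining, {emissions:12.0f} tC emitted")
```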

    1. India's baseline plan for nuclear energy self-sufficiency.

      SciTech Connect (OSTI)

      Bucher, R .G.; Nuclear Engineering Division

      2009-01-01

      India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits; and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to ensure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second United Nations Conference on the Peaceful Uses of Atomic Energy in 1958. The paper described a three stage plan for a sustainable nuclear energy program consistent with India's limited uranium but abundant thorium natural resources. In the first stage, natural uranium would be used to fuel graphite or heavy water moderated reactors. Plutonium extracted from the spent fuel of these thermal reactors would drive fast reactors in the second stage that would contain thorium blankets for breeding uranium-233 (U-233). In the final stage, this U-233 would fuel thorium burning reactors that would breed and fission U-233 in situ. This three stage blueprint still reigns as the core of India's civil nuclear power program. India's progress in the development of nuclear power, however, has been impacted by its isolation from the international nuclear community for its development of nuclear weapons and consequent refusal to sign the Nuclear Nonproliferation Treaty (NPT). Initially, India was engaged in numerous cooperative research programs with foreign countries; for example, under the 'Atoms for Peace' program, India acquired the Cirus reactor, a 40 MWt research reactor from Canada moderated with heavy water from the United States. India was also actively engaged in negotiations for the NPT. But, on May 18, 1974, India conducted a 'peaceful nuclear explosion' at Pokharan using plutonium produced by the Cirus reactor, abruptly ending the era of international collaboration. 
      India then refused to sign the NPT, which it viewed as discriminatory since it would be required to join as a non-nuclear weapons state. As a result of India's actions, the Nuclear Suppliers Group (NSG) was created in 1975 to establish guidelines 'to apply to nuclear transfers for peaceful purposes to help ensure that such transfers would not be diverted to unsafeguarded nuclear fuel cycle or nuclear explosive activities.' These nuclear export controls have forced India to be largely self-sufficient in all nuclear-related technologies.

    2. Long-Term Stewardship Baseline Report and Transition Guidance

      SciTech Connect (OSTI)

      Kristofferson, Keith

      2001-11-01

      Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy’s (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE’s long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complexwide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE’s cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these documents were the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

    3. Computational Science and Engineering

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Science and Engineering NETL's Computational Science and Engineering competency consists of conducting applied scientific research and developing physics-based simulation models, methods, and tools to support the development and deployment of novel process and equipment designs. Research includes advanced computations to generate information beyond the reach of experiments alone by integrating experimental and computational sciences across different length and time scales. Specific

    4. Multiproject baselines for evaluation of electric power projects

      SciTech Connect (OSTI)

      Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

      2003-03-12

      Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
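
      As a hedged illustration of the arithmetic involved (not the paper's prescribed procedure), a multiproject baseline emissions rate can be thought of as a generation-weighted average over the plants selected for inclusion; the policy questions discussed above are precisely which plants and vintages enter that list and how stringent the resulting rate should be.

```python
# Illustrative sketch: a generation-weighted baseline emissions rate in kg C/kWh
# computed from a chosen subset of grid plants. Plant names and figures are
# placeholders, not data from the eastern India or South Africa case studies.

def baseline_emissions_rate(plants):
    """plants: list of dicts with annual 'generation_kwh' and 'emissions_kg_c'."""
    total_kwh = sum(p["generation_kwh"] for p in plants)
    total_kg_c = sum(p["emissions_kg_c"] for p in plants)
    return total_kg_c / total_kwh

recent_plants = [
    {"name": "coal_unit", "generation_kwh": 2.0e9, "emissions_kg_c": 5.0e8},
    {"name": "gas_cc",    "generation_kwh": 1.5e9, "emissions_kg_c": 1.5e8},
]
rate = baseline_emissions_rate(recent_plants)
print(f"baseline rate: {rate:.3f} kg C/kWh")
# Project credits would then scale as (baseline rate - project rate) * project kWh.
```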

    5. Vehicle Technologies Office Merit Review 2014: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    6. Vehicle Technologies Office Merit Review 2015: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    7. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

      SciTech Connect (OSTI)

      Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

      2013-09-06

      This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
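
      As a minimal sketch of the kind of accuracy check such a protocol exercises, the following compares predicted against metered energy use with two widely used goodness-of-fit statistics (normalized mean bias error and coefficient of variation of the RMSE); the specific metrics and acceptance thresholds in the protocol itself may differ.

```python
# Hedged sketch of prediction-quality metrics for a baseline energy model.
# NMBE and CV(RMSE) are common choices; the protocol's exact statistics may differ.
import math

def nmbe(measured, predicted):
    mean = sum(measured) / len(measured)
    return sum(m - p for m, p in zip(measured, predicted)) / (len(measured) * mean)

def cv_rmse(measured, predicted):
    mean = sum(measured) / len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured))
    return rmse / mean

measured_kwh  = [1020, 980, 1100, 1250, 1400, 1380]   # monthly metered use (placeholder)
predicted_kwh = [1000, 1000, 1080, 1230, 1450, 1350]  # baseline model output (placeholder)
print(f"NMBE     = {nmbe(measured_kwh, predicted_kwh):+.2%}")
print(f"CV(RMSE) = {cv_rmse(measured_kwh, predicted_kwh):.2%}")
```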

    8. Fort Drum integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

      1992-12-01

      The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.
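
      The energy-use-intensity bookkeeping underlying such a baseline reduces to annual energy use per unit floor area by building type and end use. A minimal sketch follows; the categories and numbers are placeholders, not Fort Drum data.

```python
# Small sketch of energy-use-intensity (EUI) calculation by building type.
# All floor areas and consumption figures are placeholders.

buildings = [
    {"type": "barracks",       "floor_area_ft2": 120_000, "annual_kbtu": 9.6e6},
    {"type": "administration", "floor_area_ft2": 45_000,  "annual_kbtu": 3.2e6},
]
for b in buildings:
    eui = b["annual_kbtu"] / b["floor_area_ft2"]   # kBtu per square foot per year
    print(f"{b['type']:>14}: {eui:5.1f} kBtu/ft2-yr")
```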

    9. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

      1993-02-01

      The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

    10. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    11. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      SciTech Connect (OSTI)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    12. DEVELOPMENT OF A COMPUTATIONAL MULTIPHASE FLOW MODEL FOR FISCHER TROPSCH SYNTHESIS IN A SLURRY BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Donna Post Guillen; Tami Grimmett; Anastasia M. Gribik; Steven P. Antal

      2010-09-01

      The Hybrid Energy Systems Testing (HYTEST) Laboratory is being established at the Idaho National Laboratory to develop and test hybrid energy systems with the principal objective to safeguard U.S. Energy Security by reducing dependence on foreign petroleum. A central component of the HYTEST is the slurry bubble column reactor (SBCR) in which the gas-to-liquid reactions will be performed to synthesize transportation fuels using the Fischer Tropsch (FT) process. SBCRs are cylindrical vessels in which gaseous reactants (for example, synthesis gas or syngas) are sparged into a slurry of liquid reaction products and finely dispersed catalyst particles. The catalyst particles are suspended in the slurry by the rising gas bubbles and serve to promote the chemical reaction that converts syngas to a spectrum of longer chain hydrocarbon products, which can be upgraded to gasoline, diesel or jet fuel. These SBCRs operate in the churn-turbulent flow regime, which is characterized by complex hydrodynamics, coupled with reacting flow chemistry and heat transfer, that affect reactor performance. The purpose of this work is to develop a computational multiphase fluid dynamic (CMFD) model to aid in understanding the physico-chemical processes occurring in the SBCR. Our team is developing a robust methodology to couple reaction kinetics and mass transfer into a four-field model (consisting of the bulk liquid, small bubbles, large bubbles and solid catalyst particles) that includes twelve species: (1) CO reactant, (2) H2 reactant, (3) hydrocarbon product, and (4) H2O product in small bubbles, large bubbles, and the bulk fluid. Properties of the hydrocarbon product were specified by vapor liquid equilibrium calculations. The absorption and kinetic models, specifically changes in species concentrations, have been incorporated into the mass continuity equation. The reaction rate is determined based on the macrokinetic model for a cobalt catalyst developed by Yates and Satterfield [1]. The model includes heat generation due to the exothermic chemical reaction, as well as heat removal from a constant temperature heat exchanger. Results of the CMFD simulations (similar to those shown in Figure 1) will be presented.
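
      For orientation only, the rate expression usually attributed to Yates and Satterfield for cobalt FT catalysts has the form -r_CO = a*P_CO*P_H2/(1 + b*P_CO)^2; a sketch of evaluating it with placeholder (not fitted) parameters is shown below. The CMFD model couples such a rate to interphase mass transfer and the energy equation, which is not represented here.

```python
# Hedged sketch of a Yates-and-Satterfield-type macrokinetic rate form for cobalt FT
# catalysts: -r_CO = a*P_CO*P_H2 / (1 + b*P_CO)^2. Parameter values are placeholders.

def ft_co_consumption_rate(p_co_bar, p_h2_bar, a=1.0e-3, b=2.0):
    """CO consumption rate (arbitrary units) from CO and H2 partial pressures in bar."""
    return a * p_co_bar * p_h2_bar / (1.0 + b * p_co_bar) ** 2

# Example: syngas with H2/CO = 2 at roughly 20 bar of reactant partial pressure.
p_co, p_h2 = 6.7, 13.3
print(f"-r_CO = {ft_co_consumption_rate(p_co, p_h2):.3e} (arbitrary units)")
```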

    13. Computing and Computational Sciences Directorate - Computer Science...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Science and Mathematics Division The Computer Science and Mathematics Division (CSMD) is ORNL's premier source of basic and applied research in high-performance computing, ...

    14. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

      SciTech Connect (OSTI)

      Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

      2012-08-15

      The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human-generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and, using automated data analysis techniques, were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW/Hz on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI, including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
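
      A back-of-the-envelope sketch of how such an upper limit follows from a flux-density threshold and the distance to the target (EIRP density = 4*pi*d^2 * S_limit) is given below; the threshold value used is an assumption chosen only to reproduce the order of magnitude quoted above, not the paper's measured sensitivity.

```python
# Sketch: isotropic-transmitter power-density limit from a flux-density threshold.
# The 1.5 mJy threshold is an illustrative assumption, not the paper's value.
import math

PARSEC_M = 3.0857e16
JY = 1e-26  # W m^-2 Hz^-1

def eirp_density_limit(distance_pc, flux_limit_jy):
    d = distance_pc * PARSEC_M
    return 4.0 * math.pi * d**2 * flux_limit_jy * JY   # W/Hz

# Gliese 581 lies roughly 6.3 pc away; assume a ~1.5 mJy detection threshold.
print(f"{eirp_density_limit(6.3, 1.5e-3) / 1e6:.1f} MW/Hz")
```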

    15. Technical Baseline Summary Description for the Tank Farm Contractor

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

    16. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

      SciTech Connect (OSTI)

      Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

      2013-05-15

      This report provides the results of the comprehensive annulus visual inspection for tanks 241- AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

    17. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    18. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

      SciTech Connect (OSTI)

      Katya Le Blanc; Johanna Oxstrand

      2012-04-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned for these previous efforts we are now exploring a more unknown application for computer based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development requirements and for computer-based procedures were identified.

    19. Microsoft PowerPoint - Snippet 4.6 Baseline Control Methods 20140723...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      to an approved performance baseline, including impacts on the project scope, schedule, design, methods, and cost baselines. The BCP represents a change to one or more of the...

    20. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process...

      Energy Savers [EERE]

      2 Integrated Baseline Review (IBR) Process EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process This EVMS Training Snippet sponsored by the Office of Project...

    1. Computation & Simulation > Theory & Computation > Research >...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computation & Simulation: Extensive combinatorial results and ongoing basic...

    2. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    3. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C^3P), a five-year project that focused on answering the question: "Can parallel computers be used to do large-scale scientific computations?" As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C^3P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C^3P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    4. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

    5. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

      SciTech Connect (OSTI)

      Saffer, Shelley I.

      2014-12-01

      This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

    6. Computing and Computational Sciences Directorate - Information...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      cost-effective, state-of-the-art computing capabilities for research and development. ... communicates and manages strategy, policy and finance across the portfolio of IT assets. ...

    7. Special Issue On Estimation Of Baselines And Leakage In CarbonMitigation Forestry Projects

      SciTech Connect (OSTI)

      Sathaye, Jayant A.; Andrasko, Kenneth

      2006-06-01

      There is a growing acceptance that the environmental benefits of forests extend beyond traditional ecological benefits and include the mitigation of climate change. Interest in forestry mitigation activities has led to the inclusion of forestry practices at the project level in international agreements. Climate change activities place new demands on participating institutions to set baselines, establish additionality, determine leakage, ensure permanence, and monitor and verify a project's greenhouse gas benefits. These issues are common to both forestry and other types of mitigation projects. They demand empirical evidence to establish conditions under which such projects can provide sustained long term global benefits. This Special Issue reports on papers that experiment with a range of approaches based on empirical evidence for the setting of baselines and estimation of leakage in projects in developing Asia and Latin America.

    8. High-Level software requirements specification for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      This Software Requirements Specification (SRS) is an as-built document that presents the Tank Waste Remediation System (TWRS) Controlled Baseline Database (TCBD) in its current state. It was originally known as the Performance Measurement Control System (PMCS). Conversion to the new system name has not occurred within the current production system. Therefore, for simplicity, all references to TCBD are equivalent to PMCS references. This SRS will reference the PMCS designator from this point forward to capture the as-built SRS. This SRS is written at a high level and is intended to provide the design basis for the PMCS. The PMCS was first released as the electronic data repository for cost, schedule, and technical administrative baseline information for the TWRS Program. During its initial development, the PMCS was accepted by the customer, TWRS Business Management, with no formal documentation to capture the initial requirements.

    9. Level 3 Baseline Risk Assessment for Building 3515 at Oak Ridge National Lab., Oak Ridge, TN

      SciTech Connect (OSTI)

      Wollert, D.A.; Cretella, F.M.; Golden, K.M.

      1995-08-01

      The baseline risk assessment for the Fission Product Pilot Plant (Building 3515) at the Oak Ridge National Laboratory (ORNL) provides the Decontamination and Decommissioning (D&D) Program at ORNL and Building 3515 project managers with information concerning the results of the Level 3 baseline risk assessment performed for this building. The document was prepared under Work Breakdown Structure 1.4.12.6.2.01 (Activity Data Sheet 3701, Facilities D&D) and includes information on the potential long-term impacts to human health and the environment if no action is taken to remediate Building 3515. Information provided in this document forms the basis for the development of remedial alternatives and the no-action risk portion of the Engineering Evaluation/Cost Analysis report.

    10. Recent developments in large-scale finite-element Lagrangian hydrocode technology. [DYNA2D/DYNA3D computer code]

      SciTech Connect (OSTI)

      Goudreau, G.L.; Hallquist, J.O.

      1981-10-01

      The state of Lagrangian hydrocodes for computing the large deformation dynamic response of inelastic continua is reviewed in the context of engineering computation at the Lawrence Livermore National Laboratory, USA, and the DYNA2D/DYNA3D finite element codes. The emphasis is on efficiency and computational cost, favoring the simplest elements with explicit time integration. The two-dimensional four node quadrilateral and the three-dimensional hexahedron with one point quadrature are advocated as superior to other more expensive choices. Important auxiliary capabilities are a cheap but effective hourglass control, slidelines/planes with void opening/closure, and rezoning. Both strain measures and material formulation are seen as a homogeneous stress point problem, and a flexible material subroutine interface admits both incremental and total strain formulation, dependent on internal energy or an arbitrary set of other internal variables. Vectorization on Class VI computers such as the CRAY-1 is a simple exercise for optimally organized primitive element formulations. Some examples of large scale computation are illustrated, including continuous tone graphic representation.

    11. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

      SciTech Connect (OSTI)

      Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

      2015-07-07

      Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

    12. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

      SciTech Connect (OSTI)

      1995-04-01

      The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

    13. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

      SciTech Connect (OSTI)

      Wollenberg, H.A.; Smith, A.R.

      1991-10-01

      A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12--16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.

    14. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

      SciTech Connect (OSTI)

      J. Francfort; D. Karner

      2006-04-01

      The U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEV). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed track testing to document the HEV’s fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events and fuel use are recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.
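
      The life-cycle-cost roll-up described above is, at its core, total operating cost divided by accumulated mileage. A minimal sketch with placeholder figures (not AVTA results) follows.

```python
# Minimal sketch of operating-cost-per-mile bookkeeping over a 160,000-mile fleet test.
# All figures are placeholders, not AVTA data.

def cost_per_mile(miles, fuel_gal, fuel_price, maintenance_usd, repairs_usd):
    total = fuel_gal * fuel_price + maintenance_usd + repairs_usd
    return total / miles

cpm = cost_per_mile(miles=160_000, fuel_gal=3_900, fuel_price=2.50,
                    maintenance_usd=2_400, repairs_usd=1_100)
print(f"operating cost: ${cpm:.3f} per mile")
```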

    15. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

      SciTech Connect (OSTI)

      Muller, Isabelle S.; Pegg, Ian L.; Gan, Hao; Buechele, Andrew; Rielley, Elizabeth; Bazemore, Gina; Cecil, Richard; Hight, Kenneth; Mooers, Cavin; Lai, Shan-Tao T.; Kruger, Albert A.

      2015-06-18

      The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

    16. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

      SciTech Connect (OSTI)

      Swita, W.R.

      1998-01-09

      This document provides a summary of the Tank Waste Remediation System (TWRS) Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost), developed to demonstrate Readiness-to-Proceed (RTP) in support of the TWRS Phase 1B mission. This Updated Baseline is the proposed TWRS plan to execute and measure the mission work scope. This document and other supporting data demonstrate that the TWRS Project Hanford Management Contract (PHMC) team is prepared to fully support Phase 1B by executing the following scope, schedule, and cost baseline activities: Deliver the specified initial low-activity waste (LAW) and high-level waste (HLW) feed batches in a consistent, safe, and reliable manner to support private contractors' operations starting in June 2002; Deliver specified subsequent LAW and HLW feed batches during Phase 1B in a consistent, safe, and reliable manner; Provide for the interim storage of immobilized HLW (IHLW) products and the disposal of immobilized LAW (ILAW) products generated by the private contractors; Provide for disposal of byproduct wastes generated by the private contractors; and Provide the infrastructure to support construction and operations of the private contractors' facilities.

    17. Cognitive Computing for Security.

      SciTech Connect (OSTI)

      Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

      2015-12-01

      Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

    18. NREL: MIDC/SRRL Baseline Measurement System (39.74 N, 105.18 W, 1829 m,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      GMT-7) Solar Radiation Research Laboratory Baseline Measurement System

    19. Computing and Computational Sciences Directorate - Computer Science...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Science and Mathematics Division Citation: For exemplary administrative secretarial support to the Computer Science and Mathematics Division and to the ORNL ...

    20. Opportunities for Russian Nuclear Weapons Institute developing computer-aided design programs for pharmaceutical drug discovery. Final report

      SciTech Connect (OSTI)

      1996-09-23

      The goal of this study is to determine whether physicists at the Russian Nuclear Weapons Institute can profitably service the need for computer aided drug design (CADD) programs. The Russian physicists' primary competitive advantages are their ability to write particularly efficient code able to work with limited computing power; a history of working with very large, complex modeling systems; an extensive knowledge of physics and mathematics; and price competitiveness. Their primary competitive disadvantages are their lack of biology background, together with cultural and geographic issues. The first phase of the study focused on defining the competitive landscape, primarily through interviews with and literature searches on the key providers of CADD software. The second phase focused on users of CADD technology to determine deficiencies in the current product offerings, to understand what product they most desired, and to define the potential demand for such a product.

    1. Baseline review of the U.S. LHC Accelerator project

      SciTech Connect (OSTI)

      1998-02-01

      The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23-26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. US scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the US LHC Accelerator project team. Cost estimates for each subsystem of the US LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of US deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as the baseline scope with the firm expectation that additional scope will be restored to the baseline as the project moves forward. The Committee supports the FY 1998 work plan and scope of deliverables but strongly recommends the reevaluation of costs and schedules with the goal of producing a plan for restoring the US deliverables to CERN. This plan should provide precise dates when scope decisions must be made.

    2. CLAMR (Compute Language Adaptive Mesh Refinement)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CLAMR (Compute Language Adaptive Mesh Refinement) is being developed as a DOE...

    3. Baseline Assessment of TREAT for Modeling and Analysis Needs

      SciTech Connect (OSTI)

      Bess, John Darrell; DeHart, Mark David

      2015-10-01

      TREAT is an air-cooled, graphite moderated, thermal, heterogeneous test facility designed to evaluate reactor fuels and structural materials under conditions simulating various types of nuclear excursions and transient undercooling situations that could occur in a nuclear reactor. After 21 years in a standby mode, TREAT is being re-activated to revive transient testing capabilities. Given the time elapsed and the concurrent loss of operating experience, current generation and advanced computational methods are being applied to begin TREAT modeling and simulation prior to renewed at-power operations. Such methods have limited value in predicting the behavior of TREAT without proper validation. Hence, the U.S. DOE has developed a number of programs to support development of benchmarks for both critical and transient operations. Extensive effort has been expended at INL to collect detailed descriptions, drawings and specifications for all aspects of TREAT, and to resolve conflicting data found through this process. This report provides a collection of these data, with updated figures that are significantly more readable than historic drawings and illustrations, compositions, and dimensions based on the best available sources. This document is not, nor should it be considered to be, a benchmark report. Rather, it is intended to provide one-stop shopping, to the extent possible, for other work that seeks to prepare detailed, accurate models of the core and its components. Given the nature of the variety of historic documents available and the loss of institutional memory, the only completely accurate database of TREAT data is TREAT itself. Unfortunately, disassembly of TREAT for inspection, assay, and measurement is highly unlikely. Hence the data provided herein is intended to serve as a best-estimate substitute.

    4. Estimating baseline risks from biouptake and food ingestion at a contaminated site

      SciTech Connect (OSTI)

      MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

      1993-11-01

      Biouptake of contaminants and subsequent human exposure via food ingestion represents a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area does not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.
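
      For context, a screening-level food-ingestion intake is commonly computed with an EPA-style chronic daily intake equation, CDI = C*IR*EF*ED/(BW*AT); the sketch below uses placeholder values and is not the site-specific biouptake model applied in the Missouri assessment.

```python
# Hedged sketch of a generic chronic-daily-intake calculation for a food-ingestion
# pathway. All parameter values are placeholders, not site data.

def chronic_daily_intake(conc_mg_per_kg, ingestion_kg_per_day, exposure_freq_d_per_yr,
                         exposure_duration_yr, body_weight_kg, averaging_time_d):
    return (conc_mg_per_kg * ingestion_kg_per_day * exposure_freq_d_per_yr *
            exposure_duration_yr) / (body_weight_kg * averaging_time_d)

# Example: fish tissue at 0.2 mg/kg, 54 g/day, 350 d/yr for 30 years, 70 kg adult.
cdi = chronic_daily_intake(0.2, 0.054, 350, 30, 70.0, 30 * 365)
print(f"CDI = {cdi:.2e} mg/(kg*day)")
# Multiply by an oral slope factor (cancer) or divide by a reference dose (noncancer)
# to express the result as an incremental risk or a hazard quotient.
```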

    5. Data Management Guide: Integrated Baseline System (IBS). Version 2.1

      SciTech Connect (OSTI)

      Bower, J.C. [Bower Software Services, Kennewick, Washington (United States)]; Burford, M.J.; Downing, T.R.; Moise, M.C.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States)]

      1995-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

    6. Energy baseline and energy efficiency resource opportunities for the Forest Products Laboratory, Madison, Wisconsin

      SciTech Connect (OSTI)

      Mazzucchi, R.P.; Richman, E.E.; Parker, G.B.

      1993-08-01

      This report provides recommendations to improve the energy use efficiency at the Forest Products Laboratory in Madison, Wisconsin. The assessment focuses upon the four largest buildings and central heating plant at the facility comprising a total of approximately 287,000 square feet. The analysis is comprehensive in nature, intended primarily to determine what if any energy efficiency improvements are warranted based upon the potential for cost-effective energy savings. Because of this breadth, not all opportunities are developed in detail; however, baseline energy consumption data and energy savings concepts are described to provide a foundation for detailed investigation and project design where warranted.

    7. computing | National Nuclear Security Administration

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computing NNSA Announces Procurement of Penguin Computing Clusters to Support Stockpile Stewardship at National Labs The National Nuclear Security Administration's (NNSA's) Lawrence Livermore National Laboratory today announced the awarding of a subcontract to Penguin Computing - a leading developer of high-performance Linux cluster computing systems based in Silicon Valley - to bolster computing for stockpile

    8. Computed tomography and optical remote sensing: Development for the study of indoor air pollutant transport and dispersion

      SciTech Connect (OSTI)

      Drescher, A.C.

      1995-06-01

      This thesis investigates the mixing and dispersion of indoor air pollutants under a variety of conditions using standard experimental methods. It also extensively tests and improves a novel technique for measuring contaminant concentrations that has the potential for more rapid, non-intrusive measurements with higher spatial resolution than previously possible. Experiments conducted in a sealed room support the hypothesis that the mixing time of an instantaneously released tracer gas is inversely proportional to the cube root of the mechanical power transferred to the room air. One table-top and several room-scale experiments are performed to test the concept of employing optical remote sensing (ORS) and computed tomography (CT) to measure steady-state gas concentrations in a horizontal plane. Various remote sensing instruments, scanning geometries and reconstruction algorithms are employed. Reconstructed concentration distributions based on existing iterative CT techniques contain a high degree of unrealistic spatial variability and do not agree well with simultaneously gathered point-sample data.
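
      The mixing-time scaling supported by the sealed-room experiments, t_mix proportional to P^(-1/3), can be illustrated with a two-line calculation; the proportionality constant below is room-specific and purely illustrative.

```python
# Tiny sketch of the scaling relation t_mix = k * P^(-1/3).
# The constant k depends on the room and tracer setup and is a placeholder here.

def mixing_time(power_w, k=60.0):
    return k * power_w ** (-1.0 / 3.0)   # minutes, for the assumed k

for p in (5.0, 50.0, 500.0):
    print(f"P = {p:6.1f} W  ->  t_mix ~ {mixing_time(p):5.1f} min")
```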

    9. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research: Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy. Fulfilling the potential of emerging computing systems and architectures beyond today's tools and techniques to deliver

    10. System maintenance verification and validation plan for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      TWRS Controlled Baseline Database, formally known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the verification and validation approach for system documentation changes within the database system.

    11. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      EVMS Training Snippet: 3.1A Integrated Master Schedule (IMS) Initial Baseline Review EVMS Training Snippet: 4.6 Baseline Control Methods EVMS Training Snippet: 4.9 High-level EVM...

    12. Waste Assessment Baseline for the IPOC Second Floor, West Wing

      SciTech Connect (OSTI)

      McCord, Samuel A

      2015-04-01

      Following a building-wide waste assessment in September 2014, and subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015, to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately 6 months.
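
      The diversion-rate arithmetic behind a Zero Waste baseline is straightforward: the weight fraction of total waste kept out of the landfill. A minimal sketch with placeholder stream weights (not the IPOC assessment results) follows.

```python
# Simple sketch of a waste diversion-rate calculation; figures are placeholders.

streams_kg = {"landfill": 180.0, "recycling": 95.0, "compost": 40.0}
diverted = streams_kg["recycling"] + streams_kg["compost"]
total = sum(streams_kg.values())
print(f"diversion rate: {diverted / total:.1%}")
```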

    13. Comparison between the Strength Levels of Baseline Nuclear-Grade Graphite and Graphite Irradiated in AGC-2

      SciTech Connect (OSTI)

      Carroll, Mark Christopher

      2015-07-01

      This report details the initial comparison of mechanical strength properties between the cylindrical nuclear-grade graphite specimens irradiated in the second Advanced Graphite Creep (AGC-2) experiment with the established baseline, or unirradiated, mechanical properties compiled in the Baseline Graphite Characterization program. The overall comparative analysis will describe the development of an appropriate test protocol for irradiated specimens, the execution of the mechanical tests on the AGC-2 sample population, and will further discuss the data in terms of developing an accurate irradiated property distribution in the limited amount of irradiated data by leveraging the considerably larger property datasets being captured in the Baseline Graphite Characterization program. Integrating information on the inherent variability in nuclear-grade graphite with more complete datasets is one of the goals of the VHTR Graphite Materials program. Between “sister” specimens, or specimens with the same geometry machined from the same sub-block of graphite from which the irradiated AGC specimens were extracted, and the Baseline datasets, a comprehensive body of data will exist that can provide both a direct and indirect indication of the full irradiated property distributions that can be expected of irradiated nuclear-grade graphite while in service in a VHTR system. While the most critical data will remain the actual irradiated property measurements, expansion of this data into accurate distributions based on the inherent variability in graphite properties will be a crucial step in qualifying graphite for nuclear use as a structural material in a VHTR environment.

    14. Compute nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      A more detailed hierarchical map of the topology of a compute node is available on this page.

    15. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      2016 Computer System, Cluster, and Networking Summer Institute, an undergraduate summer educational program (see http:isti.lanl.gov).

    16. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Moving forward into the exascale era, ...

    17. TRIDAC host computer functional specification

      SciTech Connect (OSTI)

      Hilbert, S.M.; Hunter, S.L.

      1983-08-23

      The purpose of this document is to outline the baseline functional requirements for the Triton Data Acquisition and Control (TRIDAC) Host Computer Subsystem. The requirements presented in this document are based upon systems that currently support both the SIS and the Uranium Separator Technology Groups in the AVLIS Program at the Lawrence Livermore National Laboratory and upon the specific demands associated with the extended safe operation of the SIS Triton Facility.

    18. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2011-06-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at INL. Additionally, INL has a desire to see how its emissions compare with similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and distribution losses) is the largest contributor to INL's GHG inventory, with over 50% of the CO2e emissions; (2) Other sources with high emissions were stationary combustion (facility fuels), waste disposal (including fugitive emissions from the onsite landfill and contracted disposal), mobile combustion (fleet fuels), employee commuting, and business air travel; and (3) Sources with low emissions were wastewater treatment (onsite and contracted), fugitive emissions from refrigerants, and business ground travel (in personal and rental vehicles). This report details the methods behind quantifying INL's GHG inventory and discusses lessons learned on better practices by which information important to tracking GHGs can be tracked and recorded. It is important to note that because this report differentiates between those portions of INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.
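      For readers unfamiliar with the Scope 1/2/3 breakdown used above, the inventory is essentially a roll-up of CO2-equivalent emissions by scope and source. The Python sketch below shows that roll-up with placeholder totals; the numbers are illustrative and are not INL data, although the category names mirror those in the report.

      # Minimal sketch of rolling source-level emissions up into a Scope 1/2/3
      # GHG inventory in CO2-equivalent terms. All values are placeholders.

      from collections import defaultdict

      emissions = {  # (scope, source) -> metric tons CO2e (placeholder values)
          (1, "stationary combustion"): 10_000.0,
          (1, "mobile combustion (fleet)"): 5_000.0,
          (1, "fugitive (landfill, refrigerants)"): 8_000.0,
          (2, "purchased electricity"): 60_000.0,
          (3, "electricity T&D losses"): 4_000.0,
          (3, "employee commuting"): 7_000.0,
          (3, "business air travel"): 3_000.0,
          (3, "contracted waste and wastewater"): 2_000.0,
      }

      by_scope = defaultdict(float)
      for (scope, _source), mt_co2e in emissions.items():
          by_scope[scope] += mt_co2e

      total = sum(by_scope.values())
      for scope in sorted(by_scope):
          print(f"Scope {scope}: {by_scope[scope]:>10,.0f} MT CO2e "
                f"({by_scope[scope] / total:.0%} of total)")
      print(f"Total:   {total:>10,.0f} MT CO2e")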

    19. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2010-09-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at the INL. Additionally, the INL has a desire to see how its emissions compare with similar institutions, including other DOE-sponsored national laboratories. Executive Order 13514 requires that federally-sponsored agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL’s FY08 GHG inventory was calculated according to methodologies identified in Federal recommendations and an as-yet-unpublished Technical and Support Document (TSD) using operational control boundary. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL’s organizational boundaries but are a consequence of INL’s activities). This inventory found that INL generated a total of 114,256 MT of CO2-equivalent emissions during fiscal year 2008 (FY08). The following conclusions were made from looking at the results of the individual contributors to INL’s baseline GHG inventory: • Electricity is the largest contributor to INL’s GHG inventory, with over 50% of the net anthropogenic CO2e emissions • Other sources with high emissions were stationary combustion, fugitive emissions from the onsite landfill, mobile combustion (fleet fuels) and the employee commute • Sources with low emissions were contracted waste disposal, wastewater treatment (onsite and contracted) and fugitive emissions from refrigerants. This report details the methods behind quantifying INL’s GHG inventory and discusses lessons learned on better practices by which information important to tracking GHGs can be tracked and recorded. It is important to stress that the methodology behind this inventory followed guidelines that have not yet been formally adopted. Thus, some modification of the conclusions may be necessary as additional guidance is received. Further, because this report differentiates between those portions of the INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.

    20. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

      SciTech Connect (OSTI)

      Singleton, Kristin M.

      2015-01-07

      The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as a part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical results from 13 shallow zone (0 to 15 ft. below ground surface) sampling locations were collected to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum concentration levels was used to determine vadose zone contamination impacts on groundwater. Waste Management Area C is the first of the Hanford tank farms to begin the closure planning process. The current baseline risk assessment will provide valuable information for making corrective actions and closure decisions for WMA C, and will also support the planning for future tank farm soil investigation and baseline risk assessments.

    1. Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      OF IMAGING We require: (1) a high quality ion beam, (2) computer vision and image processing techniques for isolating and reconstructing the beam, and (3) wavelengths suitable...

    2. ATIC as a testbed for the ACCESS baseline calorimeter

      SciTech Connect (OSTI)

      Isbert, J.; Authement, J.; Coleman, J.; Guzik, T. G.; Granger, D.; Lockwood, R.; McMorris, A.; Mock, L.; Oubre, C.; Panasyuk, M.; Peck, J.; Wefel, J. P.; Adams, J. H. Jr.; Boberg, P. R.; Dion-Schwarz, C.; Kroeger, R.; Bashindzhagyan, G. B.; Khein, L.; Samsonov, G. A.; Zatsepin, V. I.

      1999-01-22

      The Advanced Thin Ionization Calorimeter (ATIC) balloon experiment is designed to measure the spectrum of individual elements from H through Fe up to a total energy >10{sup 14} eV. To accomplish this goal, ATIC incorporates a Silicon matrix detector composed of more than 4,000 pixels to measure the incident particle charge in the presence of backscatter background, three plastic scintillator hodoscopes to provide an event trigger as well as a backup measurement of the particle charge and trajectory, a 3/4 interaction length carbon target and a fully active ionization calorimeter composed of 22 radiation lengths of Bismuth Germanate (BGO) crystals. This detector complement is very similar to the baseline calorimeter for the Advanced Cosmic Ray Composition Experiment for the Space Station, ACCESS. The ATIC flights can be used to evaluate such a calorimeter in the cosmic ray 'beam.' ATIC integration is currently underway with a first flight expected during 1999. This talk will discuss ATIC as it applies to ACCESS.

    3. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER{reg_sign}, and VAC-PAC{reg_sign}. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER{reg_sign} uses solid needles for descaling activities. These hand tools are used with the VAC-PAC{reg_sign} vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    4. Ultra-high pressure water jet: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky{trademark} pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure. These were dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems.

    5. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide tipped bits. The bits are designed to remove concrete in 318 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended. Because of the outdoor environment where the testing demonstration took place, results may be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    6. LTC vacuum blasting machine (metal) baseline report: Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    7. LTC vacuum blasting machine (concrete): Baseline report: Greenbook (Chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because of the outdoor environment where the testing demonstration took place. This may cause the results to be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    8. Computing Information

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      From here you can find information relating to: obtaining the right computer accounts; using NIC terminals; using BooNE's Computing Resources, including choosing your desktop, Kerberos, AFS, printing, and recommended applications for various common tasks; running CPU- or IO-intensive programs (batch jobs); commonly encountered problems; computing support within BooNE; bringing a computer to FNAL, or purchasing a new one; laptops; and the Computer Security Program Plan for MiniBooNE.

    9. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2011-09-23

      This guide identifies key PB elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development. Supersedes DOE G 413.3-5.

    10. Sampling designs for geochemical baseline studies in the Colorado oil shale region: a manual for practical application

      SciTech Connect (OSTI)

      Klusman, R. W.; Ringrose, C. D.; Candito, R. J.; Zuccaro, B.; Rutherford, D. W.; Dean, W. E.

      1980-06-01

      This manual presents a rationale for sampling designs, and results of geochemical baseline studies in the Colorado portion of the oil-shale region. The program consists of a systematic trace element study of soils, stream sediments, and plants carried out in a way to be conservative of human and financial resources and yield maximum information. Extension of this approach to other parameters, other locations, and to environmental baseline studies in general is a primary objective. A baseline for any geochemical parameter can be defined as the concentration of that parameter in a given medium such as soil, the range of its concentration, and the geographic scale of variability. In air quality studies, and to a lesser extent for plants, the temporal scale of variability must also be considered. In studies of soil, the temporal variability does not become a factor until such time that a study is deemed necessary to evaluate whether or not there have been changes in baseline levels as a result of development. The manual is divided into five major parts. The first is a suggested sampling protocol which is presented in an outline form for guiding baseline studies in this area. The second section is background information on the physical features of the area of study, trace elements of significance occurring in oil shale, and the sample media used in these studies. The third section is concerned primarily with sampling design and its application to the geochemical studies of the oil shale region. The last sections, in the form of appendices, provide actual data and illustrate, in a systematic manner, the calculations performed to obtain the various summary data. The last segment of the appendices is a more academic discussion of the geochemistry of trace elements and the parameters of importance influencing their behavior in natural systems.
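      As an illustration of the baseline quantities the manual describes (a central concentration, an expected range, and a measure of variability), the Python sketch below summarizes a set of trace-element measurements. The lognormal convention (geometric mean and geometric standard deviation) and the sample values are assumptions made for this example, not the manual's prescribed procedure.

      # Minimal sketch of reducing raw sample concentrations to baseline
      # summary quantities under a lognormal assumption, which is a common
      # convention for trace elements. Sample values are hypothetical.

      import math
      import statistics

      def baseline_summary(concentrations_ppm):
          logs = [math.log(c) for c in concentrations_ppm]
          gm = math.exp(statistics.mean(logs))        # geometric mean
          gsd = math.exp(statistics.stdev(logs))      # geometric std. deviation
          # Expected 95% range under the lognormal assumption.
          return {"geometric_mean": gm,
                  "expected_range": (gm / gsd**2, gm * gsd**2),
                  "geometric_sd": gsd}

      # Hypothetical molybdenum concentrations (ppm) in surface soil samples.
      print(baseline_summary([1.2, 0.8, 2.5, 1.6, 0.9, 3.1, 1.4, 2.0]))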

    11. COMPARISON OF THREE METHODS TO PROJECT FUTURE BASELINE CARBON EMISSIONS IN TEMPERATE RAINFOREST, CURINANCO, CHILE

      SciTech Connect (OSTI)

      Patrick Gonzalez; Antonio Lara; Jorge Gayoso; Eduardo Neira; Patricio Romero; Leonardo Sotomayor

      2005-07-14

      Deforestation of temperate rainforests in Chile has decreased the provision of ecosystem services, including watershed protection, biodiversity conservation, and carbon sequestration. Forest conservation can restore those ecosystem services. Greenhouse gas policies that offer financing for the carbon emissions avoided by preventing deforestation require a projection of future baseline carbon emissions for an area if no forest conservation occurs. For a proposed 570 km{sup 2} conservation area in temperate rainforest around the rural community of Curinanco, Chile, we compared three methods to project future baseline carbon emissions: extrapolation from Landsat observations, Geomod, and Forest Restoration Carbon Analysis (FRCA). Analyses of forest inventory and Landsat remote sensing data show 1986-1999 net deforestation of 1900 ha in the analysis area, proceeding at a rate of 0.0003 y{sup -1}. The gross rate of loss of closed natural forest was 0.042 y{sup -1}. In the period 1986-1999, closed natural forest decreased from 20,000 ha to 11,000 ha, with timber companies clearing natural forest to establish plantations of non-native species. Analyses of previous field measurements of species-specific forest biomass, tree allometry, and the carbon content of vegetation show that the dominant native forest type, broadleaf evergreen (bosque siempreverde), contains 370 {+-} 170 t ha{sup -1} carbon, compared to the carbon density of non-native Pinus radiata plantations of 240 {+-} 60 t ha{sup -1}. The 1986-1999 conversion of closed broadleaf evergreen forest to open broadleaf evergreen forest, Pinus radiata plantations, shrublands, grasslands, urban areas, and bare ground decreased the carbon density from 370 {+-} 170 t ha{sup -1} carbon to an average of 100 t ha{sup -1} (maximum 160 t ha{sup -1}, minimum 50 t ha{sup -1}). Consequently, the conversion released 1.1 million t carbon. These analyses of forest inventory and Landsat remote sensing data provided the data to evaluate the three methods to project future baseline carbon emissions. Extrapolation from Landsat change detection uses the observed rate of change to estimate change in the near future. Geomod is a software program that models the geographic distribution of change using a defined rate of change. FRCA is an integrated spatial analysis of forest inventory, biodiversity, and remote sensing that produces estimates of forest biodiversity and forest carbon density, spatial data layers of future probabilities of reforestation and deforestation, and a projection of future baseline forest carbon sequestration and emissions for an ecologically-defined area of analysis. For the period 1999-2012, extrapolation from Landsat change detection estimated a loss of 5000 ha and 520,000 t carbon from closed natural forest; Geomod modeled a loss of 2500 ha and 250 000 t; FRCA projected a loss of 4700 {+-} 100 ha and 480,000 t (maximum 760,000 t, minimum 220,000 t). Concerning labor time, extrapolation for Landsat required 90 actual days or 120 days normalized to Bachelor degree level wages; Geomod required 240 actual days or 310 normalized days; FRCA required 110 actual days or 170 normalized days. Users experienced difficulties with an MS-DOS version of Geomod before turning to the Idrisi version. For organizations with limited time and financing, extrapolation from Landsat change provides a cost-effective method. 
Organizations with more time and financing could use FRCA, the only method that calculates the deforestation rate as a dependent variable rather than assuming a deforestation rate as an independent variable. This research indicates that best practices for the projection of baseline carbon emissions include integration of forest inventory and remote sensing tasks from the beginning of the analysis, definition of an analysis area using ecological characteristics, use of standard and widely used geographic information systems (GIS) software applications, and the use of species-specific allometric equations and wood densities developed for local species.
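      The simplest of the three methods, extrapolation from observed Landsat change, can be sketched as a constant-rate projection of forest loss converted to carbon using mean density figures. The Python sketch below uses round numbers quoted in the abstract purely for illustration; it is not the study's implementation and will not reproduce its published estimates, which rest on spatially explicit data and species-specific allometry.

      # Minimal sketch of the "extrapolation from Landsat observations"
      # approach: project closed-forest loss at the historically observed
      # rate and convert the lost area to carbon with mean densities.

      def extrapolate_baseline_emissions(area_ha, annual_loss_rate, years,
                                         density_before_t_ha, density_after_t_ha):
          """Return (area_lost_ha, carbon_released_t) under constant-rate loss."""
          area_remaining = area_ha * (1.0 - annual_loss_rate) ** years
          area_lost = area_ha - area_remaining
          carbon_released = area_lost * (density_before_t_ha - density_after_t_ha)
          return area_lost, carbon_released

      # Closed natural forest in 1999 (~11,000 ha), observed gross loss rate of
      # closed forest (~0.042 / yr), projected over 1999-2012 (13 years), with
      # carbon density falling from ~370 to ~100 t/ha on conversion. These are
      # illustrative inputs only.
      lost, released = extrapolate_baseline_emissions(
          area_ha=11_000, annual_loss_rate=0.042, years=13,
          density_before_t_ha=370.0, density_after_t_ha=100.0)
      print(f"Projected closed-forest loss: {lost:,.0f} ha")
      print(f"Projected baseline emissions: {released:,.0f} t C")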

    12. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The TRACC Computational Clusters: With the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster that has now been named Phoenix. Zephyr was acquired from Atipa technologies, and it is a 92-node system with each node having two AMD

    13. Computer Architecture Lab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The goal of the Computer Architecture Laboratory (CAL) is to engage in research and development into energy-efficient and effective processor and memory architectures for DOE's Exascale program. CAL coordinates hardware architecture R&D activities across the DOE. CAL is a joint NNSA/SC activity involving Sandia National Laboratories (CAL-Sandia) and

    14. Quality Assurance Baseline Assessment Report to Los Alamos National Laboratory Analytical Chemistry Operations

      SciTech Connect (OSTI)

      Jordan, R. A.

      1998-09-01

      This report summarizes observations that were made during a Quality Assurance (QA) Baseline Assessment of the Nuclear Materials Technology Analytical Chemistry Group (NMT-1). The Quality and Planning personnel, for NMT-1, are spending a significant amount of time transitioning out of their roles of environmental oversight into production oversight. A team from the Idaho National Engineering and Environmental Laboratory Defense Program Environmental Surety Program performed an assessment of the current status of the QA Program. Several Los Alamos National Laboratory Analytical Chemistry procedures were reviewed, as well as Transuranic Waste Characterization Program (TWCP) QA documents. Checklists were developed and the assessment was performed according to an Implementation Work Plan, INEEL/EXT-98-00740.

    15. Baseline scheme for polarization preservation and control in the MEIC ion complex

      SciTech Connect (OSTI)

      Derbenev, Yaroslav S.; Lin, Fanglei; Morozov, Vasiliy; Zhang, Yuhong; Kondratenko, Anatoliy; Kondratenko, M A; Filatov, Yury

      2015-09-01

      The scheme for preservation and control of the ion polarization in the Medium-energy Electron-Ion Collider (MEIC) has been under active development in recent years. The figure-8 configuration of the ion rings provides a unique capability to control the polarization of any ion species including deuterons by means of "weak" solenoids rotating the particle spins by small angles. Insertion of "weak" solenoids into the magnetic lattices of the booster and collider rings solves the problem of polarization preservation during acceleration of the ion beam. Universal 3D spin rotators designed on the basis of "weak" solenoids allow one to obtain any polarization orientation at an interaction point of MEIC. This paper presents the baseline scheme for polarization preservation and control in the MEIC ion complex.

    16. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

      SciTech Connect (OSTI)

      Joseph, Earl C.; Conway, Steve; Dekate, Chirag

      2013-09-30

      This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

    17. 2008 CHP Baseline Assessment and Action Plan for the California Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This 2008 report provides an updated baseline assessment and action plan for combined heat and power (CHP) in California and identifies hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC). (chp_california_2008.pdf)

    18. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process

      Broader source: Energy.gov [DOE]

      This EVMS Training Snippet sponsored by the Office of Project Management (PM) covers the Integrated Baseline Review (IBR) process. 

    19. Microsoft PowerPoint - Snippet 3.1A IMS Initial Baseline Review...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This snippet is recommended whenever a schedule baseline is created or revised. The Contract is the prevailing document regarding what Earned Value Management ...

    20. U.S. Department of Energy Performance Baseline Guide - DOE Directives...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      5A, U.S. Department of Energy Performance Baseline Guide by Brian Kong Functional areas: Program Management, Project Management, Work Processes This guide identifies key PB...

    1. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute node configuration (quad-core AMD Opteron processor): 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB DDR3 800 MHz memory per node. Peak Gflop rate: 9.2 Gflops/core, 36.8 Gflops/node, 352 Tflops for the entire machine. Each core has its own L1 and L2 caches, with 64 KB and 512 KB respectively; a 2 MB L3 cache is shared among the 4 cores. Compute node software: by default the compute nodes run a restricted low-overhead
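      The headline figures in the configuration above follow from simple multiplication; the short Python check below reproduces the totals (total cores, per-node Gflops, and machine peak) from the per-core numbers.

      # Quick arithmetic check of the configuration figures quoted above.
      nodes = 9_572
      cores_per_node = 4
      gflops_per_core = 9.2

      total_cores = nodes * cores_per_node                # 38,288 cores
      gflops_per_node = cores_per_node * gflops_per_core  # 36.8 Gflops/node
      peak_tflops = nodes * gflops_per_node / 1_000       # ~352 Tflops

      print(total_cores, gflops_per_node, round(peak_tflops, 1))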

    2. Cost and Performance Comparison Baseline for Fossil Energy Plants...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      energy security. A broad portfolio of technologies is being developed within the Clean Coal Program to accomplish this objective. Ever increasing technological enhancements...

    3. Kenya-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Government Partner: Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency. Sector: Energy. Topics: Implementation, Low emission development planning. Program...

    4. Vietnam-Danish Government Baseline Workstream | Open Energy Informatio...

      Open Energy Info (EERE)

      Government Partner: Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency. Sector: Energy. Topics: Implementation, Low emission development planning. Program...

    5. Thailand-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Government Partner: Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency. Sector: Energy. Topics: Implementation, Low emission development planning. Program...

    6. Software and High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Providing world-class high performance computing capability that enables unsurpassed solutions to complex problems of strategic national interest. Contact: Kathleen McDonald, Head of Intellectual Property, Business Development Executive, Richard P. Feynman Center for Innovation, (505) 667-5844. Software: computational physics, computer science, applied mathematics, statistics and the

    7. Development of an Extensible Computational Framework for Centralized Storage and Distributed Curation and Analysis of Genomic Data Genome-scale Metabolic Models

      SciTech Connect (OSTI)

      Stevens, Rick

      2010-08-01

      The DOE funded KBase project of the Stevens group at the University of Chicago was focused on four high-level goals: (i) improve extensibility, accessibility, and scalability of the SEED framework for genome annotation, curation, and analysis; (ii) extend the SEED infrastructure to support transcription regulatory network reconstructions (2.1), metabolic model reconstruction and analysis (2.2), assertions linked to data (2.3), eukaryotic annotation (2.4), and growth phenotype prediction (2.5); (iii) develop a web-API for programmatic remote access to SEED data and services; and (iv) application of all tools to bioenergy-related genomes and organisms. In response to these goals, we enhanced and improved the ModelSEED resource within the SEED to enable new modeling analyses, including improved model reconstruction and phenotype simulation. We also constructed a new website and web-API for the ModelSEED. Further, we constructed a comprehensive web-API for the SEED as a whole. We also made significant strides in building infrastructure in the SEED to support the reconstruction of transcriptional regulatory networks by developing a pipeline to identify sets of consistently expressed genes based on gene expression data. We applied this pipeline to 29 organisms, computing regulons which were subsequently stored in the SEED database and made available on the SEED website (http://pubseed.theseed.org). We developed a new pipeline and database for the use of kmers, or short 8-residue oligomer sequences, to annotate genomes at high speed. Finally, we developed the PlantSEED, or a new pipeline for annotating primary metabolism in plant genomes. All of the work performed within this project formed the early building blocks for the current DOE Knowledgebase system, and the kmer annotation pipeline, plant annotation pipeline, and modeling tools are all still in use in KBase today.
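      The kmer annotation idea mentioned above, signature 8-residue oligopeptides voting on a protein's function, can be sketched compactly. The Python example below is a conceptual illustration only; the signature table, sequence, and function name are invented, and it does not reflect the actual SEED/kmer pipeline or its data formats.

      # Minimal sketch of kmer-based annotation: signature 8-residue peptides
      # seen only in proteins of a known function vote on the function of a
      # new protein. Illustration only; not the SEED pipeline.

      from collections import Counter

      K = 8  # the "short 8-residue oligomer sequences" mentioned above

      def kmers(protein_seq, k=K):
          return (protein_seq[i:i + k] for i in range(len(protein_seq) - k + 1))

      def annotate(protein_seq, signature_kmers):
          """Return the best-supported function and its kmer hit count."""
          votes = Counter(signature_kmers[km] for km in kmers(protein_seq)
                          if km in signature_kmers)
          return votes.most_common(1)[0] if votes else (None, 0)

      # Hypothetical signature table mapping kmer -> functional role.
      signatures = {"MKTAYIAK": "Aspartate aminotransferase",
                    "IAKQRQIS": "Aspartate aminotransferase"}
      print(annotate("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", signatures))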

    8. System maintenance test plan for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      TWRS [Tank Waste Remediation System] Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCR/PRs are implemented.

    9. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2006-11-01

      Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together researchers in these areas and to provide a focal point for the development of computational expertise at the Laboratory. These efforts will connect to and support the Department of Energy's long-range plans to provide Leadership-class computing to researchers throughout the Nation. Recruitment for six new positions at Stony Brook to strengthen its computational science programs is underway. We expect some of these to be held jointly with BNL.

    10. Computing Events

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Laboratory (pdf) DOE/NNSA Laboratories Fulfill National Mission with Trinity and Cielo Petascale Computers (pdf) Exascale Co-design Center for Materials in Extreme...

    11. Computational Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Advanced Materials Laboratory Center for Integrated Nanotechnologies Combustion Research Facility Computational Science Research Institute Joint BioEnergy Institute About EC News ...

    12. Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CiteSeer: Department of Energy-provided open access science research citations in chemistry, physics, materials, engineering, and computer science. IEEE Xplore: Full text...

    13. Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Security All JLF participants must fully comply with all LLNL computer security regulations and procedures. A laptop entering or leaving B-174 for the sole use by a US citizen and so configured, and requiring no IP address, need not be registered for use in the JLF. By September 2009, it is expected that computers for use by Foreign National Investigators will have no special provisions. Notify maricle1@llnl.gov of all other computers entering, leaving, or being moved within B 174. Use

    14. Computing and Computational Sciences Directorate - Divisions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCSD Divisions: Computational Sciences and Engineering; Computer Sciences and Mathematics; Information Technology Services; Joint Institute for Computational Sciences; National Center ...

    15. Computing and Computational Sciences Directorate - Contacts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Contacts: Jeff Nichols, Associate Laboratory Director, Computing and Computational Sciences; Becky Verastegui, Directorate Operations Manager, Computing and...

    16. Vehicle Technologies Office Merit Review 2015: Computational Design and Development of a New, Lightweight Cast Alloy for Advanced Cylinder Heads in High-Efficiency, Light-Duty Engines

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    17. Baseline System Costs for 50.0 MW Enhanced Geothermal System--A Function of: Working Fluid, Technology, and Location, Location, Location

      Broader source: Energy.gov [DOE]

      Project objectives: Develop a baseline cost model of a 50.0 MW Enhanced Geothermal System, including all aspects of the project, from finding the resource through to operation, for a particularly challenging scenario: the deep, radioactively decaying granitic rock of the Pioneer Valley in Western Massachusetts.

    18. GTA (ground test accelerator) Phase 1: Baseline design report

      SciTech Connect (OSTI)

      Not Available

      1986-08-01

      The national Neutral Particle Beam (NPB) program has two objectives: to provide the necessary basis for a discriminator/weapon decision by 1992, and to develop the technology in stages that lead ultimately to a neutral particle beam weapon. The ground test accelerator (GTA) is the test bed that permits the advancement of the state-of-the-art under experimental conditions in an integrated automated system mode. An intermediate goal of the GTA program is to support the Integrated Space Experiments, while the ultimate goal is to support the 1992 decision. The GTA system and each of its major subsystems are described, and project schedules and resource requirements are provided. (LEW)

    19. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      There are currently 2632 nodes available on PDSF. The compute (batch) nodes at PDSF are heterogeneous, reflecting the periodic procurement of new nodes (and the eventual retirement of old nodes). From the user's perspective they are essentially all equivalent except that some have more memory per job slot. If your jobs have memory requirements beyond the default maximum of 1.1 GB you should specify that in your job submission and the batch system will run your job on an

    20. Insertion Devices for NSLS-II Baseline and Future

      SciTech Connect (OSTI)

      Tanabe,T.

      2008-06-23

      NSLS-II is going to employ Damping Wigglers (DWs) not only for emittance reduction but also as a broadband hard X-ray source. In-Vacuum Undulators (IVUs) with the minimum RMS phase error (< 2 degree) and possible cryo-capability are planned for the planar X-ray devices. Elliptically Polarized Undulators (EPUs) are envisioned for polarization controls. Due to the lack of hard X-ray flux from the weak dipole magnet field (0.4 Tesla), three-pole wigglers (3PWs) with a peak field over 1 Tesla will be mainly used by NSLS bending magnet beam line users. Magnetic designs and kick maps for dynamic aperture surveys were created using the latest version of Radia [1] for Mathematica 6, whose development we supported. There are other devices planned for the later stage of the project, such as a quasi-periodic EPU, a superconducting wiggler/undulator, and a Cryo-Permanent Magnet Undulator (CPMU) with Praseodymium Iron Boron (PrFeB) magnets and textured Dysprosium poles. For R&D, Hybrid PrFeB arrays were planned to be assembled and field-measured at room temperature, liquid nitrogen, and liquid helium temperatures using our vertical test facility. We have also developed a specialized power supply for pulsed wire measurement.

    1. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute node configuration (quad-core AMD Opteron processor): 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB...

    2. Bioinformatics Computing Consultant Position Available

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      You can read more about the positions and apply at jobs.lbl.gov: Bioinformatics High Performance Computing Consultant (job number: 73194) and Software Developer for High...

    3. LHC Computing

      SciTech Connect (OSTI)

      Lincoln, Don

      2015-07-28

      The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

    4. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Hans Gougar

      2014-05-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650oC at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 degrees C which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 degrees C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV program and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

    5. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Gougar, Hans D.

      2014-10-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650oC at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 degrees C which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 degrees C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV program and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

    7. High Hydrogen Concentrations Detected In The Underground Vaults For RH-TRU Waste At INEEL Compared With Calculated Values Using The INEEL-Developed Computer Code

      SciTech Connect (OSTI)

      Rajiv Bhatt; Soli Khericha

      2005-02-01

      About 700 remote-handled transuranic (RH-TRU) waste drums are stored in about 144 underground vaults at the Intermediate-Level Transuranic Storage Facility at the Idaho National Engineering and Environmental Laboratory’s (INEEL’s) Radioactive Waste Management Complex (RWMC). These drums were shipped to the INEEL from 1976 through 1996. During recent monitoring, concentrations of hydrogen were found to be in excess of lower explosive limits. The hydrogen concentration in one vault was detected to be as high as 18% (by volume). This condition required evaluation of the safety basis for the facility. The INEEL has developed a computer program to estimate the hydrogen gas generation as a function of time and diffusion through a series of layers (volumes), with a maximum of five layers plus a sink/environment. The program solves the first-order diffusion equations as a function of time. The current version of the code is more flexible in terms of user input. The program allows the user to estimate hydrogen concentrations in the different layers of a configuration and then change the configuration after a given time, e.g., installation of a filter on an unvented drum or placement of a drum in a vault or in a shipping cask. The code has been used to predict vault concentrations and to identify potential problems during retrieval and aboveground storage. The code has generally predicted higher hydrogen concentrations than the measured values, particularly for drums older than 20 years, which could be due to uncertainty and conservative assumptions in drum age, heat generation rate, hydrogen generation rate, Geff, and diffusion rates through the layers.
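
      The abstract does not reproduce the INEEL code itself, but the class of model it describes, first-order hydrogen exchange between a small number of nested volumes with a generation source and an environmental sink, can be sketched as below. All layer names, volumes, generation and transfer rates here are hypothetical illustration values, not values from the report or the code.

```python
"""Sketch of a first-order, multi-layer hydrogen diffusion model.

Illustrative only: layer volumes, generation rate, and volumetric exchange
coefficients are hypothetical, not values from the INEEL code or report.
"""
import numpy as np
from scipy.integrate import solve_ivp

volumes = np.array([0.05, 0.15, 2.0])      # m^3: drum liner, drum headspace, vault (hypothetical)
k_transfer = np.array([1e-3, 5e-4, 2e-4])  # m^3/h: effective exchange, layer i -> i+1 (last -> sink)
gen_rate = 1e-6                            # m^3 H2 / h generated in the innermost layer (hypothetical)

def dcdt(t, c):
    """First-order exchange between adjacent layers; the environment sink is held at zero."""
    c_ext = np.append(c, 0.0)                         # append the sink concentration
    flows = k_transfer * (c_ext[:-1] - c_ext[1:])     # m^3 H2 / h leaving each layer outward
    dc = np.zeros_like(c)
    dc[0] += gen_rate / volumes[0]                    # generation source in the innermost layer
    dc -= flows / volumes                             # loss to the next layer out
    dc[1:] += flows[:-1] / volumes[1:]                # gain from the next layer in
    return dc

# Integrate 20 years (in hours) from initially hydrogen-free layers.
sol = solve_ivp(dcdt, (0.0, 24 * 365 * 20), np.zeros(3), max_step=24.0)
print("Hydrogen volume fraction per layer after 20 years:", sol.y[:, -1])
```

      A configuration change of the kind the abstract mentions (e.g., installing a filter) would correspond to stopping the integration at that time, changing the exchange coefficients, and restarting from the current concentrations.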

    8. An evaluation of baseline conditions at lease tract C-a, Rio Blanco County, Colorado

      SciTech Connect (OSTI)

      Barteaux, W.L.; Biezugbe, G.

      1987-09-01

      An analysis was made of baseline groundwater quality data from oil shale lease tract C-a, managed by Rio Blanco Oil Shale Company. The data are limited in several respects. All conclusions drawn from the data must be qualified with these limitations. Baseline conditions were determined by analyzing data from wells in the upper bedrock and lower bedrock aquifers and from the alluvial wells. Baseline data were considered to be all data collected before mining operations began. The water quality was then evaluated using the 1987 Colorado State Basic Standards for Ground Water as a basis. The maximum baseline values for several parameters in each aquifer exceed the standard values. The quality of the upper and lower bedrock aquifers varies from region to region within the site. Data on the lower bedrock aquifer are insufficient for speculation on the cause of the variations. Variations in the upper bedrock aquifer are possibly caused by leakage from the lower bedrock aquifer. 16 refs., 9 figs., 9 tabs.

    9. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Slides are available with and without notes. Related snippets: EVMS Training Snippet 4.6 Baseline Control Methods and EVMS Training Snippet 4.9 High-level EVM Expectations.

    10. The Science and Strategy for Phasing of the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Diwan, Milind V.

      2012-05-22

      This note is about the principles behind a phased plan for realizing a Long-Baseline Neutrino Experiment (LBNE) in the U.S. The most important issue that must be resolved is the direction of the first phase of the experiment. Based on both scientific and programmatic considerations, the U.S. should pursue the best option for accelerator neutrino physics, which is the longer baseline towards Homestake with an optimized broadband intense beam.

    11. 2008 CHP Baseline Assessment and Action Plan for the Hawaii Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      The purpose of this 2008 report is to provide an updated baseline assessment and action plan for combined heat and power (CHP) in Hawaii and to identify the hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC).

    12. Evaluation of final waste forms and recommendations for baseline alternatives to grout and glass

      SciTech Connect (OSTI)

      Bleier, A.

      1997-09-01

      An assessment of final waste forms was made as part of the Federal Facilities Compliance Agreement/Development, Demonstration, Testing, and Evaluation (FFCA/DDT&E) Program because supplemental waste-form technologies are needed for the hazardous, radioactive, and mixed wastes of concern to the Department of Energy and the problematic wastes on the Oak Ridge Reservation. The principal objective was to identify a primary waste-form candidate as an alternative to grout (cement) and glass. The effort principally comprised a literature search, the goal of which was to establish a knowledge base regarding four areas: (1) the waste-form technologies based on grout and glass, (2) candidate alternatives, (3) the wastes that need to be immobilized, and (4) the technical and regulatory constraints on the waste-form technologies. This report serves, in part, to meet this goal. Six families of materials emerged as relevant: inorganic, organic, vitrified, devitrified, ceramic, and metallic matrices. Multiple members of each family were assessed, emphasizing the materials-oriented factors and accounting for the fact that the two most prevalent types of wastes for the FFCA/DDT&E Program are aqueous liquids and inorganic sludges and solids. Presently, no individual matrix is sufficiently developed to permit its immediate implementation as a baseline alternative. Three thermoplastic materials, sulfur-polymer cement (inorganic), bitumen (organic), and polyethylene (organic), are the most technologically developed candidates. Each warrants further study, emphasizing the engineering and economic factors, but each also has limitations that relegate it to the status of a short-term alternative. The crystallinity and flexible processing of sulfur provide sulfur-polymer cement with the highest potential for short-term success via encapsulation. Long-term immobilization demands chemical stabilization, which the thermoplastic matrices do not offer. Among the properties of the remaining candidates, those of glass-ceramics (devitrified matrices) represent the best compromise for meeting the probable stricter disposal requirements in the future.

    13. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated based on the specific detail level of process or plant, i.e., (1) plant level, (2) process-group level, and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free downloads from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to save energy and water in the dairy industry. Industrial adoption of this emerging tool and technology in the market is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended, for facilitating its commercialization and expansion in functions of the tool. Wider use of this BEST-Dairy tool and its continuous expansion (in functionality) will help to reduce the actual consumption of energy and water in the dairy industry sector. The outcomes comply very well with the goals set by AB 1250 for the PIER program.
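
      The core benchmarking arithmetic a tool of this kind performs, comparing a plant's energy and water intensity against a best-practice reference case and converting the gap into a savings estimate, can be illustrated with the minimal sketch below. The plant figures and reference intensities are hypothetical; the actual BEST-Dairy reference cases and adjustment factors are defined in its user's manual.

```python
"""Illustration of plant-level benchmarking arithmetic (hypothetical numbers)."""
plant = {"fluid_milk_tonnes": 120_000, "electricity_kwh": 9.6e6, "water_m3": 4.2e5}
reference = {"electricity_kwh_per_tonne": 55.0, "water_m3_per_tonne": 2.1}  # best-practice reference (assumed)

# Plant intensities per unit of production.
intensity_kwh = plant["electricity_kwh"] / plant["fluid_milk_tonnes"]
intensity_water = plant["water_m3"] / plant["fluid_milk_tonnes"]

# Savings potential = (actual intensity - reference intensity) * production, floored at zero.
savings_kwh = max(intensity_kwh - reference["electricity_kwh_per_tonne"], 0.0) * plant["fluid_milk_tonnes"]
savings_water = max(intensity_water - reference["water_m3_per_tonne"], 0.0) * plant["fluid_milk_tonnes"]

print(f"Electricity intensity: {intensity_kwh:.1f} kWh/tonne "
      f"(reference {reference['electricity_kwh_per_tonne']} kWh/tonne)")
print(f"Estimated savings potential: {savings_kwh:,.0f} kWh and {savings_water:,.0f} m3 water per year")
```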

    14. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Node Configuration: 6,384 nodes; 2 twelve-core AMD 'MagnyCours' 2.1-GHz processors per node; 24 cores per node (153,216 total cores); 32 GB DDR3 1333-MHz memory per node (6,000 nodes); 64 GB DDR3 1333-MHz memory per node (384 nodes). Peak Gflop/s rate: 8.4 Gflops/core, 201.6 Gflops/node, 1.28 Petaflops for the entire machine. Each core has its own L1 and L2 caches, with 64 KB and 512 KB respectively. One 6-MB

    15. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Modular baseline health surveys

      SciTech Connect (OSTI)

      Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Schmidlin, Sandro; Magassouba, Mohamed L.; Knoblauch, Astrid M.; Singer, Burton H.; Utzinger, Juerg

      2012-02-15

      The quantitative assessment of health impacts has been identified as a crucial feature for realising the full potential of health impact assessment (HIA). In settings where demographic and health data are notoriously scarce, but there is a broad range of ascertainable ecological, environmental, epidemiological and socioeconomic information, a diverse toolkit of data collection strategies becomes relevant for the mainly small-area impacts of interest. We present a modular, cross-sectional baseline health survey study design, which has been developed for HIA of industrial development projects in the humid tropics. The modular nature of our toolkit allows our methodology to be readily adapted to the prevailing eco-epidemiological characteristics of a given project setting. Central to our design is a broad set of key performance indicators, covering a multiplicity of health outcomes and determinants at different levels and scales. We present experience and key findings from our modular baseline health survey methodology employed in 14 selected sentinel sites within an iron ore mining project in the Republic of Guinea. We argue that our methodology is a generic example of rapid evidence assembly in difficult-to-reach localities, where improvement of the predictive validity of the assessment and establishment of a benchmark for longitudinal monitoring of project impacts and mitigation efforts is needed.

    16. Quantum steady computation

      SciTech Connect (OSTI)

      Castagnoli, G. )

      1991-08-10

      This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

    17. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

      SciTech Connect (OSTI)

      Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

      2013-09-01

      The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
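
      The report's own metrics and test protocol are not reproduced here, but the general pattern of whole-building baseline modeling it discusses, fitting a regression of energy use on weather and schedule variables and scoring it with normalized error statistics such as CV(RMSE) and NMBE, can be sketched as follows. The model form, variable names, and data are assumptions for illustration, not the study's actual models.

```python
"""Minimal sketch: fit a simple weather-and-schedule baseline model and score it."""
import numpy as np

def design_matrix(temp, occupied):
    # Intercept, outdoor temperature, occupied-hours indicator.
    return np.column_stack([np.ones_like(temp), temp, occupied.astype(float)])

def fit_baseline(temp, occupied, energy):
    coef, *_ = np.linalg.lstsq(design_matrix(temp, occupied), energy, rcond=None)
    return coef

def score(coef, temp, occupied, energy):
    pred = design_matrix(temp, occupied) @ coef
    resid = energy - pred
    n, p = len(energy), len(coef)
    cv_rmse = np.sqrt(np.sum(resid**2) / (n - p)) / np.mean(energy)   # coefficient of variation of RMSE
    nmbe = np.sum(resid) / ((n - p) * np.mean(energy))                # normalized mean bias error
    return cv_rmse, nmbe

# Hypothetical hourly training data.
rng = np.random.default_rng(0)
temp = rng.uniform(5, 35, 2000)
occupied = rng.integers(0, 2, 2000)
energy = 50 + 2.0 * temp + 30 * occupied + rng.normal(0, 5, 2000)

coef = fit_baseline(temp, occupied, energy)
print("CV(RMSE), NMBE:", score(coef, temp, occupied, energy))
```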

    18. Baseline Library

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Marketing Resources Reports, Publications, and Research Agricultural Commercial Consumer Products Industrial Institutional Multi-Sector Residential...

    19. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, ... The DOE Office of Science's Advanced Scientific Computing Research (ASCR) program ...

    20. Computing at JLab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      JLab --- Accelerator Controls CAD CDEV CODA Computer Center High Performance Computing Scientific Computing JLab Computer Silo...

    1. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

      SciTech Connect (OSTI)

      Boring, Ronald L.; Joe, Jeffrey C.

      2015-02-01

      For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

    2. Fermilab | Science at Fermilab | Computing | Grid Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Grid Computing Center interior. As high-energy physics experiments grow larger in scope, they require more computing power to process and analyze data. Laboratories purchase rooms full of computer nodes for experiments to use. But many experiments need even more capacity during peak periods. And some experiments do not need to use all of their computing power all of the time. In the early 2000s, members of Fermilab's Computing Division

    3. NERSC seeks Computational Systems Group Lead

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      January 6, 2011, by Katie Antypas. Note: This position is now closed. The Computational Systems Group (CSG) provides production support and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing Center); the position manages the CSG. These systems, which

    4. RATIO COMPUTER

      DOE Patents [OSTI]

      Post, R.F.

      1958-11-11

      An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.

    5. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute New Mexico Consortium and Los Alamos National Laboratory HOW TO APPLY Applications will be accepted JANUARY 5 - FEBRUARY 13, 2016 Computing and Information Technology undergraduate students are encouraged to apply. Must be a U.S. citizen. * Submit a current resume; * Official University Transcript (with spring courses posted and/or a copy of spring 2016 schedule) 3.0 GPA minimum; * One Letter of Recommendation from a Faculty Member; and * Letter of

    6. Borehole temperatures and a baseline for 20th-century global warming estimates

      SciTech Connect (OSTI)

      Harris, R.N.; Chapman, D.S.

      1997-03-14

      Lack of a 19th-century baseline temperature against which 20th-century warming can be referenced constitutes a deficiency in understanding recent climate change. Combination of borehole temperature profiles, which contain a memory of surface temperature changes in previous centuries, with the meteorological archive of surface air temperatures can provide a 19th-century baseline temperature tied to the current observational record. A test case in Utah, where boreholes are interspersed with meteorological stations belonging to the Historical Climatological Network, yields a noise reduction in estimates of 20th-century warming and a baseline temperature that is 0.6° ± 0.1°C below the 1951 to 1970 mean temperature for the region. 22 refs., 3 figs., 1 tab.

    7. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

      SciTech Connect (OSTI)

      University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

      2012-06-13

      Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and data filtration methods as well as an immediate potential for automation of methods to choose building occupied modes.
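
      The shed calculation the abstract describes, observed demand compared against a counterfactual baseline prediction, can be illustrated with a deliberately simple averaging baseline. The sketch below uses a plain mean of prior non-event days; real programs use variants (for example "high 3 of 10") plus weather adjustments, which is exactly the kind of implementation choice the study examines. All data and parameters here are hypothetical.

```python
"""Sketch of a demand-response shed calculation against an averaging baseline."""
import numpy as np

def average_day_baseline(history, n_days=10):
    """history: array (days, intervals) of prior non-event load; returns mean profile of last n_days."""
    return history[-n_days:].mean(axis=0)

def shed(event_day_load, baseline_profile):
    """Positive values = load reduced below the baseline during the event."""
    return baseline_profile - event_day_load

# 20 prior non-event days of 24 hourly readings (kW), plus one synthetic event day with a 2-6 pm shed.
rng = np.random.default_rng(1)
history = 500 + 50 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 10, (20, 24))
hours = np.arange(24)
event_day = history.mean(axis=0) - np.where((hours >= 14) & (hours < 18), 80, 0)

baseline = average_day_baseline(history)
hourly_shed = shed(event_day, baseline)
print("Average shed during the 2-6 pm event window: %.1f kW" % hourly_shed[14:18].mean())
```

      Swapping the baseline function (weather-regression, high-X-of-Y, morning adjustment) while holding the rest fixed is one way to quantify the sensitivity to implementation choices that the study reports.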

    8. MHD computations for stellarators

      SciTech Connect (OSTI)

      Johnson, J.L.

      1985-12-01

      Considerable progress has been made in the development of computational techniques for studying the magnetohydrodynamic equilibrium and stability properties of three-dimensional configurations. Several different approaches have evolved to the point where comparison of results determined with different techniques shows good agreement. 55 refs., 7 figs.

    9. 2010-06 "Budget Priorities for FY'12 and Baseline Change Proposal with

      Office of Environmental Management (EM)

      The intent of this recommendation is to provide LASO with the priorities, which the NNMCAB believes are important to the citizens of Northern New Mexico in the large program to clean up the legacy waste at LANL.

    10. DOE Announces Webinars on the Mid-Atlantic Baseline Study, EPA's Clean

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      November 13, 2015 - 8:30am. EERE offers webinars to the public on a range of subjects, from adopting the latest energy efficiency and renewable energy technologies, to training for the clean energy workforce. Webinars are free; however, advanced registration is typically required. You can

    11. 2008 CHP Baseline Assessment and Action Plan for the Nevada Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      The purpose of this report is to assess the current status of combined heat and power (CHP) in Nevada and to identify the hurdles that prevent the expanded use of CHP systems. The report summarizes the CHP "landscape" in Nevada, including the current installed base of CHP systems, the potential future CHP market, and the status of

    12. Development and testing of FIDELE: a computer code for finite-difference solution to harmonic magnetic-dipole excitation of an azimuthally symmetric horizontally and radially layered earth

      SciTech Connect (OSTI)

      Vittitoe, C.N.

      1981-04-01

      The FORTRAN IV computer code FIDELE simulates the high-frequency electrical logging of a well in which induction and receiving coils are mounted in an instrument sonde immersed in a drilling fluid. The fluid invades layers of surrounding rock in an azimuthally symmetric pattern, superimposing radial layering upon the horizontally layered earth. Maxwell's equations are reduced to a second-order elliptic differential equation for the azimuthal electric-field intensity. The equation is solved at each spatial position where the complex dielectric constant, magnetic permeability, and electrical conductivity have been assigned. Receiver response is given as the complex open-circuit voltage on receiver coils. The logging operation is simulated by a succession of such solutions as the sonde traverses the borehole. Test problems verify consistency with available results for simple geometries. The code's main advantage is its treatment of a two-dimensional earth; its chief disadvantage is the large computer time required for typical problems. Possible code improvements are noted. Use of the computer code is outlined, and tests of most code features are presented.
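
      FIDELE itself solves a complex-coefficient elliptic equation on an azimuthally symmetric grid; the sketch below is not that code, only a generic illustration of the class of method: an iterative finite-difference relaxation of a simpler elliptic problem (Laplace's equation with fixed boundary values on a uniform 2-D grid). Grid sizes, boundary values, and the relaxation factor are assumptions.

```python
"""Generic sketch: successive over-relaxation for a 2-D elliptic finite-difference problem."""
import numpy as np

def solve_elliptic(nr=40, nz=80, tol=1e-6, max_iter=5000, omega=1.8):
    u = np.zeros((nr, nz))
    u[0, :] = 1.0                      # hypothetical fixed (Dirichlet) boundary value
    for _ in range(max_iter):
        u_old = u.copy()
        # SOR sweep over interior points using the 5-point stencil.
        for i in range(1, nr - 1):
            for j in range(1, nz - 1):
                gs = 0.25 * (u[i + 1, j] + u[i - 1, j] + u[i, j + 1] + u[i, j - 1])
                u[i, j] = (1 - omega) * u[i, j] + omega * gs
        if np.max(np.abs(u - u_old)) < tol:
            break
    return u

field = solve_elliptic()
print("Field at the grid centre:", field[20, 40])
```

      The large run times the abstract mentions follow directly from repeating such a 2-D solve at every sonde position along the borehole, with complex-valued coefficients rather than this simplified real case.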

    13. Waste Isolation Pilot Plant Transuranic Waste Baseline inventory report. Volume 2. Revision 1

      SciTech Connect (OSTI)

      1995-02-01

      This document is the Baseline Inventory Report for the transuranic (alpha-bearing) wastes stored at the Waste Isolation Pilot Plant (WIPP) in New Mexico. Waste stream profiles including origin, applicable EPA codes, typical isotopic composition, typical waste densities, and typical rates of waste generation for each facility are presented for wastes stored at the WIPP.

    14. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

      SciTech Connect (OSTI)

      1996-07-01

      This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

    15. Baseline Risk Assessment for the F-Area Burning/Rubble Pits and Rubble Pit

      SciTech Connect (OSTI)

      Palmer, E.

      1996-03-01

      This document provides an overview of the Savannah River Site (SRS) and a description of the F-Area Burning/Rubble Pits (BRPs) and Rubble Pit (RP) unit. It also describes the objectives and scope of the baseline risk assessment (BRA).

    16. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational co-design may facilitate revolutionary designs in the next generation of supercomputers. Computational co-design involves developing the interacting components of a

    17. Computing and Computational Sciences Directorate - Joint Institute...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      (JICS). JICS combines the experience and expertise in theoretical and computational science and engineering, computer science, and mathematics in these two institutions and ...

    18. RCRA Facility Investigation/Remedial Investigation Report with the Baseline Risk Assessment for the 716-A Motor Shops Seepage Basin

      SciTech Connect (OSTI)

      Palmer, E.

      1997-08-25

      This document describes the RCRA Facility Investigation/Remedial Investigation/Baseline Risk Assessment of the 716-A Motor Shops Seepage Basin.

    19. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations

      Broader source: Energy.gov [DOE]

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM) covers Over Target Baseline and Over Target Schedule implementations.

    20. Magnetic resonance imaging and computational fluid dynamics (CFD) simulations of rabbit nasal airflows for the development of hybrid CFD/PBPK models

      SciTech Connect (OSTI)

      Corley, Richard A.; Minard, Kevin R.; Kabilan, Senthil; Einstein, Daniel R.; Kuprat, Andrew P.; harkema, J. R.; Kimbell, Julia; Gargas, M. L.; Kinzell, John H.

      2009-06-01

      The percentages of total airflows over the nasal respiratory and olfactory epithelium of female rabbits were calculated from computational fluid dynamics (CFD) simulations of steady-state inhalation. These airflow calculations, along with nasal airway geometry determinations, are critical parameters for hybrid CFD/physiologically based pharmacokinetic models that describe the nasal dosimetry of water-soluble or reactive gases and vapors in rabbits. CFD simulations were based upon three-dimensional computational meshes derived from magnetic resonance images of three adult female New Zealand White (NZW) rabbits. In the anterior portion of the nose, the maxillary turbinates of rabbits are considerably more complex than comparable regions in rats, mice, monkeys, or humans. This leads to a greater surface area to volume ratio in this region and thus the potential for increased extraction of water soluble or reactive gases and vapors in the anterior portion of the nose compared to many other species. Although there was considerable interanimal variability in the fine structures of the nasal turbinates and airflows in the anterior portions of the nose, there was remarkable consistency between rabbits in the percentage of total inspired airflows that reached the ethmoid turbinate region (~50%) that is presumably lined with olfactory epithelium. These latter results (airflows reaching the ethmoid turbinate region) were higher than previous published estimates for the male F344 rat (19%) and human (7%). These differences in regional airflows can have significant implications in interspecies extrapolations of nasal dosimetry.

    1. High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      INL's high-performance computing center provides general use scientific computing capabilities to support the lab's efforts in advanced...

    2. Free-piston Stirling engine experimental program: Part 1. Baseline test summary

      SciTech Connect (OSTI)

      Berggren, R.; Moynihan, T.

      1983-06-01

      Free-Piston Stirling Engine experimental data are presented from a series of tests that establish the operating characteristics of the engine and determine performance repeatability. The operating envelope of the engine was mapped to determine the maximum parameter range and repeatability. Tests were then carried out in which individual operating parameters were varied while others were maintained constant. These data establish the baseline operation of the engine as a preliminary to a series of tests in which several suspected sources of energy loss are investigated by changing the engine geometry to isolate and magnify each suspected loss mechanism. Performance with the geometry change is compared against baseline operation to quantify the magnitude of the loss mechanism under investigation. The results of the loss mechanism investigation are presented in Part 2 of this report.

    3. Baseline point source load inventory, 1985. 1991 reevaluation report No. 2

      SciTech Connect (OSTI)

      Not Available

      1993-02-04

      The report finalizes and documents the Chesapeake Bay Agreement states' 1985 point source nutrient load estimates initially presented in the Baywide Nutrient Reduction Strategy (BNRS). The Bay Agreement states include Maryland, Virginia, Pennsylvania, and the District of Columbia. Each of the states final, annual, discharged, 1985 point source total phosphorus and total nitrogen nutrient load estimates are presented. These estimates are to serve as the point source baseline for the year 2000 40% nutrient reduction goal. Facility by facility flows, nutrient concentrations and nutrient loads for 1985 from above the fall line (AFL) and from below the fall line (BFL) are presented. The report presents the percent change in the 1985 baseline loads for each of the Bay agreement states relative to 1991. Estimates of 1991 nutrient loads are not available for non-agreement states at this time.

    4. Mixed waste focus area integrated technical baseline report. Phase I, Volume 2: Revision 0

      SciTech Connect (OSTI)

      1996-01-16

      This document (Volume 2) contains the Appendices A through J for the Mixed Waste Focus Area Integrated Technical Baseline Report Phase I for the Idaho National Engineering Laboratory. Included are: Waste Type Managers' Resumes, detailed information on wastewater, combustible organics, debris, unique waste, and inorganic homogeneous solids and soils, and waste data information. A detailed list of technology deficiencies and site needs identification is also provided.

    5. Results from baseline tests of the SPRE I and comparison with code model predictions

      SciTech Connect (OSTI)

      Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

      1994-09-01

      The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high capacity space power. This paper presents results of base-line engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

    6. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting Preprint Jie Zhang 1 , Bri-Mathias Hodge 1 , Siyuan Lu 2 , Hendrik F. Hamann 2 , Brad Lehman 3 , Joseph Simmons 4 , Edwin Campos 5 , and Venkat Banunarayanan 6 1 National Renewable Energy Laboratory 2 IBM TJ Watson Research Center 3 Northeastern University 4 University of Arizona 5 Argonne National Laboratory 6 U.S. Department of Energy Presented at the IEEE Power and Energy Society General Meeting Denver,

    7. River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Sands Jim Hansen U.S. Department of Energy - Richland Operations Office October 12, 2011 River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2) * RCBRA Human Health Risk Assessment is final - Response provided to HAB advice #246 * RCBRA Ecological Risk Assessment (Draft C) was transmitted to regulators September 27 * Columbia River Component - Draft Ecological Screening Level Risk Assessment ready for regulator review - Draft Human health risk assessment will be

    8. Some Beam Dynamics and Related Studies of Possible Changes to the ILC Baseline Design

      SciTech Connect (OSTI)

      Paterson, Ewan; /SLAC

      2012-04-03

      Since the completion of the ILC Reference Design Report (RDR) in 2007, global R and D has continued on all ILC systems in a coordinated program titled Technical Design Phase 1. This program, which is planned and coordinated by the Program Managers and the Technical Area Group Leaders, will transition to a Phase 2 in 2010 which has the goal of producing a more complete Technical Design Report in 2012. In this transition there will be a re-baseline process which will update and/or modify the RDR baseline design, taking into account progress with systems design and progress with various technologies coming from the continuing R and D programs. The RDR design was considered by some to be a conservative one, and many of the topics being studied for inclusion in a new baseline are directed towards more optimum cost versus risk designs. Some of these are engineering systems design modifications, both technical and civil, while others are accelerator parameters, technical system designs and beam dynamics optimizations. A few of the latter are described here.

    9. The impact of sterile neutrinos on CP measurements at long baselines

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Gandhi, Raj; Kayser, Boris; Masud, Mehedi; Prakash, Suprabh

      2015-09-01

      With the Deep Underground Neutrino Experiment (DUNE) as an example, we show that the presence of even one sterile neutrino of mass ~1 eV can significantly impact the measurements of CP violation in long baseline experiments. Using a probability level analysis and neutrino-antineutrino asymmetry calculations, we discuss the large magnitude of these effects, and show how they translate into significant event rate deviations at DUNE. These results demonstrate that measurements which, when interpreted in the context of the standard three family paradigm, indicate CP conservation at long baselines, may, in fact, hide large CP violation if there is a sterile state. Similarly, any data indicating the violation of CP cannot be properly interpreted within the standard paradigm unless the presence of sterile states of mass O(1 eV) can be conclusively ruled out. Our work underscores the need for a parallel and linked short baseline oscillation program and a highly capable near detector for DUNE, in order that its highly anticipated results on CP violation in the lepton sector may be correctly interpreted.

    10. Idaho National Engineering Laboratory (INEL) Environmental Restoration Program (ERP), Baseline Safety Analysis File (BSAF). Revision 1

      SciTech Connect (OSTI)

      Not Available

      1994-06-20

      This document was prepared to take the place of a Safety Evaluation Report since the Baseline Safety Analysis File (BSAF) and associated Baseline Technical Safety Requirements (TSR) File do not meet the requirements of a complete safety analysis documentation. Its purpose is to present in summary form the background of how the BSAF and Baseline TSR originated and a description of the process by which it was produced and approved for use in the Environmental Restoration Program. The BSAF is a facility safety reference document for INEL environmental restoration activities including environmental remediation of inactive waste sites and decontamination and decommissioning (D&D) of surplus facilities. The BSAF contains safety bases common to environmental restoration activities and guidelines for performing and documenting safety analysis. The common safety bases can be incorporated by reference into the safety analysis documentation prepared for individual environmental restoration activities with justification and any necessary revisions. The safety analysis guidelines in BSAF provide an accepted method for hazard analysis; analysis of normal, abnormal, and accident conditions; human factors analysis; and derivation of TSRs. The BSAF safety bases and guidelines are graded for environmental restoration activities.

    11. Announcement of Computer Software

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      F 241.4 (10-01) (Replaces ESTSC F1 and ESTSC F2) All Other Editions Are Obsolete UNITED STATES DEPARTMENT OF ENERGY ANNOUNCEMENT OF COMPUTER SOFTWARE OMB Control Number 1910-1400 (OMB Burden Disclosure Statement is on last page of Instructions) Record Status (Select One): New Package Software Revision H. Description/Abstract PART I: STI SOFTWARE DESCRIPTION A. Software Title SHORT NAME OR ACRONYM KEYWORDS IN CONTEXT (KWIC) TITLE B. Developer(s) E-MAIL ADDRESS(ES) C. Site Product Number 1. DOE

    12. Development of a lab-scale, high-resolution, tube-generated X-ray computed-tomography system for three-dimensional (3D) materials characterization

      SciTech Connect (OSTI)

      Mertens, J.C.E.; Williams, J.J.; Chawla, Nikhilesh

      2014-06-01

      The design and construction of a modular high resolution X-ray computed tomography (XCT) system is highlighted in this paper. The design approach is detailed for meeting a specified set of instrument performance goals tailored towards experimental versatility and high resolution imaging. The XCT tool is unique in the detector and X-ray source design configuration, enabling control in the balance between detection efficiency and spatial resolution. The system package is also unique: The sample manipulation approach implemented enables a wide gamut of in situ experimentation to analyze structure evolution under applied stimulus, by optimizing scan conditions through a high degree of controllability. The component selection and design process is detailed: Incorporated components are specified, custom designs are shared, and the approach for their integration into a fully functional XCT scanner is provided. Custom designs discussed include the dual-target X-ray source cradle which maintains position and trajectory of the beam between the two X-ray target configurations with respect to a scintillator mounting and positioning assembly and the imaging sensor, as well as a novel large-format X-ray detector with enhanced adaptability. The instrument is discussed from an operational point of view, including the details of data acquisition and processing implemented for 3D imaging via micro-CT. The performance of the instrument is demonstrated on a silica-glass particle/hydroxyl-terminated-polybutadiene (HTPB) matrix binder PBX simulant. Post-scan data processing, specifically segmentation of the sample's relevant microstructure from the 3D reconstruction, is provided to demonstrate the utility of the instrument. - Highlights: • Custom built X-ray tomography system for microstructural characterization • Detector design for maximizing polychromatic X-ray detection efficiency • X-ray design offered for maximizing X-ray flux with respect to imaging resolution • Novel lab-scale XCT data acquisition and data processing methods • 3D characterization of glass-bead mock plastic-bonded-explosive stimulant.

    13. Community Greening: How to Develop a Strategic Plan | Open Energy...

      Open Energy Info (EERE)

      Focus Area People and Policy Phase Bring the Right People Together, Create a Vision, Determine Baseline, Evaluate Options, Develop Goals, Prepare a Plan, Get Feedback,...

    14. Reference Model Development

      SciTech Connect (OSTI)

      Jepsen, Richard

      2011-11-02

      Presentation from the 2011 Water Peer Review in which principal investigator discusses project progress to develop a representative set of Reference Models (RM) for the MHK industry to develop baseline cost of energy (COE) and evaluate key cost component/system reduction pathways.

    15. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

      SciTech Connect (OSTI)

      Price, Lynn; Murtishaw, Scott; Worrell, Ernst

      2003-06-01

      Executive Summary: The California Climate Action Registry, which was initially established in 2000 and began operation in Fall 2002, is a voluntary registry for recording annual greenhouse gas (GHG) emissions. The purpose of the Registry is to assist California businesses and organizations in their efforts to inventory and document emissions in order to establish a baseline and to document early actions to increase energy efficiency and decrease GHG emissions. The State of California has committed to use its ''best efforts'' to ensure that entities that establish GHG emissions baselines and register their emissions will receive ''appropriate consideration under any future international, federal, or state regulatory scheme relating to greenhouse gas emissions.'' Reporting of GHG emissions involves documentation of both ''direct'' emissions from sources that are under the entity's control and indirect emissions controlled by others. Electricity generated by an off-site power source is considered to be an indirect GHG emission and is required to be included in the entity's report. Registry participants include businesses, non-profit organizations, municipalities, state agencies, and other entities. Participants are required to register the GHG emissions of all operations in California, and are encouraged to report nationwide. For the first three years of participation, the Registry only requires the reporting of carbon dioxide (CO2) emissions, although participants are encouraged to report the remaining five Kyoto Protocol GHGs (CH4, N2O, HFCs, PFCs, and SF6). After three years, reporting of all six Kyoto GHG emissions is required. The enabling legislation for the Registry (SB 527) requires total GHG emissions to be registered and requires reporting of ''industry-specific metrics'' once such metrics have been adopted by the Registry. The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) was asked to provide technical assistance to the California Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first two areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities.
      Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to assess the availability of sectoral or industry-specific metrics and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no one metric that adequately tracks trends in GHG emissions while maintaining confidentiality of data was identified. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends. Such an index could provide an industry-specific metric for reporting and tracking GHG emissions trends to accurately reflect year to year changes while protecting proprietary data. This GHG intensity index would provide Registry participants with a means for demonstrating improvements in their energy and GHG emissions per unit of production without divulging specific values. For the second research area, Berkeley Lab evaluated various methods used to calculate baselines for documentation of energy consumption or GHG emissions reductions, noting those that use industry-specific metrics. Accounting for actions to reduce GHGs can be done on a project-by-project basis or on an entity basis. Establishing project-related baselines for mitigation efforts has been widely discussed in the context of two of the so-called ''flexible mechanisms'' of the Kyoto Protocol to the United Nations Framework Convention on Climate Change (Kyoto Protocol): Joint Implementation (JI) and the Clean Development Mechanism (CDM).
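
      One plausible construction of such a GHG intensity index, normalizing emissions per unit of production to a base year so that absolute figures stay confidential, is sketched below. The entity data and the index definition are assumptions for illustration; the metrics actually adopted by the Registry may be defined differently.

```python
"""Sketch of a GHG intensity index (hypothetical definition and data)."""
base_year = {"tCO2e": 125_000, "units_produced": 2_400_000}
report_year = {"tCO2e": 118_000, "units_produced": 2_550_000}

base_intensity = base_year["tCO2e"] / base_year["units_produced"]
report_intensity = report_year["tCO2e"] / report_year["units_produced"]

# Index = current intensity relative to base-year intensity, scaled to 100.
intensity_index = 100.0 * report_intensity / base_intensity

# A value below 100 shows improved GHG intensity without revealing the
# underlying production or emissions totals.
print(f"GHG intensity index (base year = 100): {intensity_index:.1f}")
```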

    16. FY12 Quarter 3 Computing Utilization Report – LANL

      SciTech Connect (OSTI)

      Wampler, Cheryl L. [Los Alamos National Laboratory; McClellan, Laura Ann [Los Alamos National Laboratory

      2012-07-25

      DSW continues to dominate the capacity workload, with a focus in Q3 on common model baselining runs in preparation for the Annual Assessment Review (AAR) of the weapon systems. There remains unmet demand for higher fidelity simulations, and for increased throughput of simulations. Common model baselining activities would benefit from doubling the resolution of the models and running twice as many simulations. Capacity systems were also utilized during the quarter to prepare for upcoming Level 2 milestones. Other notable DSW activities include validation of new physics models and safety studies. The safety team used the capacity resources extensively for projects involving 3D computer simulations for the Furrow series of experiments at DARHT (a Level 2 milestone), fragment impact, surety theme, PANTEX assessments, and the 120-day study. With the more than tripling of classified capacity computing resources with the addition of the Luna system and the safety team's imminent access to the Cielo system, demand has been met for current needs. The safety team has performed successful scaling studies on Luna up to 16K PE size-jobs with linear scaling, running the large 3D simulations required for the analysis of Furrow. They will be investigating scaling studies on the Cielo system with the Lustre file system in Q4. Overall average capacity utilization was impacted by negative effects of the LANL Voluntary Separation Program (VSP) at the beginning of Q3, in which programmatic staffing was reduced by 6%, with further losses due to management backfills and attrition, resulting in about 10% fewer users. All classified systems were impacted in April by a planned 2 day red network outage. ASC capacity workload continues to focus on code development, regression testing, and verification and validation (V&V) studies. Significant capacity cycles were used in preparation for a JOWOG in May and several upcoming L2 milestones due in Q4. A network transition has been underway on the unclassified networks to increase access of all ASC users to the unclassified systems through the Yellow Turquoise Integration (YeTI) project. This will help to alleviate the longstanding shortage of resources for ASC unclassified code development and regression testing, and also make a broader palette of machines available to unclassified ASC users, including PSAAP Alliance users. The Moonlight system will be the first capacity resource to be made available through the YETI project, and will make available a significant increase in cycles, as well as GPGPU accelerator technology. The Turing and Lobo machines will be decommissioned in the next quarter. ASC projects running on Cielo as part of the CCC-3 include turbulence, hydrodynamics, burn, asteroids, polycrystals, capability and runtime performance improvements, and materials including carbon and silicone.

    17. Computational analysis of storage synthesis in developing Brassica napus L. (oilseed rape) embryos: Flux variability analysis in relation to 13C-metabolic flux analysis

      SciTech Connect (OSTI)

      Hay, J.; Schwender, J.

      2011-08-01

      Plant oils are an important renewable resource, and seed oil content is a key agronomical trait that is in part controlled by the metabolic processes within developing seeds. A large-scale model of cellular metabolism in developing embryos of Brassica napus (bna572) was used to predict biomass formation and to analyze metabolic steady states by flux variability analysis under different physiological conditions. Predicted flux patterns are highly correlated with results from prior 13C metabolic flux analysis of B. napus developing embryos. Minor differences from the experimental results arose because bna572 always selected only one sugar and one nitrogen source from the available alternatives, and failed to predict the use of the oxidative pentose phosphate pathway. Flux variability, indicative of alternative optimal solutions, revealed alternative pathways that can provide pyruvate and NADPH to plastidic fatty acid synthesis. The nutritional values of different medium substrates were compared based on the overall carbon conversion efficiency (CCE) for the biosynthesis of biomass. Although bna572 has a functional nitrogen assimilation pathway via glutamate synthase, the simulations predict an unexpected role of glycine decarboxylase operating in the direction of NH4+ assimilation. Analysis of the light-dependent improvement of carbon economy predicted two metabolic phases. At very low light levels small reductions in CO2 efflux can be attributed to enzymes of the tricarboxylic acid cycle (oxoglutarate dehydrogenase, isocitrate dehydrogenase) and glycine decarboxylase. At higher light levels relevant to the 13C flux studies, ribulose-1,5-bisphosphate carboxylase activity is predicted to account fully for the light-dependent changes in carbon balance.
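
      Flux variability analysis of the kind applied to bna572 asks, for each reaction, what range of flux is consistent with the steady-state constraint and an optimal (or near-optimal) objective. The toy sketch below illustrates only the method, using an invented four-reaction network and scipy's linear-programming solver; the bna572 model itself has hundreds of reactions and is analyzed with dedicated constraint-based modeling tools.

```python
"""Toy flux variability analysis (FVA); network and bounds are invented."""
import numpy as np
from scipy.optimize import linprog

# Metabolites (rows) x reactions (cols); steady state requires S v = 0.
# r0: uptake -> A, r1: A -> B, r2: A -> B (alternative route), r3: B -> biomass.
S = np.array([
    [1, -1, -1,  0],   # metabolite A
    [0,  1,  1, -1],   # metabolite B
])
bounds = [(0, 10)] * 4
biomass = 3            # index of the objective (biomass) reaction

# Step 1: maximize biomass (linprog minimizes, so negate the objective).
c = np.zeros(4); c[biomass] = -1.0
opt = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
v_max = -opt.fun

# Step 2: for each reaction, minimize and maximize its flux with biomass held at the optimum.
S_fix = np.vstack([S, np.eye(4)[biomass]])
b_fix = np.append(np.zeros(2), v_max)
for j in range(4):
    lo, hi = [], []
    for sign in (+1.0, -1.0):
        cj = np.zeros(4); cj[j] = sign
        res = linprog(cj, A_eq=S_fix, b_eq=b_fix, bounds=bounds)
        (lo if sign > 0 else hi).append(sign * res.fun)
    print(f"reaction {j}: flux range [{lo[0]:.2f}, {hi[0]:.2f}]")
```

      In this toy case the two parallel routes r1 and r2 each span the full range while their sum is fixed, which is the kind of alternative-optima behavior the abstract describes for the pyruvate and NADPH supply routes.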

    18. Exploratory Experimentation and Computation

      SciTech Connect (OSTI)

      Bailey, David H.; Borwein, Jonathan M.

      2010-02-25

      We believe the mathematical research community is facing a great challenge to re-evaluate the role of proof in light of recent developments. On one hand, the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet, has provided marvelous resources to the research mathematician. On the other hand, the enormous complexity of many modern capstone results such as the Poincare conjecture, Fermat's last theorem, and the classification of finite simple groups has raised questions as to how we can better ensure the integrity of modern mathematics. Yet as the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished.

    19. About the Advanced Computing Tech Team | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      The Advanced Computing Tech Team is made up of representatives from DOE and its national laboratories who are involved with developing and using advanced computing tools. The following is a list of some of those programs and how they are currently using advanced computing in pursuit of their respective missions. Advanced Scientific Computing Research (ASCR): The mission of the Advanced Scientific Computing Research (ASCR)

    20. Computational Electronics and Electromagnetics

      SciTech Connect (OSTI)

      DeFord, J.F.

      1993-03-01

      The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.
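
      For readers unfamiliar with time-domain electromagnetic solvers of the kind described above, the sketch below advances a normalized one-dimensional FDTD (Yee) update loop; it is a toy illustration of the general method, not the DSI3D or TSAR algorithms.

      ```python
      # Toy 1D FDTD (Yee) update loop; illustrative only, not DSI3D or TSAR.
      import numpy as np

      nx, nt = 200, 500
      ez = np.zeros(nx)        # electric field on the primary grid
      hy = np.zeros(nx - 1)    # magnetic field on the staggered (dual) grid
      courant = 0.5            # normalized time step; must be <= 1 for stability

      for n in range(nt):
          # Update H from the curl of E, then E from the curl of H.
          hy += courant * (ez[1:] - ez[:-1])
          ez[1:-1] += courant * (hy[1:] - hy[:-1])
          # Soft Gaussian source injected near the left boundary.
          ez[10] += np.exp(-((n - 60) / 20.0) ** 2)

      print("peak |Ez| after", nt, "steps:", np.abs(ez).max())
      ```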

    1. Virtual Design Studio (VDS) - Development of an Integrated Computer Simulation Environment for Performance Based Design of Very-Low Energy and High IEQ Buildings

      SciTech Connect (OSTI)

      Chen, Yixing; Zhang, Jianshun; Pelken, Michael; Gu, Lixing; Rice, Danial; Meng, Zhaozhou; Semahegn, Shewangizaw; Feng, Wei; Ling, Francesca; Shi, Jun; Henderson, Hugh

      2013-09-01

      Executive Summary The objective of this study was to develop a “Virtual Design Studio (VDS)”: a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and a high level of sustainability. This VDS is intended to assist collaborating architects, engineers and project management team members throughout the process, from the early phases to the detailed building design stages. It can be used to plan design tasks and workflow, and to evaluate the potential impacts of various green building strategies on the building performance by using state-of-the-art simulation tools as well as industrial/professional standards and guidelines for green building system design. Engaged in the development of VDS was a multi-disciplinary research team that included architects, engineers, and software developers. Based on the review and analysis of how existing professional practices in building systems design operate, particularly those used in the U.S., Germany and UK, a generic process for performance-based building design, construction and operation was proposed. It divides the whole process into five distinct stages: Assess, Define, Design, Apply, and Monitoring (ADDAM). The current VDS is focused on the first three stages. The VDS considers building design as a multi-dimensional process, involving multiple design teams, design factors, and design stages. The intersection among these three dimensions defines a specific design task in terms of “who”, “what” and “when”. It also considers building design as a multi-objective process that aims to enhance the five aspects of performance for green building systems: site sustainability, materials and resource efficiency, water utilization efficiency, energy efficiency and impacts to the atmospheric environment, and IEQ. The current VDS development has been limited to energy efficiency and IEQ performance, with particular focus on evaluating thermal performance, air quality and lighting environmental quality because of their strong interaction with the energy performance of buildings. The VDS software framework contains four major functions: 1) Design coordination: It enables users to define tasks using the Input-Process-Output flow approach, which specifies the anticipated activities (i.e., the process), required input and output information, and anticipated interactions with other tasks. It also allows task scheduling to define the workflow, and sharing of the design data and information via the internet. 2) Modeling and simulation: It enables users to perform building simulations to predict the energy consumption and IEQ conditions at any of the design stages by using EnergyPlus and a combined heat, air, moisture and pollutant simulation (CHAMPS) model. A method for co-simulation was developed to allow the use of both models at the same time step for the combined energy and indoor air quality analysis. 3) Results visualization: It enables users to display a 3-D geometric design of the building by reading the BIM (building information model) file generated by design software such as SketchUp, and the predicted results of heat, air, moisture, pollutant and light distributions in the building. 
4) Performance evaluation: It enables the users to compare the performance of a proposed building design against a reference building that is defined for the same type of buildings under the same climate condition, and predicts the percent improvement over the minimum requirements specified in ASHRAE Standard 55-2010, 62.1-2010 and 90.1-2010. An approach was developed to estimate the potential impact of a design factor on the whole-building performance, which can help the user identify areas that offer the most payback for the investment. The VDS software was developed by using C++ with the conventional Model-View-Controller (MVC) software architecture. The software has been verified by using a simple 3-zone case building. The application of the VDS concepts and framework for building design and performance analysis has been illustrated by using a medium-sized, five-story office building that received LEED Platinum Certification from USGBC.
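
      The performance-evaluation function described above reduces to straightforward arithmetic once baseline and proposed results are available; the sketch below shows a percent-improvement-over-baseline calculation for annual energy use. The end-use categories and kWh values are illustrative placeholders, not output from VDS, EnergyPlus, or CHAMPS.

      ```python
      # Percent-improvement-over-baseline arithmetic for comparing a proposed design
      # against a reference building. Values are illustrative placeholders only.

      baseline_kwh = {"heating": 52000.0, "cooling": 38000.0, "lighting": 21000.0}
      proposed_kwh = {"heating": 35000.0, "cooling": 30000.0, "lighting": 14000.0}

      def pct_improvement(base, prop):
          """Positive result means the proposed design uses less energy than the baseline."""
          return 100.0 * (base - prop) / base

      for end_use in baseline_kwh:
          improvement = pct_improvement(baseline_kwh[end_use], proposed_kwh[end_use])
          print(f"{end_use:9s} {improvement:5.1f}% improvement")

      total_base = sum(baseline_kwh.values())
      total_prop = sum(proposed_kwh.values())
      print(f"whole building: {pct_improvement(total_base, total_prop):.1f}% improvement")
      ```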

    2. The mixed waste management facility. Project baseline revision 1.2

      SciTech Connect (OSTI)

      Streit, R.D.; Throop, A.L.

      1995-04-01

      Revision 1.2 to the Project Baseline (PB) for the Mixed Waste Management Facility (MWMF) is in response to DOE directives and verbal guidance to (1) collocate the Decontamination and Waste Treatment Facility (DWTF) and MWMF into a single complex, integrating certain overlapping functions as a cost-saving measure; (2) meet certain fiscal year (FY) new-BA funding objectives ($15.3M in FY95) with lower and roughly balanced funding for the out-years; (3) reduce Total Project Cost (TPC) for the MWMF Project; and (4) include costs for all appropriate permitting activities in the project TPC. This baseline revision also incorporates revisions in the technical baseline design for Molten Salt Oxidation (MSO) and Mediated Electrochemical Oxidation (MEO). Changes in the WBS dictionary that are necessary as a result of this rebaseline, as well as minor title changes, at WBS Level 3 or above (DOE control level) are approved as a separate document. For completeness, the WBS dictionary that reflects these changes is contained in Appendix B. The PB, with revisions as described in this document, was also the basis for the FY97 Validation Process, presented to DOE and their reviewers on March 21-22, 1995. Appendix C lists information related to prior revisions to the PB. Several key changes relate to the integration of functions and sharing of facilities between the portion of the DWTF that will house the MWMF and those portions that are used by the Hazardous Waste Management (HWM) Division at LLNL. This collocation has been directed by DOE as a cost-saving measure and has been implemented in a manner that maintains separate operational elements from a safety and permitting viewpoint. Appendix D provides background information on the decision and implications of collocating the two facilities.

    3. Application of Robust Design and Advanced Computer Aided Engineering Technologies: Cooperative Research and Development Final Report, CRADA Number CRD-04-143

      SciTech Connect (OSTI)

      Thornton, M.

      2013-06-01

      Oshkosh Corporation (OSK) is taking an aggressive approach to implementing advanced technologies, including hybrid electric vehicle (HEV) technology, throughout their commercial and military product lines. These technologies have important implications for OSK's commercial and military customers, including fleet fuel efficiency, quiet operational modes, additional on-board electric capabilities, and lower thermal signature operation. However, technical challenges exist with selecting the optimal HEV components and design to work within the performance and packaging constraints of specific vehicle applications. OSK desires to use unique expertise developed at the Department of Energy's (DOE) National Renewable Energy Laboratory (NREL), including HEV modeling and simulation. These tools will be used to overcome technical hurdles to implementing advanced heavy vehicle technology that meets performance requirements while improving fuel efficiency.

    4. Inexpensive computer data-acquisition system

      SciTech Connect (OSTI)

      Galvin, J.E.; Brown, I.G.

      1985-10-01

      A system based on an Apple II+ personal computer is used for on-line monitoring of ion-beam characteristics in accelerator ion source development.

    5. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... software, and hardware in an integrated computational co-design process. * Designed Cruft, a suite of molecular dynamics proxy applications (software) developed to explore ...

    6. NERSC seeks Computational Systems Group Lead

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing ... workload demands within hiring and budget constraints. ...

    7. Computer System, Cluster, and Networking Summer Institute Program Description

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computer System, Cluster, and Networking Summer Institute (CSCNSI) is a focused technical enrichment program targeting third-year college undergraduate students currently engaged in a computer science, computer engineering, or similar major. The program emphasizes practical skill development in setting up, configuring, administering, testing, monitoring, and scheduling computer systems, supercomputer clusters, and computer

    8. A comparison of baseline aerodynamic performance of optimally-twisted versus non-twisted HAWT blades

      SciTech Connect (OSTI)

      Simms, D.A.; Robinson, M.C.; Hand, M.M.; Fingersh, L.J.

      1995-01-01

      NREL has completed the initial twisted blade field tests of the "Unsteady Aerodynamics Experiment." This test series continues systematic measurements of unsteady aerodynamic phenomena prevalent in stall-controlled horizontal axis wind turbines (HAWTs). The blade twist distribution optimizes power production at a single angle of attack along the span. Abrupt transitions into and out of stall are created due to rapid changes in inflow. Data from earlier experiments have been analyzed extensively to characterize the steady and unsteady response of untwisted blades. In this report, a characterization and comparison of the baseline aerodynamic performance of the twisted versus non-twisted blade sets will be presented for steady flow conditions.
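
      As background on what an "optimally twisted" blade means, the sketch below evaluates one common textbook approximation for the spanwise twist of an ideal rotor (with wake rotation) at a fixed design angle of attack; the tip-speed ratio and angles are assumed for illustration and do not describe the NREL test blades.

      ```python
      # Textbook ideal-rotor twist distribution (with wake rotation); illustrative only,
      # not the blade geometry used in the Unsteady Aerodynamics Experiment.
      import math

      tip_speed_ratio = 7.0      # design tip-speed ratio (assumed)
      alpha_design = 6.0         # design angle of attack in degrees (assumed)

      for r_over_R in [0.2, 0.4, 0.6, 0.8, 1.0]:
          local_speed_ratio = tip_speed_ratio * r_over_R
          # Optimum inflow angle: phi = (2/3) * atan(1 / local speed ratio)
          phi = (2.0 / 3.0) * math.degrees(math.atan(1.0 / local_speed_ratio))
          twist = phi - alpha_design
          print(f"r/R = {r_over_R:.1f}  inflow {phi:5.1f} deg  twist {twist:6.1f} deg")
      ```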

    9. Formation and Sustainment of ITBs in ITER with the Baseline Heating Mix

      SciTech Connect (OSTI)

      Francesca M. Poli and Charles Kessel

      2012-12-03

      Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved confinement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current profiles and negative shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. It is shown that, with a trade-off of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating configuration.

    10. Vesta | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Vesta is the ALCF's test and development platform, serving as a launching pad for researchers planning to use Mira. Vesta has the same architecture as Mira, but on a much smaller scale (two computer racks compared to Mira's 48 racks). This system enables researchers to debug and scale up codes for the Blue Gene/Q architecture in

    11. Secure computing for the 'Everyman'

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      If implemented on a wide scale, quantum key distribution technology could ensure truly secure commerce, banking, communications and data transfer. September 2, 2014 This small device developed at Los Alamos National Laboratory uses the truly random spin of light particles as defined by laws of quantum mechanics to generate a random number for use in a cryptographic key that can be used to securely transmit information

    12. Computational Sciences and Engineering Division

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Sciences and Engineering Division is a major research division at the Department of Energy's Oak Ridge National Laboratory. CSED develops and applies creative information technology and modeling and simulation research solutions for National Security and National Energy Infrastructure needs. The mission of the Computational Sciences and Engineering Division is to enhance the country's capabilities in achieving important objectives in the areas of national defense, homeland

    13. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Ludtka, Gail Mackiewicz-; Chourey, Aashish

      2010-08-01

      As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries; and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    14. Computational Physicist | Princeton Plasma Physics Lab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Physicist Department: Theory Supervisor(s): Steve Jardin Staff: ENG 04 Requisition Number: 16000352 This position is in the Computational Plasma Physics Group. PPPL seeks a computational physicist for the TRANSP development and CPPG (Computational Plasma Physics Group) support group. The TRANSP software package is used by fusion physicists worldwide for comprehensive analysis and interpretation of data from magnetic-confinement fusion experiments and to predict the performance of

    15. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Ludtka, G. M.; Chourey, A.

      2010-05-12

      As the original magnet designer and manufacturer of ORNL’s 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL’s Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries; and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    16. Dynamic gating window for compensation of baseline shift in respiratory-gated radiation therapy

      SciTech Connect (OSTI)

      Pepin, Eric W.; Wu, Huanmei; Shirato, Hiroki

      2011-04-15

      Purpose: To analyze and evaluate the necessity and use of dynamic gating techniques for compensation of baseline shift during respiratory-gated radiation therapy of lung tumors. Methods: Motion tracking data from 30 lung tumors over 592 treatment fractions were analyzed for baseline shift. The finite state model (FSM) was used to identify the end-of-exhale (EOE) breathing phase throughout each treatment fraction. Using duty cycle as an evaluation metric, several methods of end-of-exhale dynamic gating were compared: An a posteriori ideal gating window, a predictive trend-line-based gating window, and a predictive weighted point-based gating window. These methods were evaluated for each of several gating window types: Superior/inferior (SI) gating, anterior/posterior beam, lateral beam, and 3D gating. Results: In the absence of dynamic gating techniques, SI gating gave a 39.6% duty cycle. The ideal SI gating window yielded a 41.5% duty cycle. The weight-based method of dynamic SI gating yielded a duty cycle of 36.2%. The trend-line-based method yielded a duty cycle of 34.0%. Conclusions: Dynamic gating was not broadly beneficial due to a breakdown of the FSM's ability to identify the EOE phase. When the EOE phase was well defined, dynamic gating showed an improvement over static-window gating.
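
      Duty cycle, the evaluation metric used above, is the fraction of treatment time during which the target remains inside the gating window and the beam could be on. The sketch below computes it for a static superior/inferior amplitude window on a synthetic breathing trace with a slow baseline shift; the trace, drift rate, and window width are invented for illustration, not patient data.

      ```python
      # Duty-cycle calculation for a static SI amplitude gating window on a synthetic
      # breathing trace with a slow baseline shift; all numbers are illustrative.
      import numpy as np

      t = np.arange(0.0, 120.0, 0.1)                              # 2 minutes sampled at 10 Hz
      drift = 0.02 * t                                            # baseline shift, mm per second
      si_position = 5.0 * np.cos(2.0 * np.pi * t / 4.0) + drift   # SI target motion, mm

      window_center = si_position[:50].max()    # crude end-of-exhale estimate from the first 5 s
      window_halfwidth = 2.0                    # gating window half-width, mm

      beam_on = np.abs(si_position - window_center) <= window_halfwidth
      print(f"duty cycle with a static window: {beam_on.mean():.1%}")
      ```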

    17. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-02-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    18. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-10-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    19. Multiprocessor computing for images

      SciTech Connect (OSTI)

      Cantoni, V.; Levialdi, S.

      1988-08-01

      A review of image processing systems developed until now is given, highlighting the weak points of such systems and the trends that have dictated their evolution through the years producing different generations of machines. Each generation may be characterized by the hardware architecture, the programmability features and the relative application areas. The need for multiprocessing hierarchical systems is discussed focusing on pyramidal architectures. Their computational paradigms, their virtual and physical implementation, their programming and software requirements, and capabilities by means of suitable languages, are discussed.
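
      As a minimal illustration of the pyramidal, multi-resolution paradigm discussed in this review, the sketch below builds a mean pyramid by repeatedly averaging 2x2 blocks; it is a serial toy version of what a pyramidal multiprocessor would compute with one processor layer per level.

      ```python
      # Toy mean pyramid: each level halves the resolution by averaging 2x2 blocks.
      # In a pyramidal multiprocessor each level would live on its own processor layer.
      import numpy as np

      def build_pyramid(image):
          levels = [image.astype(float)]
          while levels[-1].shape[0] > 1 and levels[-1].shape[1] > 1:
              a = levels[-1]
              h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2   # drop odd edge rows/cols
              a = a[:h, :w]
              reduced = (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]) / 4.0
              levels.append(reduced)
          return levels

      rng = np.random.default_rng(0)
      pyramid = build_pyramid(rng.random((64, 64)))
      print([level.shape for level in pyramid])   # (64, 64), (32, 32), ..., (1, 1)
      ```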

    20. Applications of Parallel Computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applications of Parallel Computers, UCB CS267, Spring 2015, Tuesday & Thursday, 9:30-11:00 Pacific Time. Applications of Parallel Computers, CS267, is a graduate-level course...

    1. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial World Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.
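
      The talks above note that basic Monte Carlo simulation is embarrassingly parallel on distributed-memory clusters. The sketch below shows that pattern at the smallest scale, distributing independent batches of option-pricing paths across local worker processes; the market parameters are arbitrary, and a production credit-risk or pricing engine would be far more elaborate.

      ```python
      # Embarrassingly parallel Monte Carlo: independent batches of paths per worker,
      # here pricing a European call under Black-Scholes dynamics. Parameters are arbitrary.
      import math
      from multiprocessing import Pool

      import numpy as np

      S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0

      def batch_price(args):
          seed, n_paths = args
          rng = np.random.default_rng(seed)
          z = rng.standard_normal(n_paths)
          s_t = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
          return np.exp(-r * T) * np.maximum(s_t - K, 0.0).mean()

      if __name__ == "__main__":
          n_workers, paths_per_worker = 4, 250_000
          with Pool(n_workers) as pool:
              partials = pool.map(batch_price,
                                  [(seed, paths_per_worker) for seed in range(n_workers)])
          print(f"Monte Carlo call price: {np.mean(partials):.4f}")
      ```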

    2. Computing for Finance

      SciTech Connect (OSTI)

      2010-03-24

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN. 3. Opportunities for gLite in finance and related industries Adam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd. gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance community's compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance. 4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.

    4. Computer hardware fault administration

      DOE Patents [OSTI]

      Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

      2010-09-14

      Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
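
      The fault-administration idea summarized above, identifying a defective link in one network and routing traffic around it through a second, independent network, can be illustrated with a small graph model. The topologies, node count, and breadth-first detour search below are invented for illustration and are not the patented implementation.

      ```python
      # Conceptual sketch of routing around a defective link using a second,
      # independent network; toy topologies, not the patented mechanism.
      from collections import deque

      # Two independent networks over the same four compute nodes (adjacency lists).
      network_a = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}          # a simple chain
      network_b = {0: [2, 3], 1: [3], 2: [0], 3: [0, 1]}          # a different wiring

      def bfs_path(net, src, dst):
          """Shortest hop path from src to dst, or None if unreachable."""
          prev, queue = {src: None}, deque([src])
          while queue:
              node = queue.popleft()
              if node == dst:
                  path = []
                  while node is not None:
                      path.append(node)
                      node = prev[node]
                  return path[::-1]
              for nxt in net[node]:
                  if nxt not in prev:
                      prev[nxt] = node
                      queue.append(nxt)
          return None

      defective = (1, 2)   # link in network A reported as faulty
      net_a_degraded = {n: [m for m in adj if (n, m) != defective and (m, n) != defective]
                        for n, adj in network_a.items()}

      print("network A around fault:", bfs_path(net_a_degraded, 0, 3))   # None: chain is cut
      print("detour via network B:  ", bfs_path(network_b, 0, 3))        # [0, 3]
      ```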

    5. advanced simulation and computing

      National Nuclear Security Administration (NNSA)

      Each successive generation of computing system has provided greater computing power and energy efficiency.

      CTS-1 clusters will support NNSA's Life Extension Program and...

    6. Applied & Computational Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    7. Energy Aware Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Energy Aware Computing - Dynamic Frequency Scaling: One means to lower the energy ...

    8. Molecular Science Computing | EMSL

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computational and state-of-the-art experimental tools, providing a cross-disciplinary environment to further research.

    9. Climate Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Mirin, A A

      2007-02-05

      The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

    10. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only address emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since in 1998, 1999, 2000, 2002, and 2004. A significant format change was performed in the 2004 update of the BNA in that it was 'zero based.' Starting with the requirement documents, the 2004 BNA evaluated the requirements, and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents - the needs assessment (1) and the compliance assessment (2). The needs assessment will be referred to as the BNA and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where the modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection provided, the remote location and low population density of some the facilities. As such, the needs assessment contains equivalencies to the applicable requirements. The compliance assessment contains no such equivalencies and simply assesses the existing emergency response resources to the requirements of the BNA and can be updated as compliance changes independent of the BNA update schedule. There are numerous NFPA codes and standards and other requirements and guidance documents that address the subject of emergency response. These requirements documents are not always well coordinated and may contain duplicative or conflicting requirements or even coverage gaps. Left unaddressed, this regulatory situation results in frequent interpretation of requirements documents. Different interpretations can then lead to inconsistent implementation. This BNA addresses this situation by compiling applicable requirements from all identified sources (see Section 5) and analyzing them collectively to address conflict and overlap as applicable to the hazards presented by the LLNL and Sandia/CA sites (see Section 7). The BNA also generates requirements when needed to fill any identified gaps in regulatory coverage. 
Finally, the BNA produces a customized simple set of requirements, appropriate for the DOE protection goals, such as those defined in DOE O 420.1B, the hazard level, the population density, the topography, and the site layout at LLNL and Sandia/CA that will be used as the baseline requirements set - the 'baseline needs' - for emergency response at LLNL and Sandia/CA. A template approach is utilized to accomplish this evaluation for each of the nine topical areas that comprise the baseline needs for emergency response. The basis for conclusions reached in determining the baseline needs for each of the topical areas is presented in Sections 7.1 through 7.9. This BNA identifies only mandatory requirements and establishes the minimum performance criteria. The minimum performance criteria may not be the level of performance desired by Lawrence Livermore National Laboratory or Sandia/CA. Performance at levels greater than those established by this document will provide a higher level of fire safety, fire protection, or loss control and is encouraged. In Section 7, Determination of Baseline Needs, a standard template was used to describe the process, which involves separating basic emergency response needs into nine separate services. Each service being evaluated contains a determination of minimum requirements, an analysis of the requirements, a statement of minimum performance, and finally a summary of the minimum performance. The requirement documents, listed in Section 5, are those laws, regulations, DOE Directives, contractual obligations, or LLNL policies that establish service levels. The determination of minimum requirements section explains the rationale or method used to determine the minimum requirements.

    11. ORISE: Web Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Web Development As computer-based applications become increasingly popular for the delivery of health care training and information, the need for Web development in support of ...

    12. Etalon-induced baseline drift and correction in atom flux sensors based on atomic absorption spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element specific real-time flux sensing and control. The ultimate sensitivity and performance of these sensors are strongly affected by baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability, which has not been previously considered, and cannot be corrected using existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5% which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.
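
      The intensity modulation described above is the classic low-finesse Fabry-Perot (etalon) interference of a plane-parallel window. As a hedged sketch of the underlying relation (the symbols are generic and not taken from the record: window thickness d, refractive index n, surface reflectance R, internal propagation angle Ξ, and thermal expansion coefficient α), the transmitted intensity and its temperature sensitivity follow

          T(\delta) = \frac{1}{1 + F \sin^2(\delta/2)}, \qquad
          F = \frac{4R}{(1-R)^2}, \qquad
          \delta = \frac{4\pi n d \cos\theta}{\lambda},

          \frac{d\delta}{dT} \approx \frac{4\pi d \cos\theta}{\lambda}\left(\frac{dn}{dT} + n\,\alpha\right).

      Because dn/dT and α are nonzero for fused silica, slow viewport temperature drift sweeps ÎŽ and modulates the transmitted intensity, which is consistent with the beam-size and tilt mitigation described in the abstract.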

    13. Etalon-induced Baseline Drift And Correction In Atom Flux Sensors Based On Atomic Absorption Spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element specific, real-time flux sensing and control. The ultimate sensitivity and performance of the sensors are strongly affected by the long-term and short term baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability which has not been previously considered or corrected by existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5%, which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.

    14. Dual baseline search for muon antineutrino disappearance at 0.1 eVÂČ

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Cheng, G.; Huelsnitz, W.; Aguilar-Arevalo, A. A.; Alcaraz-Aunion, J. L.; Brice, S. J.; Brown, B. C.; Bugel, L.; Catala-Perez, J.; Church, E. D.; Conrad, J. M.; et al

      2012-09-25

      The MiniBooNE and SciBooNE collaborations report the results of a joint search for short baseline disappearance of Îœ̄ÎŒ at Fermilab’s Booster Neutrino Beamline. The MiniBooNE Cherenkov detector and the SciBooNE tracking detector observe antineutrinos from the same beam, therefore the combined analysis of their data sets serves to partially constrain some of the flux and cross section uncertainties. Uncertainties in the ΜΌ background were constrained by neutrino flux and cross section measurements performed in both detectors. A likelihood ratio method was used to set a 90% confidence level upper limit on Îœ̄ÎŒ disappearance that dramatically improves upon prior limits in the ΔmÂČ = 0.1–100 eVÂČ region.
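
      For orientation, short-baseline disappearance results such as this are conventionally expressed with the two-flavor survival probability; the textbook form below is provided for context and is not quoted from the record (L is the flight path in km, E the antineutrino energy in GeV, and ΔmÂČ is in eVÂČ):

          P(\bar{\nu}_\mu \to \bar{\nu}_\mu) = 1 - \sin^2 2\theta \, \sin^2\!\left( \frac{1.27\, \Delta m^2\, L}{E} \right)

      Because the two detectors sit at different distances from the same source, they sample different oscillation phases for a common flux, which is what lets the joint fit cancel much of the shared flux and cross section uncertainty.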

    15. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2014-01-02

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

    16. Reference manual for toxicity and exposure assessment and risk characterization. CERCLA Baseline Risk Assessment

      SciTech Connect (OSTI)

      1995-03-01

      The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, 1980) (CERCLA or Superfund) was enacted to provide a program for identifying and responding to releases of hazardous substances into the environment. The Superfund Amendments and Reauthorization Act (SARA, 1986) was enacted to strengthen CERCLA by requiring that site clean-ups be permanent, and that they use treatments that significantly reduce the volume, toxicity, or mobility of hazardous pollutants. The National Oil and Hazardous Substances Pollution Contingency Plan (NCP) (USEPA, 1985; USEPA, 1990) implements the CERCLA statute, presenting a process for (1) identifying and prioritizing sites requiring remediation and (2) assessing the extent of remedial action required at each site. The process includes performing two studies: a Remedial Investigation (RI) to evaluate the nature, extent, and expected consequences of site contamination, and a Feasibility Study (FS) to select an appropriate remedial alternative adequate to reduce such risks to acceptable levels. An integral part of the RI is the evaluation of human health risks posed by hazardous substance releases. This risk evaluation serves a number of purposes within the overall context of the RI/FS process, the most essential of which is to provide an understanding of "baseline" risks posed by a given site. Baseline risks are those risks that would exist if no remediation or institutional controls are applied at a site. This document was written to (1) guide risk assessors through the process of interpreting EPA BRA policy and (2) help risk assessors to discuss EPA policy with regulators, decision makers, and stakeholders as it relates to conditions at a particular DOE site.

    17. Idaho National Engineering Laboratory (INEL) Environmental Restoration (ER) Program Baseline Safety Analysis File (BSAF)

      SciTech Connect (OSTI)

      1995-09-01

      The Baseline Safety Analysis File (BSAF) is a facility safety reference document for the Idaho National Engineering Laboratory (INEL) environmental restoration activities. The BSAF contains information and guidance for safety analysis documentation required by the U.S. Department of Energy (DOE) for environmental restoration (ER) activities, including characterization of potentially contaminated sites; remedial investigations to identify, and remedial actions to clean up, existing and potential releases from inactive waste sites; and decontamination and dismantlement of surplus facilities. The information is INEL-specific and is in the format required by DOE-EM-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports. An author of safety analysis documentation need only write information concerning that activity and refer to BSAF for further information or copy applicable chapters and sections. The information and guidance provided are suitable for: nuclear facilities (DOE Order 5480.23, Nuclear Safety Analysis Reports) with hazards that meet the Category 3 threshold (DOE-STD-1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports); radiological facilities (DOE-EM-STD-5502-94, Hazard Baseline Documentation); and nonnuclear facilities (DOE-EM-STD-5502-94) that are classified as "low" hazard facilities (DOE Order 5481.1B, Safety Analysis and Review System). Additionally, the BSAF could be used as an information source for Health and Safety Plans and for Safety Analysis Reports (SARs) for nuclear facilities with hazards equal to or greater than the Category 2 thresholds, or for nonnuclear facilities with "moderate" or "high" hazard classifications.

    18. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

      SciTech Connect (OSTI)

      Maurakis, Eugene G

      2010-10-01

      Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts related to climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people’s awareness, knowledge and understanding of climate change and impacts on watersheds in a laboratory experience and interactive exhibits, through internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity with reasonable accuracy during most of the six thermal seasons where sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately teasing local human influences (e.g. urbanization and watershed alteration) from global anthropogenic impacts (e.g. climate change) on watersheds. Effective and skillful translations (e.g. annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) of results of scientific investigations are valuable ways of communicating information to the general public to enhance their understanding of climate change and its effects in watersheds.
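
      As an illustration of the kind of community metrics modeled in this study, the short sketch below computes taxon richness and the Shannon diversity index from per-taxon counts; the function name and the example counts are hypothetical, not data from the two watersheds.

          import math

          def richness_and_shannon(counts):
              """Return (richness, Shannon diversity H') for a list of per-taxon counts."""
              counts = [c for c in counts if c > 0]          # drop taxa that were not observed
              total = sum(counts)
              richness = len(counts)                         # number of taxa present
              shannon = -sum((c / total) * math.log(c / total) for c in counts)
              return richness, shannon

          # Hypothetical fish counts from one seasonal sample
          print(richness_and_shannon([120, 45, 30, 8, 2]))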

    19. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

    20. High energy neutron Computed Tomography developed

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      May 9, 2014. Neutron tomography horizontal "slice" of a tungsten and polyethylene test object containing tungsten carbide BBs. ...

    1. Advanced Materials Development through Computational Design ...

      Broader source: Energy.gov (indexed) [DOE]

      Presentation given at the 2007 Diesel Engine-Efficiency & Emissions Research Conference (DEER ... Office Merit Review 2015: High Temperature Materials for High Efficiency Engines ...

    2. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computational Fluid Dynamics Overview of CFD: Video Clip with Audio Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical

    3. Fracture Analysis of Vessels. Oak Ridge FAVOR, v06.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations

      SciTech Connect (OSTI)

      Williams, P. T.; Dickson, T. L.; Yin, S.

      2007-12-01

      The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.

    4. Argonne's Laboratory computing center - 2007 annual report.

      SciTech Connect (OSTI)

      Bair, R.; Pieper, G. W.

      2008-05-28

      Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

    5. Radiological Worker Computer Based Training

      Energy Science and Technology Software Center (OSTI)

      2003-02-06

      Argonne National Laboratory has developed an interactive computer based training (CBT) version of the standardized DOE Radiological Worker training program. This CD-ROM based program utilizes graphics, animation, photographs, sound and video to train users in ten topical areas: radiological fundamentals, biological effects, dose limits, ALARA, personnel monitoring, controls and postings, emergency response, contamination controls, high radiation areas, and lessons learned.

    6. Polymorphous computing fabric

      DOE Patents [OSTI]

      Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

      2011-01-18

      Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

    7. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing - from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry. Michael Yoo, Managing Director, Head of the Technical Council, UBS. Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial World. Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse. Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software.
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege o

    8. Fermilab | Science at Fermilab | Computing | High-performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Lattice QCD Farm at the Grid Computing Center at Fermilab. Lattice QCD Farm at the Grid Computing Center at Fermilab. Computing High-performance Computing A workstation computer can perform billions of multiplication and addition operations each second. High-performance parallel computing becomes necessary when computations become too large or too long to complete on a single such machine. In parallel computing, computations are divided up so that many computers can work on the same problem at

    9. The 1993 baseline biological studies and proposed monitoring plan for the Device Assembly Facility at the Nevada Test Site

      SciTech Connect (OSTI)

      Woodward, B.D.; Hunter, R.B.; Greger, P.D.; Saethre, M.B.

      1995-02-01

      This report contains baseline data and recommendations for future monitoring of plants and animals near the new Device Assembly Facility (DAF) on the Nevada Test Site (NTS). The facility is a large structure designed for safely assembling nuclear weapons. Baseline data was collected in 1993, prior to the scheduled beginning of DAF operations in early 1995. Studies were not performed prior to construction and part of the task of monitoring operational effects will be to distinguish those effects from the extensive disturbance effects resulting from construction. Baseline information on species abundances and distributions was collected on ephemeral and perennial plants, mammals, reptiles, and birds in the desert ecosystems within three kilometers (km) of the DAF. Particular attention was paid to effects of selected disturbances, such as the paved road, sewage pond, and the flood-control dike, associated with the facility. Radiological monitoring of areas surrounding the DAF is not included in this report.

    10. Vehicle Technologies Office Merit Review 2014: Computational design and development of a new, lightweight cast alloy for advanced cylinder heads in high-efficiency, light-duty engines FOA 648-3a

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    11. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

      SciTech Connect (OSTI)

      Oden, L.L.; O'Connor, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

      1993-11-19

      This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

    12. Computers for Learning

      Broader source: Energy.gov [DOE]

      Through Executive Order 12999, the Computers for Learning Program was established to provide Federal agencies a quick and easy system for donating excess and surplus computer equipment to schools...

    13. Computers in Commercial Buildings

      U.S. Energy Information Administration (EIA) Indexed Site

      Government-owned buildings of all types had, on average, more than one computer per person (1,104 computers per thousand employees). They also had a fairly high ratio of...

    14. Lawrence Livermore National Laboratory Emergency Response Capability 2009 Baseline Needs Assessment Performance Assessment

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection and was reviewed by Sandia/CA Fire Marshal, Martin Gresho. This document is the second of a two-part analysis of Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2009 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2004 BNA, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. On October 1, 2007, LLNL contracted with the Alameda County Fire Department to provide emergency response services. The level of service called for in that contract is the same level of service as was provided by the LLNL Fire Department prior to that date. This Compliance Assessment will evaluate fire department services beginning October 1, 2008 as provided by the Alameda County Fire Department.

    15. Tank Waste Remediation System retrieval and disposal mission technical baseline summary description

      SciTech Connect (OSTI)

      McLaughlin, T.J.

      1998-01-06

      This document is prepared in order to support the US Department of Energy's evaluation of readiness-to-proceed for the Waste Retrieval and Disposal Mission at the Hanford Site. The Waste Retrieval and Disposal Mission is one of three primary missions under the Tank Waste Remediation System (TWRS) Project. The other two include programs to characterize tank waste and to provide for safe storage of the waste while it awaits treatment and disposal. The Waste Retrieval and Disposal Mission includes the programs necessary to support tank waste retrieval, waste feed delivery, storage and disposal of immobilized waste, and closure of tank farms. This mission will enable the tank farms to be closed and turned over for final remediation. The Technical Baseline is defined as the set of science and engineering, equipment, facilities, materials, qualified staff, and enabling documentation needed to start up and complete the mission objectives. The primary purposes of this document are (1) to identify the important technical information and factors that should be used by contributors to the mission and (2) to serve as a basis for configuration management of the technical information and factors.

    16. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).
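
      One common internal-consistency check with redundant radiometers is closure between the three irradiance components, GHI ≈ DNI x cos(solar zenith angle) + DHI. The sketch below is illustrative only; the tolerance and example values are assumptions, not the SERI-QC limits actually applied at the SRRL.

          import math

          def closure_flag(ghi, dni, dhi, zenith_deg, tol=0.08):
              """Flag a record when measured GHI disagrees with the component sum
              (DNI * cos(zenith) + DHI) by more than a relative tolerance."""
              component_sum = dni * math.cos(math.radians(zenith_deg)) + dhi
              if component_sum <= 0:
                  return "night/indeterminate"
              rel_err = abs(ghi - component_sum) / component_sum
              return "pass" if rel_err <= tol else "fail"

          # Hypothetical 1-minute values (W/m^2) at a 30-degree solar zenith angle
          print(closure_flag(ghi=850.0, dni=900.0, dhi=90.0, zenith_deg=30.0))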

    17. LTC America's, Inc. PTC-6 vacuum system (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is possible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    18. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      1981-07-15

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).

    19. Supporting collaborative computing and interaction

      SciTech Connect (OSTI)

      Agarwal, Deborah; McParland, Charles; Perry, Marcia

      2002-05-22

      To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design.

    20. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computational Structural Mechanics Overview of CSM Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNAÂź code, is

    1. Multicore: Fallout from a Computing Evolution

      ScienceCinema (OSTI)

      Yelick, Kathy [Director, NERSC]

      2009-09-01

      July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

    2. Computing environment logbook

      DOE Patents [OSTI]

      Osbourn, Gordon C; Bouchard, Ann M

      2012-09-18

      A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
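
      A minimal sketch of the logbook behavior described above (event history, keyword search, and undo of a selected past event); the class and method names are invented for illustration and are not drawn from the patent.

          class Logbook:
              """Toy computing-environment logbook: records events, searches the history,
              and undoes a selected past event via its stored undo callback."""

              def __init__(self):
                  self.history = []                      # list of (description, undo_callback)

              def log(self, description, undo_callback):
                  self.history.append((description, undo_callback))

              def search(self, keyword):
                  return [d for d, _ in self.history if keyword in d]

              def undo(self, description):
                  for i, (d, undo_cb) in enumerate(self.history):
                      if d == description:
                          undo_cb()                      # revert the effect of that event
                          del self.history[i]
                          return True
                  return False

          # Example: log a simulated file creation, then undo it
          files = {"report.txt"}
          book = Logbook()
          book.log("created report.txt", lambda: files.discard("report.txt"))
          print(book.search("report"))                   # ['created report.txt']
          book.undo("created report.txt")
          print(files)                                   # set()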

    3. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematical and Computational Epidemiology (MCEpi), Los Alamos National Laboratory. Research areas include agent-based modeling; mixing patterns and social networks; mathematical epidemiology; social internet research; and uncertainty quantification, including quantifying model uncertainty in agent-based simulations for ...

    4. BNL ATLAS Grid Computing

      ScienceCinema (OSTI)

      Michael Ernst

      2010-01-08

      As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

    5. Scalable optical quantum computer

      SciTech Connect (OSTI)

      Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)

      2014-12-31

      A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr3+, regularly located in the lattice of the orthosilicate (Y2SiO5) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

    6. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2005-11-01

      The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

    7. Breckinridge Project, initial effort. Report VII, Volume II. Environmental baseline report

      SciTech Connect (OSTI)

      1982-01-01

      Ashland Synthetic Fuels, Inc. (ASFI) and Airco Energy Company, Inc. (AECI) have recently formed the Breckinridge Project and are currently conducting a process and economic feasibility study of a commercial scale facility to produce synthetic liquid fuels from coal. The coal conversion process to be used is the H-COAL process, which is in the pilot plant testing stage under the auspices of the US Department of Energy at the H-COAL Pilot Plant Project near Catlettsburg, Kentucky. The preliminary plans for the commercial plant are for an 18,140 metric ton/day (24,000 ton/day) nominal coal consumption capacity utilizing the abundant high sulfur Western Kentucky coals. The Western Kentucky area offers a source of the coal along with adequate water, power, labor, transportation and other factors critical to the successful siting of a plant. Various studies by federal and state governments, as well as private industry, have reached similar conclusions regarding the suitability of such plant sites in western Kentucky. Of the many individual sites evaluated, a site in Breckinridge County, Kentucky, approximately 4 kilometers (2.5 miles) west of the town of Stephensport, has been identified as the plant location. Actions have been taken to obtain options to ensure that this site will be available when needed. This report contains an overview of the regional setting and results of the baseline environmental studies. These studies include collection of data on ambient air and water quality, sound, aquatic and terrestrial biology and geology. This report contains the following chapters: introduction, review of significant findings, ambient air quality monitoring, sound, aquatic ecology, vegetation, wildlife, geology, soils, surface water, and ground water.

    8. Vandenberg Air Force Base integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Halverson, M.A.; Richman, E.E.; Dagle, J.E.; Hickman, B.J.; Daellenbach, K.K.; Sullivan, G.P.

      1993-06-01

      The US Air Force Space Command has tasked the Pacific Northwest Laboratory, as the lead laboratory supporting the US Department of Energy Federal Energy Management Program, to identify, evaluate, and assist in acquiring all cost-effective energy projects at Vandenberg Air Force Base (VAFB). This is a model program PNL is designing for federal customers served by the Pacific Gas and Electric Company (PG&E). The primary goal of the VAFB project is to identify all electric energy efficiency opportunities, and to negotiate with PG&E to acquire those resources through a customized demand-side management program for its federal clients. That customized program should have three major characteristics: (1) 100% up-front financing; (2) substantial utility cost-sharing; and (3) utility implementation through energy service companies under contract to the utility. A similar arrangement will be pursued with Southern California Gas for non-electric resource opportunities if that is deemed desirable by the site and if the gas utility seems open to such an approach. This report documents the assessment of baseline energy use at VAFB located near Lompoc, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Resource Assessment. This analysis examines the characteristics of electric, natural gas, fuel oil, and propane use for fiscal year 1991. It records energy-use intensities for the facilities at VAFB by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A more complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, and applicable losses.

    9. Baseline biological risk assessment for aquatic populations occurring near Eielson Air Force Base, Alaska

      SciTech Connect (OSTI)

      Dauble, D.; Brandt, C.; Lewis, R.; Smith, R.

      1995-12-31

      Eielson Air Force Base (AFB), Alaska was listed as a Superfund site in November 1989 with 64 potential source areas of contamination. As part of a sitewide remedial investigation, baseline risk assessments were conducted in 1993 and 1994 to evaluate hazards posed to biological receptors and to human health. Fish tissue, aquatic invertebrates, aquatic vegetation, sediment, and surface water data were collected from several on-site and off-site surface water bodies. An initial screening risk assessment indicated that several surface water sites along two major tributary creeks flowing through the base had unacceptable risks to both aquatic receptors and to human health because of DDTs. Other contaminants of concern (i.e., PCBs and PAHs) were below screening risk levels for aquatic organisms, but contributed to an unacceptable risk to human health. Additional samples were taken in 1994 to characterize the site-wide distribution of PAHs, DDTs, and PCBs in aquatic biota and sediments. Concentrations of PAHs followed the order invertebrates > aquatic vegetation > fish, but concentrations were sufficiently low that they posed no significant risk to biological receptors. Pesticides were detected in all fish tissue samples. Polychlorinated biphenyls (PCBs) were also detected in most fish from Garrison Slough. The pattern of PCB concentrations in Arctic grayling (Thymallus arcticus) was related to their proximity to a sediment source in lower Garrison Slough. Ingestion of PCB-contaminated fish is the primary human-health risk driver for surface water bodies on Eielson AFB, resulting in carcinogenic risks > 1 × 10^-4 for future recreational land-use at some sites. Principal considerations affecting uncertainty in the risk assessment process included spatial and temporal variability in media contaminant concentrations and inconsistencies between modeled and measured body burdens.
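
      For context, fish-ingestion carcinogenic risks of the kind quoted above are typically computed with the standard EPA intake-times-slope-factor relation; the sketch below uses that generic formula with invented parameter values, not the exposure assumptions or measured concentrations from the Eielson AFB assessment.

          def ingestion_cancer_risk(conc_mg_per_kg, ingestion_kg_per_day, exposure_freq_days,
                                    exposure_years, body_weight_kg, averaging_days, slope_factor):
              """Cancer risk = chronic daily intake (mg/kg-day) x oral slope factor (per mg/kg-day)."""
              cdi = (conc_mg_per_kg * ingestion_kg_per_day * exposure_freq_days * exposure_years) / (
                  body_weight_kg * averaging_days)
              return cdi * slope_factor

          # Hypothetical inputs: PCB concentration in fish tissue and a recreational angler scenario
          risk = ingestion_cancer_risk(conc_mg_per_kg=0.5, ingestion_kg_per_day=0.03,
                                       exposure_freq_days=90, exposure_years=30,
                                       body_weight_kg=70, averaging_days=70 * 365,
                                       slope_factor=2.0)
          print(f"{risk:.1e}")    # about 5e-5 for these invented values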

    10. Baseline risk assessment for exposure to contaminants at the St. Louis Site, St. Louis, Missouri

      SciTech Connect (OSTI)

      Not Available

      1993-11-01

      The St. Louis Site comprises three noncontiguous areas in and near St. Louis, Missouri: the St. Louis Downtown Site (SLDS), the St. Louis Airport Storage Site (SLAPS), and the Latty Avenue Properties. The main site of the Latty Avenue Properties includes the Hazelwood Interim Storage Site (HISS) and the Futura Coatings property, which are located at 9200 Latty Avenue. Contamination at the St. Louis Site is the result of uranium processing and disposal activities that took place from the 1940s through the 1970s. Uranium processing took place at the SLDS from 1942 through 1957. From the 1940s through the 1960s, SLAPS was used as a storage area for residues from the manufacturing operations at SLDS. The materials stored at SLAPS were bought by Continental Mining and Milling Company of Chicago, Illinois, in 1966, and moved to the HISS/Futura Coatings property at 9200 Latty Avenue. Vicinity properties became contaminated as a result of transport and movement of the contaminated material among SLDS, SLAPS, and the 9200 Latty Avenue property. This contamination led to the SLAPS, HISS, and Futura Coatings properties being placed on the National Priorities List (NPL) of the US Environmental Protection Agency (EPA). The US Department of Energy (DOE) is responsible for cleanup activities at the St. Louis Site under its Formerly Utilized Sites Remedial Action Program (FUSRAP). The primary goal of FUSRAP is the elimination of potential hazards to human health and the environment at former Manhattan Engineer District/Atomic Energy Commission (MED/AEC) sites so that, to the extent possible, these properties can be released for use without restrictions. To determine and establish cleanup goals for the St. Louis Site, DOE is currently preparing a remedial investigation/feasibility study-environmental impact statement (RI/FS-EIS). This baseline risk assessment (BRA) is a component of the process; it addresses potential risk to human health and the environment associated wi

    11. Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite

      SciTech Connect (OSTI)

      Mark C. Carroll; David T. Rohrbaugh

      2013-08-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that captures the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller grain, petroleum coke, extruded graphite from GrafTech was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically-significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
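
      As a sketch of the kind of grade-to-grade statistical comparison described above, the code below contrasts two hypothetical strength samples using their means, coefficients of variation, and Welch's t-test; the numbers are invented and are not measured NBG-18 or PCEA values.

          from statistics import mean, stdev
          from scipy import stats

          def compare_grades(sample_a, sample_b):
              """Summarize two strength samples and test whether their means differ."""
              summary = {
                  "mean_a": mean(sample_a), "cv_a": stdev(sample_a) / mean(sample_a),
                  "mean_b": mean(sample_b), "cv_b": stdev(sample_b) / mean(sample_b),
              }
              # Welch's t-test: does not assume equal variances between grades
              summary["t"], summary["p"] = stats.ttest_ind(sample_a, sample_b, equal_var=False)
              return summary

          # Hypothetical tensile strengths (MPa) for specimens of two graphite grades
          grade_1 = [21.5, 22.1, 20.8, 23.0, 21.9, 22.4]
          grade_2 = [19.7, 20.3, 21.0, 19.9, 20.6, 20.1]
          print(compare_grades(grade_1, grade_2))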

    12. M & V Shootout: Setting the Stage For Testing the Performance of New Energy Baseline

      SciTech Connect (OSTI)

      Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Granderson, Jessica; Jump, David; Taylor, Cody

      2015-07-01

      Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming data acquisition and often do not deliver results until years after the program period has ended. A spectrum of savings calculation approaches are used, with some relying more heavily on measured data and others relying more heavily on estimated or modeled data, or stipulated information. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. Energy management and information systems (EMIS) technologies, not only enable significant site energy savings, but are also beginning to offer M&V capabilities. This paper expands recent analyses of public-domain, whole-building M&V methods, focusing on more novel baseline modeling approaches that leverage interval meter data. We detail a testing procedure and metrics to assess the performance of these new approaches using a large test dataset. We also provide conclusions regarding the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods. Finally, we discuss the potential evolution of M&V to better support the energy efficiency industry through low-cost approaches, and the long-term agenda for validation of building energy analytics.
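
      Whole-building baseline models of this kind are usually scored against metered data with the normalized mean bias error (NMBE) and the coefficient of variation of the root-mean-square error (CV(RMSE)). The sketch below shows those two metrics with hypothetical arrays standing in for the test dataset; it is not the paper's own testing procedure.

          import numpy as np

          def nmbe(measured, predicted):
              """Normalized mean bias error, as a fraction of mean measured energy."""
              measured, predicted = np.asarray(measured), np.asarray(predicted)
              return (measured - predicted).sum() / (measured.size * measured.mean())

          def cv_rmse(measured, predicted):
              """Coefficient of variation of the RMSE, as a fraction of mean measured energy."""
              measured, predicted = np.asarray(measured), np.asarray(predicted)
              rmse = np.sqrt(((measured - predicted) ** 2).mean())
              return rmse / measured.mean()

          # Hypothetical hourly energy use (kWh): metered values vs. baseline-model predictions
          metered = [52.0, 48.5, 61.2, 57.8, 49.9]
          modeled = [50.3, 49.8, 59.0, 60.1, 48.7]
          print(nmbe(metered, modeled), cv_rmse(metered, modeled))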

    13. Baseline Utilization of Breast Radiotherapy Before Institution of the Medicare Practice Quality Reporting Initiative

      SciTech Connect (OSTI)

      Smith, Benjamin D. Smith, Grace L.; Roberts, Kenneth B.; Buchholz, Thomas A.

      2009-08-01

      Purpose: In 2007, Medicare implemented the Physician Quality Reporting Initiative (PQRI), which provides financial incentives to physicians who report their performance on certain quality measures. PQRI measure no. 74 recommends radiotherapy for patients treated with conservative surgery (CS) for invasive breast cancer. As a first step in evaluating the potential impact of this measure, we assessed baseline use of radiotherapy among women diagnosed with invasive breast cancer before implementation of PQRI. Methods and Materials: Using the SEER-Medicare data set, we identified women aged 66-70 diagnosed with invasive breast cancer and treated with CS between 2000 and 2002. Treatment with radiotherapy was determined using SEER and claims data. Multivariate logistic regression tested whether receipt of radiotherapy varied significantly across clinical, pathologic, and treatment covariates. Results: Of 3,674 patients, 94% (3,445) received radiotherapy. In adjusted analysis, the presence of comorbid illness (odds ratio [OR] 1.69; 95% confidence interval [CI], 1.19-2.42) and unmarried marital status were associated with omission of radiotherapy (OR 1.65; 95% CI, 1.22-2.20). In contrast, receipt of chemotherapy was protective against omission of radiotherapy (OR 0.25; 95% CI, 0.16-0.38). Race and geographic region did not correlate with radiotherapy utilization. Conclusions: Utilization of radiotherapy following CS was high for patients treated before institution of PQRI, suggesting that at most 6% of patients could benefit from measure no. 74. Further research is needed to determine whether institution of PQRI will affect radiotherapy utilization.
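
      The adjusted analysis described above is a multivariate logistic regression whose exponentiated coefficients are reported as odds ratios with 95% confidence intervals. A minimal sketch of that calculation is shown below; the synthetic data and variable names are placeholders, not the SEER-Medicare cohort.

          # Hedged sketch: logistic regression reported as odds ratios with 95% CIs,
          # the same style of analysis as in the abstract (variables here are hypothetical).
          import numpy as np
          import pandas as pd
          import statsmodels.api as sm

          def odds_ratios(df, outcome, predictors):
              X = sm.add_constant(df[predictors].astype(float))
              fit = sm.Logit(df[outcome].astype(float), X).fit(disp=0)
              table = pd.DataFrame({
                  'OR': np.exp(fit.params),
                  'CI_low': np.exp(fit.conf_int()[0]),
                  'CI_high': np.exp(fit.conf_int()[1]),
              })
              return table.drop(index='const')

          # Synthetic stand-in data for illustration only.
          rng = np.random.default_rng(0)
          demo = pd.DataFrame({
              'omitted_rt': rng.integers(0, 2, 500),     # 1 = radiotherapy omitted
              'comorbidity': rng.integers(0, 2, 500),
              'unmarried': rng.integers(0, 2, 500),
              'chemotherapy': rng.integers(0, 2, 500),
          })
          print(odds_ratios(demo, 'omitted_rt', ['comorbidity', 'unmarried', 'chemotherapy']))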

    14. Level III baseline risk evaluation for Building 3505 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      Mostella, W.B. Jr.

      1994-12-01

      The Level III Baseline Risk Evaluation (BRE) for Building 3505, the ORNL Metal Recovery Facility, provides an analysis of the potential for adverse health effects, current or future, associated with the presence of hazardous substances in the building. The Metal Recovery Facility was used from 1952 through 1960 to process large quantities of radioactive material using the PUREX process for the recovery of uranium-238, plutonium-239, neptunium-237, and americium-241. The facility consists of seven process cells (A through G), a canal, a dissolver room, a dissolver pit, an office, locker room, storage area, control room, electrical gallery, shop, and makeup area. The cells were used to house the nuclear fuel reprocessing equipment, and the canal was constructed to be used as a water-shielded transfer canal. Currently, there are no known releases of radioactive contaminants from Building 3505. To perform the BRE, historical radiological survey data were used to estimate the concentration of alpha- and beta/gamma emitting radionuclides in the various cells, rooms, and other areas in Building 3505. Data from smear surveys were used to estimate the amount of transferable contamination (to which receptors can be exposed via inhalation and ingestion), and data from probe surveys were used to estimate the amount of both fixed and transferable contamination (from which receptors can receive external exposure). Two land use scenarios, current and future, and their subsequent exposure scenarios were explored in the BRE. Under the current land use scenario, two exposure scenarios were evaluated. The first was a worst-case industrial exposure scenario in which the receptor is a maintenance worker who works 8 hours/day, 350 days/year in the building for 25 years. In the second, more realistic exposure scenario, the receptor is a surveillance and maintenance (S&M) worker who spends two 8-hour days/year in the building for 25 years.
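
      A small worked example of how far apart the two current-land-use occupancy assumptions are: the arithmetic below totals the exposure hours for the worst-case maintenance worker and the more realistic S&M worker using the figures quoted above. The full dose calculation, which would combine these hours with measured contamination levels and exposure-pathway factors, is not reproduced here.

          # Exposure-hour arithmetic for the two current-land-use scenarios described above.
          worst_case_hours = 8 * 350 * 25      # 8 h/day, 350 days/yr, 25 yr -> 70,000 h
          realistic_hours = 8 * 2 * 25         # two 8-hour days/yr, 25 yr -> 400 h

          print(f"Worst-case maintenance worker: {worst_case_hours:,} hours")
          print(f"S&M worker:                    {realistic_hours:,} hours")
          print(f"Ratio: {worst_case_hours / realistic_hours:.0f}x")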

    15. EA-2020: Energy Efficiency Standards for New Federal Low-Rise Residential Buildings’ Baseline Standards Update (RIN 1904-AD56)

      Broader source: Energy.gov [DOE]

      This EA will evaluate the potential environmental impacts of implementing the provisions in the Energy Conservation and Production Act (ECPA) that require DOE to update the baseline Federal energy efficiency performance standards for the construction of new Federal buildings, including low-rise residential buildings.

    16. Computers for artificial intelligence a technology assessment and forecast

      SciTech Connect (OSTI)

      Miller, R.K.

      1986-01-01

      This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

    17. Sandia Energy - High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      High Performance Computing | Energy Research | Advanced Scientific Computing Research (ASCR)

    18. Session on computation in biological pathways

      SciTech Connect (OSTI)

      Karp, P.D.; Riley, M.

      1996-12-31

      The papers in this session focus on the development of pathway databases and computational tools for pathway analysis. The discussion involves existing databases of sequenced genomes, as well as techniques for studying regulatory pathways.

    19. Tribal Energy Development - Process and Guide

      Energy Savers [EERE]

      Tribal Energy Development - Process & Guide: Integrate supply and demand alternatives. Tribal objectives: Energy Reliability & Security; Off-Grid Electrification; Minimize Environmental Impacts; Supply Diversification; Use of Local Resources; Economic Development; Jobs; Build technical expertise; Respect for Mother Earth; Others? Develop a community energy baseline. Develop a common Tribal energy vision. Identify and support a Tribal champion. Identify and evaluate resource options.

    20. Short-baseline electron neutrino disappearance, tritium beta decay, and neutrinoless double-beta decay

      SciTech Connect (OSTI)

      Giunti, Carlo; Laveder, Marco [INFN, Sezione di Torino, Via P. Giuria 1, I-10125 Torino (Italy); Dipartimento di Fisica G. Galilei, Universita di Padova, and INFN, Sezione di Padova, Via F. Marzolo 8, I-35131 Padova (Italy)

      2010-09-01

      We consider the interpretation of the MiniBooNE low-energy anomaly and the gallium radioactive source experiments anomaly in terms of short-baseline electron neutrino disappearance in the framework of 3+1 four-neutrino mixing schemes. The separate fits of MiniBooNE and gallium data are highly compatible, with close best-fit values of the effective oscillation parameters Δm² and sin²2θ. The combined fit gives Δm² ≳ 0.1 eV² and 0.11 ≲ sin²2θ ≲ 0.48 at 2σ. We consider also the data of the Bugey and Chooz reactor antineutrino oscillation experiments and the limits on the effective electron antineutrino mass in β decay obtained in the Mainz and Troitsk tritium experiments. The fit of the data of these experiments limits the value of sin²2θ below 0.10 at 2σ. Considering the tension between the neutrino MiniBooNE and gallium data and the antineutrino reactor and tritium data as a statistical fluctuation, we perform a combined fit which gives Δm² ≈ 2 eV² and 0.01 ≲ sin²2θ ≲ 0.13 at 2σ. Assuming a hierarchy of masses m₁, m₂, m₃ <
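
      In the 3+1 scheme considered here, short-baseline electron-neutrino disappearance follows the standard two-parameter survival probability P(νe→νe) = 1 − sin²2θ·sin²(1.27 Δm²[eV²]·L[m]/E[MeV]). The sketch below simply evaluates that formula near the combined best-fit values quoted above; the baselines and energy are arbitrary illustrative choices, not values from the paper.

          # Two-neutrino (effective 3+1) electron-neutrino survival probability,
          # evaluated near the combined best-fit parameters quoted in the abstract.
          import numpy as np

          def survival_probability(L_m, E_MeV, dm2_eV2, sin2_2theta):
              """P(nu_e -> nu_e) = 1 - sin^2(2theta) * sin^2(1.27 * dm2 * L / E)."""
              return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

          dm2 = 2.0      # eV^2, approximate combined best fit
          s22t = 0.07    # within the quoted 0.01-0.13 range at 2 sigma

          for L in (10.0, 20.0, 50.0):       # baselines in metres (illustrative only)
              p = survival_probability(L, E_MeV=3.0, dm2_eV2=dm2, sin2_2theta=s22t)
              print(f"L = {L:4.0f} m, E = 3 MeV: P(survival) = {p:.3f}")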

    1. INITIAL COMPARISON OF BASELINE PHYSICAL AND MECHANICAL PROPERTIES FOR THE VHTR CANDIDATE GRAPHITE GRADES

      SciTech Connect (OSTI)

      Carroll, Mark C

      2014-09-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR) design, a graphite-moderated, helium-cooled configuration that is capable of producing thermal energy for power generation as well as process heat for industrial applications that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties in nuclear-grade graphites by providing comprehensive data that captures the level of variation in measured values. In addition to providing a thorough comparison between these values in different graphite grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons both in specific properties and in the associated variability between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between each of the grades of graphite that are considered “candidate” grades from four major international graphite producers. These particular grades (NBG-18, NBG-17, PCEA, IG-110, and 2114) are the major focus of the evaluations presently underway on irradiated graphite properties through the series of Advanced Graphite Creep (AGC) experiments. NBG-18, a medium-grain pitch coke graphite from SGL from which billets are formed via vibration molding, was the favored structural material in the pebble-bed configuration. NBG-17 graphite from SGL is essentially NBG-18 with the grain size reduced by a factor of two. PCEA, petroleum coke graphite from GrafTech with a similar grain size to NBG-17, is formed via an extrusion process and was initially considered the favored grade for the prismatic layout. IG-110 and 2114, from Toyo Tanso and Mersen (formerly Carbone Lorraine), respectively, are fine-grain grades produced via an isomolding process. An analysis of the comparison between each of these grades will include not only the differences in fundamental and statistically-significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear-grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.

    2. NERSC Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NERSC computer security efforts are aimed at protecting NERSC systems and its users' intellectual property from unauthorized access or modification. Among NERSC's security goals are: 1. To protect NERSC systems from unauthorized access. 2. To prevent the interruption of services to its users. 3. To prevent misuse or abuse of NERSC resources. Security incidents: If you think there has been a computer security incident you should contact NERSC Security as soon as

    3. Edison Electrifies Scientific Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Edison Electrifies Scientific Computing Edison Electrifies Scientific Computing NERSC Flips Switch on New Flagship Supercomputer January 31, 2014 Contact: Margie Wylie, mwylie@lbl.gov, +1 510 486 7421 The National Energy Research Scientific Computing (NERSC) Center recently accepted "Edison," a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of

    4. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Astrophysics Consortium 3 - Supernovae, Gamma-Ray Bursts and Nucleosynthesis (Technical Report): Final project report for UCSC's participation in the Computational Astrophysics Consortium - Supernovae, Gamma-Ray Bursts and Nucleosynthesis. As an appendix, the report of the entire Consortium is also appended.

    5. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code. This visualization shows the temperature of ocean currents using a green and ...

    6. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... for use in Advanced Strategic Computing codes Theory and modeling of dense plasmas in ICF and astrophysics environments Theory and modeling of astrophysics in support of NASA ...

    7. Personal Computer Inventory System

      Energy Science and Technology Software Center (OSTI)

      1993-10-04

      PCIS is a database software system that is used to maintain a personal computer hardware and software inventory, track transfers of hardware and software, and provide reports.

    8. Molecular Science Computing: 2010 Greenbook

      SciTech Connect (OSTI)

      De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

      2010-04-02

      This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

    9. 60 Years of Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      60 Years of Computing

    10. ABB SCADA/EMS System INEEL Baseline Summary Test Report (November...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Work was performed by specialists in the fields of control system development, networking, software engineering, and cybersecurity. This report is the result of the team effort of ...

    11. Microsoft PowerPoint - Snippet 4.2 Integrated Baseline Review...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      its relationship to the underlying Earned Value Management (EVM) systems and processes ... process risk areas, as well as develop confidence in the project's operating plans. ...

    12. Appendix A - GPRA06 benefits estimates: MARKAL and NEMS model baseline cases

      SciTech Connect (OSTI)

      None, None

      2009-01-18

      NEMS is an integrated energy model of the U.S. energy system developed by the Energy Information Administration (EIA) for forecasting and policy analysis purposes.

    13. Information Science, Computing, Applied Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Science, Computing, Applied Math: National security depends on science ...

    14. 2011 Computation Directorate Annual Report

      SciTech Connect (OSTI)

      Crawford, D L

      2012-04-11

      From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s-all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile-far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. 
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global market place by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.

    15. Theory, Simulation, and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ADTSC Theory, Simulation, and Computation: Supporting the Laboratory's overarching strategy to provide cutting-edge tools to guide and interpret experiments and further our fundamental understanding and predictive capabilities for complex systems. Focus areas: theory, modeling, informatics; suites of experiment data; high performance computing, simulation, visualization. Contacts: Associate Director John Sarrao; Deputy Associate Director Paul Dotson; Directorate Office (505) 667-6645. Applying the Scientific

    16. ELECTRONIC DIGITAL COMPUTER

      DOE Patents [OSTI]

      Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

      1957-10-01

      The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using Von Seidel's method of approximation and performs the summations required for solving for the unknown terms by a method of successive approximations.
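
      The successive-approximation scheme referred to here (Seidel's method, today usually called Gauss-Seidel iteration) sweeps through the equations repeatedly, updating each unknown from the latest values of the others, and converges quickly for diagonally dominant systems. The following is a minimal software restatement of that scheme, not a description of the patented circuitry.

          # Minimal sketch of the Seidel (Gauss-Seidel) successive-approximation method
          # for a linear system A x = b.
          import numpy as np

          def seidel_solve(A, b, tol=1e-10, max_sweeps=500):
              A, b = np.asarray(A, float), np.asarray(b, float)
              x = np.zeros_like(b)
              for _ in range(max_sweeps):
                  x_old = x.copy()
                  for i in range(len(b)):
                      # Update x[i] using the latest available values of the other unknowns.
                      s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                      x[i] = (b[i] - s) / A[i, i]
                  if np.max(np.abs(x - x_old)) < tol:
                      break
              return x

          A = [[4.0, -1.0, 0.0],
               [-1.0, 4.0, -1.0],
               [0.0, -1.0, 4.0]]        # diagonally dominant, so the sweeps converge rapidly
          b = [15.0, 10.0, 10.0]
          print(seidel_solve(A, b))     # approaches the exact solution of A x = b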

    17. Computer Processor Allocator

      Energy Science and Technology Software Center (OSTI)

      2004-03-01

      The Compute Processor Allocator (CPA) provides an efficient and reliable mechanism for managing and allotting processors in a massively parallel (MP) computer. It maintains information in a database on the health. configuration and allocation of each processor. This persistent information is factored in to each allocation decision. The CPA runs in a distributed fashion to avoid a single point of failure.
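
      The allocator described above keeps a persistent record of each processor's health, configuration, and ownership and consults it on every allocation decision. The sketch below captures only that bookkeeping pattern with an in-memory table and invented names; the real CPA is a distributed implementation backed by a database.

          # Structural sketch (invented names) of allocator bookkeeping like CPA's:
          # track per-node health and ownership, allocate only healthy, free nodes.
          from dataclasses import dataclass, field
          from typing import Dict, Optional

          @dataclass
          class Node:
              node_id: int
              healthy: bool = True
              owner: Optional[str] = None    # job that currently holds the node, if any

          @dataclass
          class Allocator:
              nodes: Dict[int, Node] = field(default_factory=dict)

              def add_node(self, node_id, healthy=True):
                  self.nodes[node_id] = Node(node_id, healthy)

              def allocate(self, job_id, count):
                  free = [n for n in self.nodes.values() if n.healthy and n.owner is None]
                  if len(free) < count:
                      raise RuntimeError("not enough healthy free nodes")
                  chosen = free[:count]
                  for n in chosen:
                      n.owner = job_id          # record the assignment before handing nodes out
                  return [n.node_id for n in chosen]

              def release(self, job_id):
                  for n in self.nodes.values():
                      if n.owner == job_id:
                          n.owner = None

          alloc = Allocator()
          for i in range(8):
              alloc.add_node(i, healthy=(i != 3))   # node 3 marked unhealthy
          print(alloc.allocate("job-42", 4))        # e.g. [0, 1, 2, 4]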

    18. Indirection and computer security.

      SciTech Connect (OSTI)

      Berg, Michael J.

      2011-09-01

      The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

    19. DOE/SC-ARM/TR-095 The Microbase Value-Added Product: A Baseline Retrieval of Cloud

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Microbase Value-Added Product: A Baseline Retrieval of Cloud Microphysical Properties. M Dunn, K Johnson, M Jensen. May 2011.

    20. Electromagnetic analysis of forces and torques on the baseline and enhanced ITER shield modules due to plasma disruption.

      SciTech Connect (OSTI)

      Kotulski, Joseph Daniel; Coats, Rebecca Sue; Pasik, Michael Francis; Ulrickson, Michael Andrew

      2009-08-01

      An electromagnetic analysis is performed on the ITER shield modules under different plasma-disruption scenarios using the OPERA-3d software. The models considered include the baseline design as provided by the International Organization and an enhanced design that includes the more realistic geometrical features of a shield module. The modeling procedure is explained, electromagnetic torques are presented, and results of the modeling are discussed.

    1. Work Domain Analysis of a Predecessor Sodium-cooled Reactor as Baseline for AdvSMR Operational Concepts

      SciTech Connect (OSTI)

      Ronald Farris; David Gertman; Jacques Hugo

      2014-03-01

      This report presents the results of the Work Domain Analysis for the Experimental Breeder Reactor (EBR-II). This is part of the phase of the research designed to incorporate Cognitive Work Analysis in the development of a framework for the formalization of an Operational Concept (OpsCon) for Advanced Small Modular Reactors (AdvSMRs). For a new AdvSMR design, information obtained through Cognitive Work Analysis, combined with human performance criteria, can and should be used during the operational phase of a plant to assess the crew performance aspects associated with identified AdvSMR operational concepts. The main objective of this phase was to develop an analytical and descriptive framework that will help systems and human factors engineers to understand the design and operational requirements of the emerging generation of small, advanced, multi-modular reactors. Using EBR-II as a predecessor to emerging sodium-cooled reactor designs required the application of a method suitable to the structured and systematic analysis of the plant to assist in identifying key features of the work associated with it and to clarify the operational and other constraints. The analysis included the identification and description of operating scenarios that were considered characteristic of this type of nuclear power plant. This is an invaluable aspect of Operational Concept development since it typically reveals aspects of future plant configurations that will have an impact on operations. These include, for example, the effect of core design, different coolants, reactor-to-power conversion unit ratios, modular plant layout, modular versus central control rooms, plant siting, and many more. Multi-modular plants in particular are expected to have a significant impact on overall OpsCon in general, and human performance in particular. To support unconventional modes of operation, the modern control room of a multi-module plant would typically require advanced HSIs that would provide sophisticated operational information visualization, coupled with adaptive automation schemes and operator support systems to reduce complexity. These all have to be mapped at some point to human performance requirements. The EBR-II results will be used as a baseline that will be extrapolated in the extended Cognitive Work Analysis phase to the analysis of a selected advanced sodium-cooled SMR design as a way to establish non-conventional operational concepts. The Work Domain Analysis results achieved during this phase have not only established an organizing and analytical framework for describing existing sociotechnical systems, but have also indicated that the method is particularly suited to the analysis of prospective and immature designs. The results of the EBR-II Work Domain Analysis have indicated that the methodology is scientifically sound and generalizable to any operating environment.

    2. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

      SciTech Connect (OSTI)

      Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

      2012-06-01

      The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, combined with lower energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.

    3. Appendix A: GPRA08 benefits estimates: NEMS and MARKAL Model Baseline Cases

      SciTech Connect (OSTI)

      None, None

      2009-01-18

      This document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

    4. Computational Tools to Assess Turbine Biological Performance

      SciTech Connect (OSTI)

      Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

      2014-07-24

      Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
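
      At its core, the BioPA calculation described here samples exposure doses of an injury mechanism from CFD results and folds each dose through a laboratory dose-response relationship to estimate the likelihood of injury. The sketch below shows that composition for a single hypothetical mechanism with made-up numbers; the actual indicators, streamtrace sampling, and dose-response fits are considerably more involved.

          # Hedged sketch of a BioPA-style indicator: probability-weighted injury estimate
          # from simulated exposure doses and a dose-response curve (all numbers hypothetical).
          import numpy as np

          def injury_rate(dose, d50=150.0, slope=4.0):
              """Logistic dose-response: fraction injured at a given dose (hypothetical fit)."""
              return 1.0 / (1.0 + np.exp(-slope * (np.log(dose) - np.log(d50))))

          def exposure_indicator(doses, threshold=100.0):
              """Fraction of sampled exposures above a nominal dose threshold."""
              doses = np.asarray(doses, float)
              return np.mean(doses > threshold)

          # Doses would come from particle tracks through a CFD solution; here they are synthetic.
          rng = np.random.default_rng(1)
          doses = rng.lognormal(mean=4.0, sigma=0.6, size=10_000)

          print("exposure indicator:", exposure_indicator(doses))
          print("expected injury fraction:", np.mean(injury_rate(doses)))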

    5. Parallel computing in enterprise modeling.

      SciTech Connect (OSTI)

      Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

      2008-08-01

      This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
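
      Because entity-based models lack a natural spatial decomposition, one simple fallback, and roughly the pattern a plugin of this kind must automate, is to scatter entities across processes and exchange aggregate or interaction data each step. The skeleton below illustrates that pattern with mpi4py; the entity state, update rule, and exchange are placeholders rather than the Parallel Particle Data Model itself.

          # Skeleton of an entity-based simulation decomposed across MPI ranks (mpi4py).
          # The entity state and update rule are placeholders, not the actual Sandia plugin.
          from mpi4py import MPI
          import numpy as np

          comm = MPI.COMM_WORLD
          rank, size = comm.Get_rank(), comm.Get_size()

          N_ENTITIES = 100_000
          my_ids = np.arange(rank, N_ENTITIES, size)        # round-robin ownership of entities
          my_state = np.zeros(len(my_ids))                  # placeholder per-entity state

          for step in range(10):
              # Local update of owned entities (stand-in for the cognitive/agent sub-model).
              my_state += np.random.default_rng(step).random(len(my_ids))

              # Global reduction standing in for the per-step exchange of interaction data.
              total = comm.allreduce(my_state.sum(), op=MPI.SUM)
              if rank == 0:
                  print(f"step {step}: aggregate state {total:.1f}")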

    6. Computers as tools

      SciTech Connect (OSTI)

      Eriksson, I.V.

      1994-12-31

      The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic "machine," is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

    7. High performance computing in chemistry and massively parallel computers: A simple transition?

      SciTech Connect (OSTI)

      Kendall, R.A.

      1993-03-01

      A review of the various problems facing any software developer targeting massively parallel processing (MPP) systems is presented. Issues specific to computational chemistry application software will also be outlined. Computational chemistry software ported to and designed for the Intel Touchstone Delta Supercomputer will be discussed. Recommendations for future directions will also be made.

    8. Results of the 2004 Knowledge and Opinions Surveys for the Baseline Knowledge Assessment of the U.S. Department of Energy Hydrogen Program

      SciTech Connect (OSTI)

      Schmoyer, Richard L; Truett, Lorena Faith; Cooper, Christy

      2006-04-01

      The U.S. Department of Energy (DOE) Hydrogen Program focuses on overcoming critical barriers to the widespread use of hydrogen fuel cell technology. The transition to a new, hydrogen-based energy economy requires an educated human infrastructure. With this in mind, the DOE Hydrogen Program conducted statistical surveys to measure and establish baselines for understanding and awareness about hydrogen, fuel cells, and a hydrogen economy. The baseline data will serve as a reference in designing an education program, and it will be used in comparisons with future survey results (2008 and 2011) to measure changes in understanding and awareness. Scientific sampling was used to survey four populations: (1) the general public, ages 18 and over; (2) students, ages 12-17; (3) state and local government officials; and (4) potential large-scale hydrogen users. It was decided that the survey design should include about 1,000 individuals in each of the general public and student categories, about 250 state and local officials, and almost 100 large-scale end users. The survey questions were designed to accomplish specific objectives. Technical questions measured technical understanding and awareness of hydrogen technology. Opinion questions measured attitudes about safety, cost, the environment, and convenience, as well as the likelihood of future applications of hydrogen technology. For most of the questions, "I don't know" or "I have no opinion" were acceptable answers. Questions about information sources assessed how energy technology information is received. The General Public and Student Survey samples were selected by random digit dialing. Potential large-scale end users were selected by random sampling. The State and Local Government Survey was of the entire targeted population of government officials (not a random sample). All four surveys were administered by computer-assisted telephone interviewing (CATI). For each population, the length of the survey was less than 15 minutes. Design of an education program is beyond the scope of the report, and comparisons of the baseline data with future results will not be made until the survey is fielded again. Nevertheless, a few observations about the data are salient: For every population group, average scores on the technical knowledge questions were lower for the fuel cell questions than for the other technical questions. State and local officials expressed more confidence in hydrogen safety than large-scale end users, and they were much more confident than either the general public or students. State and local officials also scored much higher on the technical questions. Technical understanding appears to influence opinions about safety. For the General Public, Student, and Large-Scale End User Surveys, respondents with above-average scores on the eleven technical questions were more likely to have an opinion about hydrogen technology safety, and for those respondents who expressed an opinion, their opinion was more likely to be positive. These differences were statistically significant. Using criteria of "Sometimes" or "Frequently" to describe usage, respondents rated media sources for obtaining energy information. The general public and students responded that television is the primary media source of energy information. State and local officials and large-scale end users indicated that their primary media sources are newspapers, the Internet, and science and technology journals. 
In order of importance, the general public values safety, cost, environment, and convenience. The Large-Scale End User Survey suggests that there is presently little penetration of hydrogen technology; nor is there much planning for it.

    9. Representation of Limited Rights Data and Restricted Computer Software |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Department of Energy: Representation of Limited Rights Data and Restricted Computer Software. More Documents & Publications: CLB-1003.PDF; Intellectual Property Provisions (CSB-1003); Cooperative Agreement Research, Development, or Demonstration, Domestic Small Businesses; CDLB-1003.PDF.

    10. Present and Future Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Important for DOE Energy Frontier Mission 2 * TH HEP is new ... & PDSF (studies based on usage for end of Sep 2012 - Nov ... framework (Sherpa), and a library for the computation of ...

    11. Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Argonne Leadership Computing Facility Annual Report 2012. Contents: Director's Message; About ALCF; Introducing Mira.

    12. Cloud computing security.

      SciTech Connect (OSTI)

      Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

      2010-10-01

      Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

    13. Edison Electrifies Scientific Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Deployment of Edison was made possible in part by funding from DOE's Office of Science and the DARPA High Productivity Computing Systems program. DOE's Office of Science is the ...

    14. Systems, Methods and Computer Readable Media for Modeling Cell...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Systems, Methods and Computer Readable Media for Modeling Cell Performance Fade, Kinetic ... CellSage also supports battery system development by characterizing cell and string ...

    15. NREL: Water Power Research - Computer-Aided Engineering

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      sets of computer-aided engineering modeling tools to accelerate the development of marine hydrokinetic technologies and improve the performance of hydroelectric facilities. ...

    16. Ames Lab 101: Improving Materials with Advanced Computing

      ScienceCinema (OSTI)

      Johnson, Duane

      2014-06-04

      Ames Laboratory's Chief Research Officer Duane Johnson talks about using advanced computing to develop new materials and predict what types of properties those materials will have.

    17. Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Fun fact: Most systems require air conditioning or chilled water to cool super powerful supercomputers, but the Olympus supercomputer at Pacific Northwest National Laboratory is cooled by the location's 65 degree groundwater. Traditional cooling systems could cost up to $61,000 in electricity each year, but this more efficient setup uses 70 percent less energy. | Photo courtesy of PNNL.

    18. Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Key facts about the Argonne Leadership Computing Facility (September 2013). User support and services: skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. Catalysts are computational scientists with domain expertise who work directly with project principal investigators to maximize discovery and reduce time-to-solution.

    19. New TRACC Cluster Computer

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC Cluster Computer With the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster that has now been named Phoenix. Zephyr was acquired from Atipa technologies, and it is a 92-node system with each node having two AMD 16 core, 2.3 GHz, 32 GB processors. See also Computing Resources.

    20. Computational Modeling | Bioenergy | NREL

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NREL uses computational modeling to increase the efficiency of biomass conversion by rational design using multiscale modeling, applying theoretical approaches, and testing scientific hypotheses. Cellulosomes are complexes of protein scaffolds and enzymes that are highly effective in decomposing biomass. This is a snapshot of a coarse-grain model of complex cellulosome

    1. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Physics and Methods: Performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms. Image captions: growth and emissivity of a young galaxy hosting a supermassive black hole as calculated in the cosmological code ENZO and post-processed with the radiative transfer code AURORA; Rayleigh-Taylor turbulence imaging, the largest turbulence simulations to date. Topics: advanced multi-scale modeling; turbulence datasets; density iso-surfaces.

    2. Advanced Simulation and Computing

      National Nuclear Security Administration (NNSA)

      NA-ASC-117R-09-Vol.1-Rev.0, Advanced Simulation and Computing Program Plan FY09, October 2008. ASC Focal Point: Robert Meisner, Director, DOE/NNSA NA-121.2, 202-586-0908. Program Plan Focal Point for NA-121.2: Njema Frazier, DOE/NNSA NA-121.2, 202-586-5789. A publication of the Office of Advanced Simulation & Computing, NNSA Defense Programs.

    3. Compute Reservation Request Form

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Reservation Request Form: Users can request a scheduled reservation of machine resources if their jobs have special needs that cannot be accommodated through the regular batch system. A reservation brings some portion of the machine to a specific user or project for an agreed upon duration. Typically this is used for interactive debugging at scale or real time processing linked to some experiment or event. It is not intended to be used to guarantee fast

    4. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applied Computer Science: Innovative co-design of applications, algorithms, and architectures in order to enable scientific simulations at extreme scale. Group Leader: Linn Collins; Deputy Group Leader (Acting): Bryan Lally. Climate modeling visualization: results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code. This visualization shows the temperature of ocean currents using a green and blue color scale. These colors were

    5. Intro to computer programming, no computer required! | Argonne...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... "Computational thinking requires you to think in abstractions," said Papka, who spoke to computer science and computer-aided design students at Kaneland High School in Maple Park about ...

    6. Fermilab computing at the Intensity Frontier

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

      2015-12-23

      The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

    7. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Extreme Scale Computing, Co-design Informing system design, ensuring productive and efficient code Project Description To address the increasingly complex problems of the modern world, scientists at Los Alamos are pushing the scale of computing to the extreme, forming partnerships with other national laboratories and industry to develop supercomputers that can achieve "exaflop" speeds-that is, a quintillion (a million trillion) calculations per second. To put such speed in perspective,

    8. computational-fluid-dynamics-student-thesis

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Fluid Dynamics Student Thesis Abstract DEVELOPMENT OF A THREE-DIMENSIONAL SCOURING METHODOLOGY AND ITS IMPLEMENTATION IN A COMMERCIAL CFD CODE FOR OPEN CHANNEL FLOW OVER A FLOODED BRIDGE DECK The Computational Fluid Dynamics staff at TRACC is supporting three students from Northern Illinois University who are working for a Masters degree. The CFD staff is directing the thesis research and working with them on three projects: (1) a three-dimensional scour computation methodology for pressure flow

    9. LANL computer model boosts engine efficiency

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The KIVA model has been instrumental in helping researchers and manufacturers understand combustion processes, accelerate engine development and improve engine design and efficiency. September 25, 2012. KIVA simulation of an experimental engine with DOHC quasi-symmetric pent-roof combustion chamber and 4 valves.

    10. DOE ASSESSMENT SEAB Recommendations Related to High Performance Computing

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE ASSESSMENT: SEAB Recommendations Related to High Performance Computing. 1. Introduction. The Department of Energy (DOE) is planning to develop and deliver capable exascale computing systems by 2023-24. These systems are expected to have a one-hundred to one-thousand-fold increase in sustained performance over today's computing capabilities, capabilities critical to enabling the next-generation computing for national security, science, engineering, and large-scale data analytics needed to

    11. NNSA Announces Procurement of Penguin Computing Clusters to Support

      National Nuclear Security Administration (NNSA)

      Stockpile Stewardship at National Labs | National Nuclear Security Administration. October 20, 2015: The National Nuclear Security Administration's (NNSA's) Lawrence Livermore National Laboratory today announced the awarding of a subcontract to Penguin Computing - a leading developer of high-performance Linux cluster computing systems based in Silicon Valley - to bolster computing for

    12. Can Cloud Computing Address the Scientific Computing Requirements for DOE

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Researchers? Well, Yes, No and Maybe. January 30, 2012. Jon Bashor, Jbashor@lbl.gov, +1 510-486-5849. Magellan at NERSC. After a two-year study of the feasibility of cloud computing systems for meeting the ever-increasing computational needs of scientists,

    13. in High Performance Computing Computer System, Cluster, and Networking...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      iSSH v. Auditd: Intrusion Detection in High Performance Computing Computer System, Cluster, and Networking Summer Institute David Karns, New Mexico State University Katy Protin,...

    14. A common language for computer security incidents

      SciTech Connect (OSTI)

      John D. Howard; Thomas A Longstaff

      1998-10-01

      Much of the computer security information regularly gathered and disseminated by individuals and organizations cannot currently be combined or compared because a common language has yet to emerge in the field of computer security. A common language consists of terms and taxonomies (principles of classification) which enable the gathering, exchange and comparison of information. This paper presents the results of a project to develop such a common language for computer security incidents. This project results from cooperation between the Security and Networking Research Group at the Sandia National Laboratories, Livermore, CA, and the CERT® Coordination Center at Carnegie Mellon University, Pittsburgh, PA. This Common Language Project was not an effort to develop a comprehensive dictionary of terms used in the field of computer security. Instead, the authors developed a minimum set of high-level terms, along with a structure indicating their relationship (a taxonomy), which can be used to classify and understand computer security incident information. They hope these high-level terms and their structure will gain wide acceptance, be useful, and most importantly, enable the exchange and comparison of computer security incident information. They anticipate, however, that individuals and organizations will continue to use their own terms, which may be more specific both in meaning and use. They designed the common language to enable these lower-level terms to be classified within the common language structure.
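
      The practical payoff of a shared high-level taxonomy is that incident records from different organizations can be tagged with the same few terms and then pooled and compared mechanically. The toy sketch below illustrates that idea; the category names are loose abbreviations inspired by this kind of taxonomy, not the authoritative term lists from the paper.

          # Toy illustration of classifying incident records against a shared, high-level taxonomy.
          # Category values are abbreviated stand-ins, not the paper's full term lists.
          from collections import Counter
          from dataclasses import dataclass

          TAXONOMY = {
              "attacker": {"hacker", "spy", "corporate raider", "vandal"},
              "tool": {"script", "toolkit", "autonomous agent", "data tap"},
              "vulnerability": {"design", "implementation", "configuration"},
              "unauthorized_result": {"disclosure", "corruption", "theft of service", "denial of service"},
          }

          @dataclass
          class Incident:
              attacker: str
              tool: str
              vulnerability: str
              unauthorized_result: str

              def validate(self):
                  for field_name, allowed in TAXONOMY.items():
                      if getattr(self, field_name) not in allowed:
                          raise ValueError(f"{field_name} is not a recognized taxonomy term")

          reports = [
              Incident("hacker", "script", "implementation", "denial of service"),
              Incident("vandal", "toolkit", "configuration", "corruption"),
              Incident("hacker", "toolkit", "implementation", "disclosure"),
          ]
          for r in reports:
              r.validate()

          # Because every site uses the same terms, incident data can be pooled and compared.
          print(Counter(r.vulnerability for r in reports))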

    15. Exploring HPCS Languages in Scientific Computing

      SciTech Connect (OSTI)

      Barrett, Richard F; Alam, Sadaf R; de Almeida, Valmor F; Bernholdt, David E; Elwasif, Wael R; Kuehn, Jeffery A; Poole, Stephen W; Shet, Aniruddha G

      2008-01-01

      As computers scale up dramatically to tens and hundreds of thousands of cores, develop deeper computational and memory hierarchies, and exhibit increased heterogeneity, developers of scientific software are increasingly challenged to express complex parallel simulations effectively and efficiently. In this paper, we explore the three languages developed under the DARPA High-Productivity Computing Systems (HPCS) program to help address these concerns: Chapel, Fortress, and X10. These languages provide a variety of features not found in currently popular HPC programming environments and make it easier to express powerful computational constructs, leading to new ways of thinking about parallel programming. Though the languages and their implementations are not yet mature enough for a comprehensive evaluation, we discuss some of the important features and provide examples of how they can be used in scientific computing. We believe that these characteristics will be important to the future of high-performance scientific computing, whether the ultimate language of choice is one of the HPCS languages or something else.

    16. System configuration management plan for the TWRS controlled baseline database system [TCBD

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCMP. LHMC, TWRS TCBD Users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the user's requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO. Once this occurs, the TCBD will be completed and the software transferred to the system owner. The primary programming for the TCBD was developed by Selfware, Inc. The Automatic Data Processing (ADP) Acquisition Plan, Systems Development and Support Services for TWRS Performance Measurement and Control System, Acquisition Plan No. C94-705286 identifies work scope, deliverables, and cost.

    17. Computational design and analysis of flatback airfoil wind tunnel experiment.

      SciTech Connect (OSTI)

      Mayda, Edward A.; van Dam, C.P.; Chao, David D.; Berg, Dale E.

      2008-03-01

      A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge, or flatback, on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of airfoils with a 40% maximum thickness-to-chord ratio at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp-trailing-edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.
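      For a full-span two-dimensional model, the solid blockage ratio quoted above reduces to the model's maximum thickness divided by the tunnel height, so the thickness-to-chord ratio and blockage ratio together fix the chord. The arithmetic sketch below works that relationship out; the tunnel height is a placeholder value, not the UC Davis tunnel's actual dimension.

# Quick arithmetic sketch relating the numbers quoted in the abstract. For a full-span
# 2-D model, solid blockage ratio = (max thickness) / (tunnel height). The tunnel height
# used here is an assumed placeholder.

def chord_for_blockage(thickness_to_chord: float, blockage_ratio: float, tunnel_height_m: float) -> float:
    """Chord length giving the requested solid blockage for a full-span 2-D model."""
    max_thickness = blockage_ratio * tunnel_height_m   # t = (t/h) * h
    return max_thickness / thickness_to_chord          # c = t / (t/c)

if __name__ == "__main__":
    c = chord_for_blockage(thickness_to_chord=0.40, blockage_ratio=0.10, tunnel_height_m=1.0)
    print(f"chord = {c:.3f} m for an assumed 1.0 m tall test section")   # 0.250 m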

    18. Annual Report on Environmental Monitoring Activities for FY 1995 (Baseline Year) at Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      1996-06-01

      This report describes baseline contaminant release conditions for Waste Area Grouping (WAG) 6 at Oak Ridge National Laboratory (ORNL). The sampling approach and data analysis methods used to establish baseline conditions were presented in "Environmental Monitoring Plan for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee" (EMP). As outlined in the EMP, the purpose of the baseline monitoring year at WAG 6 was to determine the annual contaminant releases from the site during fiscal year 1995 (FY95) against which any potential changes in releases over time could be compared. The baseline year data set provides a comprehensive understanding of release conditions from all major waste units in the WAG through each major contaminant transport pathway. Due to a mandate to reduce all monitoring work, WAG 6 monitoring was scaled back and reporting efforts on the baseline year results are being minimized. This report presents the quantified baseline year contaminant flux conditions for the site and briefly summarizes other findings. All baseline data cited in this report will reside in the Oak Ridge Environmental Information System (OREIS) database, and will be available for use in future years as the need arises to identify potential release changes.

    19. computational-structural-mechanics-training

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Table of Contents Date Location Training Course: HyperMesh and HyperView April 12-14, 2011 Argonne TRACC Argonne, IL Introductory Course: Developing Compute-efficient, Quality Models with LS-PrePost® 3 on the TRACC Cluster October 21-22, 2010 Argonne TRACC West Chicago, IL Modeling and Simulation with LS-DYNA®: Insights into Modeling with a Goal of Providing Credible Predictive Simulations February 11-12, 2010 Argonne TRACC West Chicago, IL Introductory Course: Using LS-OPT® on the TRACC

    20. Baseline and Postremediation Monitoring Program Plan for the Lower East Fork Poplar Creek operable unit, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      1996-04-01

      This report was prepared in accordance with CERCLA requirements to present the plan for baseline and postremediation monitoring as part of the selected remedy. It provides the Environmental Restoration Program with information about the requirements to monitor for soil and terrestrial biota in the Lower East Fork Poplar Creek (LEFPC) floodplain; sediment, surface water, and aquatic biota in LEFPC; wetland restoration in the LEFPC floodplain; and human use of shallow groundwater wells in the LEFPC floodplain for drinking water. This document describes the monitoring program that will ensure that actions taken under Phases I and II of the LEFPC remedial action are protective of human health and the environment.

    1. Description of Model Data for SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      00-00: The Sandia 100-meter All-glass Baseline Wind Turbine Blade D. Todd Griffith, Brian R. Resor Sandia National Laboratories Wind and Water Power Technologies Department Introduction This document provides a brief description of model files that are available for the SNL100-00 blade [1]. For each file, codes used to create/read the model files are detailed (e.g. code version and date, description, etc). A summary of the blade model data is also provided from the design report [1]. A Design

    2. Extensible Computational Chemistry Environment

      Energy Science and Technology Software Center (OSTI)

      2012-08-09

      ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-a-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

    3. On the Use of an ER-213 Detonator to Establish a Baseline for the ER-486

      SciTech Connect (OSTI)

      Thomas, Keith A.; Liechty, Gary H.; Jaramillo, Dennis C.; Munger, Alan C.; McHugh, Douglas C.; Kennedy, James E.

      2014-08-19

      This report documents a series of tests using a TSD-115 fireset coupled with an ER-213, a gold exploding bridgewire (EBW) detonator. These tests were designed to fire this EBW with a smaller fireset to obtain current and voltage data as well as timing information at voltage levels below, above, and throughout the threshold firing region. This study could then create a database for comparison to our current ER-486 EBW development, which is designed to be a lower voltage (<500V) device.

    4. Information Science, Computing, Applied Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Science, Computing, Applied Math. National security depends on science and technology. The United States relies on Los Alamos National Laboratory for the best of both. No place on Earth pursues a broader array of world-class scientific endeavors. Computer, Computational, and Statistical Sciences (CCS) » High Performance Computing (HPC) » Extreme Scale Computing, Co-design » supercomputing

    5. Modeling of Electric Water Heaters for Demand Response: A Baseline PDE Model

      SciTech Connect (OSTI)

      Xu, Zhijie; Diao, Ruisheng; Lu, Shuai; Lian, Jianming; Zhang, Yu

      2014-09-05

      Demand response (DR) control can effectively relieve balancing and frequency regulation burdens on conventional generators, facilitate integrating more renewable energy, and reduce the generation and transmission investments needed to meet peak demands. Electric water heaters (EWHs) have great potential for implementing DR control strategies because: (a) EWH power consumption correlates strongly with daily load patterns; (b) they constitute a significant percentage of domestic electrical load; (c) the heating element is a resistor, without reactive power consumption; and (d) they can be used as energy storage devices when needed. Accurately modeling the dynamic behavior of EWHs is essential for designing DR controls. Various water heater models, simplified to different extents, have been published in the literature; however, few of them were validated against field measurements, which may result in inaccuracy when implementing DR controls. In this paper, a partial differential equation physics-based model, developed to capture detailed temperature profiles at different tank locations, is validated against field test data for more than 10 days. The developed model shows very good performance in capturing water thermal dynamics for benchmark testing purposes.
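      To illustrate the idea of resolving the vertical temperature profile inside the tank, the sketch below integrates a simple 1-D heat-diffusion equation with a localized heater term and a lumped ambient-loss term. It is a toy model, not the validated PNNL model; every parameter value, node count, and function name is an assumption for illustration only.

import numpy as np

# Minimal 1-D diffusion sketch of tank temperature stratification (illustrative only).
def simulate_tank(n=20, height=1.2, t_end=3600.0, dt=1.0,
                  alpha=1.5e-7, loss_coeff=1e-4, t_ambient=20.0,
                  heater_node=3, heater_on=True, heater_rate_c_per_s=0.005):
    """Return the vertical temperature profile (deg C) after t_end seconds."""
    dz = height / n
    temp = np.full(n, 50.0)                      # assumed initial uniform tank temperature
    for _ in range(int(t_end / dt)):
        lap = np.zeros(n)
        lap[1:-1] = (temp[2:] - 2 * temp[1:-1] + temp[:-2]) / dz**2
        lap[0] = (temp[1] - temp[0]) / dz**2     # zero-flux boundary at tank bottom
        lap[-1] = (temp[-2] - temp[-1]) / dz**2  # zero-flux boundary at tank top
        dtemp = alpha * lap - loss_coeff * (temp - t_ambient)
        if heater_on:
            dtemp[heater_node] += heater_rate_c_per_s   # lumped heating-element source term
        temp += dt * dtemp
    return temp

if __name__ == "__main__":
    print(np.round(simulate_tank(), 2))

      A field-validated model of the kind described in the abstract would add inlet/outlet flow terms and calibrated heat-transfer coefficients; the sketch only shows how a spatially resolved PDE captures stratification that a single-node tank model cannot.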

    6. NREL: Computational Science Home Page

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      high-performance computing, computational science, applied mathematics, scientific data management, visualization, and informatics. NREL is home to the largest high performance...

    7. computers | National Nuclear Security Administration

      National Nuclear Security Administration (NNSA)

      Sandia donates 242 computers to northern California schools Sandia National Laboratories electronics technologist Mitch Williams prepares the disassembly of 242 computers for ...

    8. Careers | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      At the Argonne Leadership Computing Facility, we are helping to redefine what's possible in computational science. With some of the most powerful supercomputers in the world and a ...

    9. Computer simulation | Open Energy Information

      Open Energy Info (EERE)

      Computer simulation. OpenEI Reference Library. Web Site: Computer simulation. Author: wikipedia. Published: wikipedia, 2013. DOI: Not Provided...

    10. Super recycled water: quenching computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Super recycled water: quenching computers Super recycled water: quenching computers New facility and methods support conserving water and creating recycled products. Using reverse ...

    11. Human-computer interface

      DOE Patents [OSTI]

      Anderson, Thomas G.

      2004-12-21

      The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
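      Purely as an illustrative sketch of the behavior described above, and not the patented implementation, the hypothetical function below ramps a resistive force up as the locus of interaction approaches a boundary and releases it once the boundary is crossed. All ranges, gains, and the ramp shape are assumptions.

# Illustrative force-feedback profile near a domain boundary (toy example).
def boundary_force(distance_to_boundary: float,
                   influence_range: float = 0.05,
                   max_force: float = 3.0) -> float:
    """Resistive force magnitude for a signed distance to the boundary.

    Positive distances are inside the current control domain; negative distances
    mean the boundary has been traversed, where the force is released.
    """
    if distance_to_boundary < 0.0:
        return 0.0                     # perceptible change: force drops once the boundary is crossed
    if distance_to_boundary >= influence_range:
        return 0.0                     # far from the boundary: no feedback
    x = 1.0 - distance_to_boundary / influence_range
    return max_force * x * x           # smooth ramp toward max_force at the boundary

if __name__ == "__main__":
    for d in (0.10, 0.04, 0.01, 0.0, -0.01):
        print(f"d = {d:+.2f} m -> force = {boundary_force(d):.2f} N")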

    12. Computation Directorate 2007 Annual Report

      SciTech Connect (OSTI)

      Henson, V E; Guse, J A

      2008-03-06

      If there is a single word that both characterized 2007 and dominated the thoughts and actions of many Laboratory employees throughout the year, it is transition. Transition refers to the major shift that took place on October 1, when the University of California relinquished management responsibility for Lawrence Livermore National Laboratory (LLNL), and Lawrence Livermore National Security, LLC (LLNS), became the new Laboratory management contractor for the Department of Energy's (DOE's) National Nuclear Security Administration (NNSA). In the 55 years under the University of California, LLNL amassed an extraordinary record of significant accomplishments, clever inventions, and momentous contributions in the service of protecting the nation. This legacy provides the new organization with a built-in history, a tradition of excellence, and a solid set of core competencies from which to build the future. I am proud to note that in the nearly seven years I have had the privilege of leading the Computation Directorate, our talented and dedicated staff has made far-reaching contributions to the legacy and tradition we passed on to LLNS. Our place among the world's leaders in high-performance computing, algorithmic research and development, applications, and information technology (IT) services and support is solid. I am especially gratified to report that through all the transition turmoil, and it has been considerable, the Computation Directorate continues to produce remarkable achievements. Our most important asset--the talented, skilled, and creative people who work in Computation--has continued a long-standing Laboratory tradition of delivering cutting-edge science even in the face of adversity. The scope of those achievements is breathtaking, and in 2007, our accomplishments span an amazing range of topics. From making an important contribution to a Nobel Prize-winning effort to creating tools that can detect malicious codes embedded in commercial software; from expanding BlueGene/L, the world's most powerful computer, by 60% and using it to capture the most prestigious prize in the field of computing, to helping create an automated control system for the National Ignition Facility (NIF) that monitors and adjusts more than 60,000 control and diagnostic points; from creating a microarray probe that rapidly detects virulent high-threat organisms, natural or bioterrorist in origin, to replacing large numbers of physical computer servers with small numbers of virtual servers, reducing operating expense by 60%, the people in Computation have been at the center of weighty projects whose impacts are felt across the Laboratory and the DOE community. The accomplishments I just mentioned, and another two dozen or so, make up the stories contained in this report. While they form an exceptionally diverse set of projects and topics, it is what they have in common that excites me. They share the characteristic of being central, often crucial, to the mission-driven business of the Laboratory. Computational science has become fundamental to nearly every aspect of the Laboratory's approach to science and even to the conduct of administration. It is difficult to consider how we would proceed without computing, which occurs at all scales, from handheld and desktop computing to the systems controlling the instruments and mechanisms in the laboratories to the massively parallel supercomputers. The reasons for the dramatic increase in the importance of computing are manifest. 
Practical, fiscal, or political realities make the traditional approach to science, the cycle of theoretical analysis leading to experimental testing, leading to adjustment of theory, and so on, impossible, impractical, or forbidden. How, for example, can we understand the intricate relationship between human activity and weather and climate? We cannot test our hypotheses by experiment, which would require controlled use of the entire earth over centuries. It is only through extremely intricate, detailed computational simulation that we can test our theories, and simulating weather and climate over the entire globe requires the most massive high-performance computers that exist. Such extreme problems are found in numerous laboratory missions, including astrophysics, weapons programs, materials science, and earth science.

    13. Baseline risk assessment of ground water contamination at the Uranium Mill Tailings Sites near Rifle, Colorado

      SciTech Connect (OSTI)

      1995-05-01

      The ground water project evaluates the nature and extent of ground water contamination resulting from the uranium ore processing activities. This report is a site specific document that will be used to evaluate current and future impacts to the public and the environment from exposure to contaminated ground water. Currently, no one is using the ground water and therefore, no one is at risk. However, the land will probably be developed in the future and so the possibility of people using the ground water does exist. This report examines the future possibility of health hazards resulting from the ingestion of contaminated drinking water, skin contact, fish ingestion, or contact with surface waters and sediments.

    14. SCFA lead lab technical assistance at Lawrence Berkeley National Laboratory: Baseline review of three groundwater plumes

      SciTech Connect (OSTI)

      Hazen, Terry; et al.

      2002-09-26

      During the closeout session, members of the technical assistance team conveyed to the site how impressed they were at the thoroughness of the site's investigation and attempts at remediation. Team members were uniformly pleased at the skilled detection work to identify sources, make quick remediation decisions, and change course when a strategy did not work well. The technical assistance team also noted that, to their knowledge, this is the only DOE site at which a world-class scientist has had primary responsibility for the environmental restoration activities. This has undoubtedly contributed to the successes observed and DOE should take careful note. The following overall recommendations were agreed upon: (1) The site has done a phenomenal job of characterization and identifying and removing source terms. (2) Technologies selected to date are appropriate and high impact, e.g. collection trenches are an effective remedial strategy for this complicated geology. The site should continue using technology that is adapted to the site's unique geology, such as the collection trenches. (3) The site should develop a better way to determine the basis of cleanup for all sites. (4) The sentinel well system should be evaluated and modified, if needed, to assure that the sentinel wells provide coverage to the current site boundary. Potential modifications could include installation, abandonment or relocation of wells based on the large amount of data collected since the original sentinel well system was designed. (5) Modeling to assist in remedial design and communication should continue. (6) The site should develop a plan to ensure institutional memory. (7) The most likely possibility for improving closure to 2006 is by removing the residual source of the Old Town plume and establishing the efficacy of remediation for the 51/64 plume.

    15. Low NOx/SOx burner retrofit for utility cyclone boilers. Baseline test report: Issue A

      SciTech Connect (OSTI)

      Moore, K.; Martin, L.; Smith, J.

      1991-05-01

      The Low NOx/SOx (LNS) Burner Retrofit for Utility Cyclone Boilers program consists of the retrofit and subsequent demonstration of the technology at Southern Illinois Power Cooperative's (SIPC's) 33-MW unit 1 cyclone boiler located near Marion, Illinois. The LNS Burner employs a simple innovative combustion process burning high-sulfur Illinois coal to provide substantial SO2 and NOx control within the burner. A complete series of boiler performance and characterization tests, called the baseline tests, was conducted in October 1990 on unit 1 of SIPC's Marion Station. The primary objective of the baseline test was to collect data from the existing plant that could provide a comparison of performance after the LNS Burner retrofit. These data could confirm the LNS Burner's SOx and NOx emissions control and any effect on boiler operation. Further, these tests would provide the project with experience in the operating characteristics of the host unit as well as engineering design information to minimize technical uncertainties in the application of the LNS Burner technology.

    16. GPU COMPUTING FOR PARTICLE TRACKING

      SciTech Connect (OSTI)

      Nishimura, Hiroshi; Song, Kai; Muriki, Krishna; Sun, Changchun; James, Susan; Qin, Yong

      2011-03-25

      This is a feasibility study of using a modern Graphics Processing Unit (GPU) to parallelize the accelerator particle tracking code. To demonstrate the massive parallelization features provided by GPU computing, a simplified TracyGPU program is developed for dynamic aperture calculation. Performance, issues, and challenges from introducing GPU are also discussed. General Purpose Computation on Graphics Processing Units (GPGPU) brings massive parallel computing capabilities to numerical calculation. However, the unique architecture of the GPU requires a comprehensive understanding of the hardware and programming model to be able to optimize existing applications well. In the field of accelerator physics, the dynamic aperture calculation of a storage ring, which is often the most time consuming part of accelerator modeling and simulation, can benefit from the GPU due to its embarrassingly parallel nature, which fits well with the GPU programming model. In this paper, we use the Tesla C2050 GPU, which consists of 14 multi-processors (MP) with 32 cores on each MP, for a total of 448 cores, to host thousands of threads dynamically. A thread is a logical execution unit of the program on the GPU. In the GPU programming model, threads are grouped into a collection of blocks. Within each block, multiple threads share the same code and up to 48 KB of shared memory. Multiple thread blocks form a grid, which is executed as a GPU kernel. A simplified code that is a subset of Tracy++ [2] is developed to demonstrate the possibility of using GPU to speed up the dynamic aperture calculation by having each thread track a particle.
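      The key point of the abstract is the one-particle-per-thread decomposition: dynamic aperture tracking is embarrassingly parallel because each particle is advanced independently. The sketch below illustrates that decomposition with NumPy vectorization standing in for the GPU threads; the one-turn map, tune, nonlinearity, and aperture are invented for illustration and are unrelated to Tracy++ or TracyGPU.

import numpy as np

def track(x, xp, turns=512, nu=0.31, k2=1.5, aperture=0.05):
    """Advance all particles in parallel and return a boolean 'survived' mask."""
    c, s = np.cos(2 * np.pi * nu), np.sin(2 * np.pi * nu)
    alive = np.ones(x.shape, dtype=bool)
    for _ in range(turns):
        # linear phase-space rotation followed by a thin sextupole-like kick
        x_new = c * x + s * xp
        xp_new = -s * x + c * xp + k2 * x_new**2
        x, xp = x_new, xp_new
        alive &= np.abs(x) < aperture          # particles outside the aperture are lost for good
        x = np.where(alive, x, 0.0)            # park lost particles so values stay finite
        xp = np.where(alive, xp, 0.0)
    return alive

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x0 = rng.uniform(-0.03, 0.03, 20_000)      # each array element plays the role of one GPU thread
    xp0 = rng.uniform(-0.03, 0.03, 20_000)
    print("surviving fraction:", track(x0, xp0).mean())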

    17. Synchronizing compute node time bases in a parallel computer

      DOE Patents [OSTI]

      Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

      2015-01-27

      Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
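      The claim describes a simple recipe: each node learns its transmission latency from the root, all nodes enter a barrier, the root broadcasts a pulse, and each node sets its time base to its latency so that every node agrees on when the pulse was sent. The single-process toy below walks through those steps; the tree, latency values, and clock model are invented for illustration, and on a real machine the pulse and barriers would be hardware collectives.

# Toy walk-through of the synchronization steps (illustrative assumptions throughout).
class ComputeNode:
    def __init__(self, node_id, latency_from_root_us):
        self.node_id = node_id
        # step 1: each node calculates its data transmission latency from the root
        self.latency_from_root_us = latency_from_root_us
        self.time_base_us = None

    def receive_pulse(self):
        # the wakeup unit wakes the pulse waiter; the time base is set equal to the latency,
        # so every node's clock is referenced to the instant the root sent the pulse
        self.time_base_us = self.latency_from_root_us

    def to_root_time(self, local_us):
        # map a local timestamp (measured from pulse receipt) onto the root's clock
        return self.time_base_us + local_us

def synchronize(latencies_us):
    nodes = [ComputeNode(i, lat) for i, lat in enumerate(latencies_us)]
    # local barrier, then global barrier, then the root sends the pulse (all implicit here)
    for node in nodes:
        node.receive_pulse()
    return nodes

if __name__ == "__main__":
    for node in synchronize([0.0, 1.3, 2.1, 2.1, 3.4]):
        print(f"node {node.node_id}: time base = {node.time_base_us} us, "
              f"local 10.0 us -> root time {node.to_root_time(10.0)} us")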

    18. Synchronizing compute node time bases in a parallel computer

      DOE Patents [OSTI]

      Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

      2014-12-30

      Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

    19. Computer Security Risk Assessment

      Energy Science and Technology Software Center (OSTI)

      1992-02-11

      LAVA/CS (LAVA for Computer Security) is an application of the Los Alamos Vulnerability Assessment (LAVA) methodology specific to computer and information security. The software serves as a generic tool for identifying vulnerabilities in computer and information security safeguards systems. Although it does not perform a full risk assessment, the results from its analysis may provide valuable insights into security problems. LAVA/CS assumes that the system is exposed to both natural and environmental hazards and to deliberate malevolent actions by either insiders or outsiders. The user, in the process of answering the LAVA/CS questionnaire, identifies missing safeguards in 34 areas ranging from password management to personnel security and internal audit practices. Specific safeguards protecting a generic set of assets (or targets) from a generic set of threats (or adversaries) are considered. There are four generic assets: the facility, the organization's environment; the hardware, all computer-related hardware; the software, the information in machine-readable form stored both on-line or on transportable media; and the documents and displays, the information in human-readable form stored as hard-copy materials (manuals, reports, listings in full-size or microform), film, and screen displays. Two generic threats are considered: natural and environmental hazards, storms, fires, power abnormalities, water and accidental maintenance damage; and on-site human threats, both intentional and accidental acts attributable to a perpetrator on the facility's premises.

    20. Applications in Data-Intensive Computing

      SciTech Connect (OSTI)

      Shah, Anuj R.; Adkins, Joshua N.; Baxter, Douglas J.; Cannon, William R.; Chavarría-Miranda, Daniel; Choudhury, Sutanay; Gorton, Ian; Gracio, Deborah K.; Halter, Todd D.; Jaitly, Navdeep; Johnson, John R.; Kouzes, Richard T.; Macduff, Matt C.; Marquez, Andres; Monroe, Matthew E.; Oehmen, Christopher S.; Pike, William A.; Scherrer, Chad; Villa, Oreste; Webb-Robertson, Bobbie-Jo M.; Whitney, Paul D.; Zuljevic, Nino

      2010-04-01

      This book chapter, to be published in Advances in Computers, Volume 78, in 2010, describes applications of data-intensive computing (DIC). This is an invited chapter resulting from a previous publication on DIC. This work summarizes efforts coming out of PNNL's Data Intensive Computing Initiative. Advances in technology have empowered individuals with the ability to generate digital content with mouse clicks and voice commands. Digital pictures, emails, text messages, home videos, audio, and webpages are common examples of digital content that are generated on a regular basis. Data-intensive computing facilitates human understanding of complex problems. Data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements through the development of new classes of software, algorithms, and hardware.