National Library of Energy BETA

Sample records for develop baseline computational

  1. Baseline Glass Development for Combined Fission Products Waste Streams

    SciTech Connect (OSTI)

    Crum, Jarrod V.; Billings, Amanda Y.; Lang, Jesse B.; Marra, James C.; Rodriguez, Carmen P.; Ryan, Joseph V.; Vienna, John D.

    2009-06-29

    Borosilicate glass was selected as the baseline technology for immobilization of the Cs/Sr/Ba/Rb (Cs), lanthanide (Ln), and transition metal fission product (TM) waste streams as part of a cost-benefit analysis study.[1] Vitrification of the combined waste streams has several advantages: it minimizes the number of waste forms, uses a proven technology, and yields a product similar to waste forms currently accepted for repository disposal. A joint study was undertaken by Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to develop acceptable glasses for the combined Cs + Ln + TM waste streams (Option 1) and the combined Cs + Ln waste streams (Option 2) generated by the AFCI UREX+ set of processes. This study aimed to develop baseline glasses for both combined waste stream options and to identify key waste components and their impact on waste loading. The elemental compositions of the four-corners study were used along with the available separations data to determine the effect of burnup, decay, and separations variability on estimated waste stream compositions.[2-5] Two different components/scenarios were identified that could limit waste loading of the combined Cs + Ln + TM waste streams, whereas the combined Cs + Ln waste stream has no single component that is perceived to limit waste loading. The combined Cs + Ln waste stream in a glass waste form will most likely be limited by heat due to the high activity of Cs and Sr isotopes.
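
    A minimal sketch of the waste-loading arithmetic implied above, with entirely hypothetical component fractions and glass limits: the achievable loading is capped by whichever waste component first reaches its tolerable concentration in the glass.

    ```python
    # Hypothetical sketch: waste loading is capped by the first component to
    # reach its solubility/processing limit in the glass.
    # max loading = min over components of (limit_i / waste_fraction_i)

    waste_composition = {"MoO3": 0.12, "noble_metals": 0.04, "Cs2O": 0.20}   # wt fraction of waste
    glass_limits      = {"MoO3": 0.025, "noble_metals": 0.003, "Cs2O": 0.15} # max wt fraction in glass

    def max_waste_loading(waste, limits):
        """Return (loading, limiting component) at which the first component
        reaches its glass limit."""
        loading, limiter = min((limits[c] / f, c) for c, f in waste.items() if f > 0)
        return min(loading, 1.0), limiter

    loading, limiter = max_waste_loading(waste_composition, glass_limits)
    print(f"max loading ~ {loading:.1%}, limited by {limiter}")
    ```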

  2. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    SciTech Connect (OSTI)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A. E-mail: russo@strw.leidenuniv.nl

    2013-12-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Database System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in "astronomical development" with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.
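
    As an illustration of the proposed metric, the sketch below tallies articles per country and year; the record layout and values are hypothetical stand-ins for the ADS records the authors analyzed.

    ```python
    # Toy version of the publication-count metric (hypothetical records).
    from collections import Counter

    # Each record: (year, country of affiliation, peer-reviewed flag)
    records = [
        (2009, "Kenya", True), (2009, "Kenya", False),
        (2010, "Bolivia", True), (2011, "Kenya", True),
    ]

    def publications_by_country(records, include_non_refereed=True):
        """Count articles per (country, year), optionally including
        non-peer-reviewed articles as the paper does when available."""
        counts = Counter()
        for year, country, refereed in records:
            if refereed or include_non_refereed:
                counts[(country, year)] += 1
        return counts

    print(publications_by_country(records))
    ```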

  3. developing-compute-efficient

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Developing Compute-efficient, Quality Models with LS-PrePost 3 on the TRACC Cluster Oct. ... with an emphasis on applying these capabilities to build computationally efficient models. ...

  4. Development Of Regional Climate Mitigation Baseline For A Dominant Agro-Ecological Zone Of Karnataka, India

    SciTech Connect (OSTI)

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  5. Initial Comparisons between the Advanced Technology Development Gen 2 Baseline Cells and Variant C Cells

    SciTech Connect (OSTI)

    Christophersen, Jon Petter; Motloch, Chester George; Wright, Randy Ben; Murphy, Timothy Collins; Belt, Jeffrey R; Ho, Chinh Dac; Bloom, Ira D.; Jones, S. A.; Battaglia, Vincent S.; Jungst, Rudy G.; Case, Herb L.; Sutula, Raymond A.; Barnes, James A.; Duong, Tien Q.

    2002-06-01

    The Advanced Technology Development Program is testing a second generation of lithium-ion cells, consisting of a baseline and three variant chemistries. The cathode composition of the Variant C chemistry was altered, with an increase in the aluminum dopant and a decrease in the cobalt dopant, to explore the impact on performance. However, the alteration resulted in a 20% drop in rated capacity. The Variant C average power fade is also higher, whereas capacity fade is higher for the Baseline cell chemistry. Initial results indicate that the Variant C chemistry will reach end of life sooner than the Baseline chemistry.

  6. Development of computer graphics

    SciTech Connect (OSTI)

    Nuttall, H.E.

    1989-07-01

    The purpose of this project was to screen and evaluate three graphics packages as to their suitability for displaying concentration contour graphs. The information to be displayed is from computer code simulations describing airborne contaminant transport. The three evaluated programs were MONGO (John Tonry, MIT, Cambridge, MA, 02139), Mathematica (Wolfram Research Inc.), and NCSA Image (National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign). After a preliminary investigation of each package, NCSA Image appeared to be significantly superior for generating the desired concentration contour graphs. Hence, subsequent work, described in this report, covers the implementation and testing of NCSA Image on both Apple Mac II and Sun 4 computers. NCSA Image includes several utilities (Layout, DataScope, HDF, and PalEdit) which were used in this study and installed on Dr. Ted Yamada's Mac II computer. Dr. Yamada provided two sets of air pollution plume data which were displayed using NCSA Image. Both sets were animated into a sequential expanding plume series.

  7. Tools for Closure Project and Contract Management: Development of the Rocky Flats Integrated Closure Project Baseline

    SciTech Connect (OSTI)

    Gelles, C. M.; Sheppard, F. R.

    2002-02-26

    This paper details the development of the Rocky Flats Integrated Closure Project Baseline - an innovative project management effort undertaken to ensure proactive management of the Rocky Flats Closure Contract in support of the Department's goal of achieving the safe closure of the Rocky Flats Environmental Technology Site (RFETS) in December 2006. The accelerated closure of RFETS is one of the most prominent projects within the Department of Energy (DOE) Environmental Management program. As the first major former weapons plant to be remediated and closed, it is a first-of-kind effort requiring the resolution of multiple complex technical and institutional challenges. Most significantly, the closure of RFETS is dependent upon the shipment of all special nuclear material and wastes to other DOE sites. The Department is actively working to strengthen project management across programs, and there is increasing external interest in this progress. The development of the Rocky Flats Integrated Closure Project Baseline represents a groundbreaking and cooperative effort to formalize the management of such a complex project across multiple sites and organizations. It is original in both scope and process; however, it provides a useful precedent for the other ongoing project management efforts within the Environmental Management program.

  8. Development of baseline water quality stormwater detention pond model for Chesapeake Bay catchments

    SciTech Connect (OSTI)

    Musico, W.J.; Yoon, J.

    1999-07-01

    An environmental impact assessment is required for every proposed development in the Commonwealth of Virginia to help identify areas of potential concern. The purpose of the Chesapeake Bay Local Assistance Department (CBLAD) Guidance Calculation Procedures is to ensure that development of previously constructed areas does not further exacerbate current problems of stormwater-induced eutrophication and downstream flooding. The methodology is based on post-development conditions that will not generate greater peak flows and will result in a 10% overall reduction of total phosphorus. Currently, several well-known models can develop hydrographs and pollutographs that accurately model the real response of a given watershed to any given rainfall event. However, the conventional method of achieving the desired peak flow reduction and pollutant removal is not a deterministic procedure; it is inherently a trial-and-error process. A method of quickly and accurately determining the required size of stormwater easements was developed to evaluate the effectiveness of alternative stormwater collection and treatment systems. In this method, predevelopment conditions were modeled first to estimate the peak flows and subsequent pollutant generation that can be used as a baseline for the post-development plan. The resulting stormwater easement estimates facilitate decision-making during the planning and development phase of a project. The design can be optimized for the minimum cost or the smallest possible pond size required for peak flow reduction and detention time, given the most basic data such as the inflow hydrograph and maximum allowable pond depth.
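
    The pond-sizing logic lends itself to a short illustration. This sketch (not the authors' model) routes a hypothetical post-development inflow hydrograph against the allowable predevelopment release rate and reports the peak storage the detention pond must provide.

    ```python
    # Minimal detention-storage estimate: accumulate inflow in excess of the
    # allowable release; the peak of that running balance sizes the pond.

    def required_storage(inflow, q_allow, dt):
        """inflow: list of flows (m^3/s) per time step; q_allow: allowable
        release (m^3/s); dt: time step (s). Returns peak storage (m^3)."""
        storage = peak = 0.0
        for q_in in inflow:
            storage = max(0.0, storage + (q_in - q_allow) * dt)
            peak = max(peak, storage)
        return peak

    # Hypothetical post-development hydrograph at 10-minute steps
    hydrograph = [0.2, 0.8, 1.5, 2.4, 1.9, 1.1, 0.5, 0.2]
    print(required_storage(hydrograph, q_allow=0.6, dt=600), "m^3 of storage")
    ```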

  9. Baseline data for the residential sector and development of a residential forecasting database

    SciTech Connect (OSTI)

    Hanford, J.W.; Koomey, J.G.; Stewart, L.E.; Lecar, M.E.; Brown, R.E.; Johnson, F.X.; Hwang, R.J.; Price, L.K.

    1994-05-01

    This report describes the Lawrence Berkeley Laboratory (LBL) residential forecasting database. It provides a description of the methodology used to develop the database and describes the data used for heating and cooling end-uses as well as for typical household appliances. This report provides information on end-use unit energy consumption (UEC) values of appliances and equipment, historical and current appliance and equipment market shares, appliance and equipment efficiency and sales trends, cost vs. efficiency data for appliances and equipment, product lifetime estimates, thermal shell characteristics of buildings, heating and cooling loads, shell measure cost data for new and retrofit buildings, baseline housing stocks, forecasts of housing starts, and forecasts of energy prices and other economic drivers. Model inputs and outputs, as well as all other information in the database, are fully documented with the source and an explanation of how they were derived.

  10. Baseline information development for energy smart schools -- applied research, field testing and technology integration

    SciTech Connect (OSTI)

    Xu, Tengfang; Piette, Mary Ann

    2004-08-05

    The original scope of work was to obtain and analyze existing and emerging data in four states: California, Florida, New York, and Wisconsin. The goal of this data collection was to deliver a baseline database, or recommendations for such a database, that could contain window and daylighting features and energy performance characteristics of Kindergarten through 12th grade (K-12) school buildings (or those of classrooms when available). In particular, data analyses were performed on the California Commercial End-Use Survey (CEUS) databases to understand school energy use, features of window glazing, and availability of daylighting in California K-12 schools. The outcomes from this baseline task can be used to assist in establishing a database of school energy performance, assessing applications of existing technologies relevant to window and daylighting design, and identifying future R&D needs. These are in line with the overall project goals as outlined in the proposal. Through the review and analysis of this data, it is clear that there are many compounding factors impacting energy use in K-12 school buildings in the U.S., and that there are various challenges in understanding the impact of K-12 classroom energy use associated with design features of window glazing and skylights. First, the energy data in the existing CEUS databases provide, at most, aggregated electricity and/or gas usage for building establishments that include other school facilities in addition to the classroom spaces. Although the percentage of classroom floor area in schools is often available from the databases, there is no additional information that can be used to quantitatively segregate the EUI for classroom spaces. In order to quantify the EUI for classrooms, sub-metering of energy usage by classrooms must be obtained. Second, magnitudes of energy use for electric lighting are not attainable from the existing databases, nor are the lighting levels contributed

  11. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    SciTech Connect (OSTI)

    Granderson, Jessica; Price, Phillip N.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
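
    A minimal sketch of the train/predict evaluation loop described above, scoring baseline-model predictions against meter data with two metrics conventional in M&V, CV(RMSE) and NMBE; the weekly figures are hypothetical.

    ```python
    import numpy as np

    def cv_rmse(actual, predicted):
        """Coefficient of variation of the RMSE, in percent."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        return 100 * np.sqrt(np.mean((a - p) ** 2)) / a.mean()

    def nmbe(actual, predicted):
        """Normalized mean bias error, in percent."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        return 100 * (a - p).sum() / (a.size * a.mean())

    # Hypothetical weekly kWh: meter data for the prediction period vs. the
    # baseline model's predictions (the model was fit on the training period).
    actual    = [2100, 2250, 1980, 2300]
    predicted = [2150, 2190, 2050, 2210]
    print(f"CV(RMSE) = {cv_rmse(actual, predicted):.1f}%  NMBE = {nmbe(actual, predicted):+.1f}%")
    ```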

  12. High energy neutron Computed Tomography developed

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    LANSCE now has a high-energy neutron imaging capability that can be deployed on WNR flight paths for unclassified and classified objects. May 9, 2014. Neutron tomography horizontal "slice" of a tungsten and polyethylene test object containing tungsten carbide BBs.

  13. Development And Implementation Of A Strategic Technical Baseline Approach For Nuclear Decommissioning And Clean Up Programmes In The UK

    SciTech Connect (OSTI)

    Brownridge, M.; Ensor, B.

    2008-07-01

    The NDA mission, as set out within the Energy Act 2004 and stated in the NDA strategy, is clear: 'to deliver a world class programme of safe, cost-effective, accelerated and environmentally responsible decommissioning of the UK's civil nuclear legacy in an open and transparent manner and with due regard to the socio-economic impacts on our communities.' Critical to achieving the NDA's main objective and overall mission is to accelerate and deliver clean-up programmes through the application of appropriate and innovative technology. The NDA remit also requires it to secure good practice by contractors and to carry out and promote research into matters relating to the decommissioning and clean up of nuclear installations and sites. The NDA has defined a strategic approach for the underpinning of operational and decommissioning activities whereby each nuclear site is required to set out within its Life Time Plan (LTP) the proposed technical baseline for those activities. This enables the robustness of the activities to be assessed, the gaps, opportunities and accompanying Research and Development (R&D) requirements to be highlighted, and investment to be targeted at key technical issues. The NDA also supports the development of a commercial framework where innovation is encouraged and improvements can be demonstrated against the technical baseline. In this paper we present the NDA's overall strategic approach, the benefits already realised, and the areas for continued development. In conclusion: the development and implementation of a strategic approach to robustly underpin the technical components of the lifetime plans for operational and decommissioning activities on NDA sites has been extremely successful. As well as showing how mature technology assumptions are and where the key gaps and risks lie, it has also provided a method for highlighting opportunities to improve on that baseline. The use of a common template across all NDA LTPs has enabled direct comparison

  14. Scalable Computational Chemistry: New Developments and Applications

    SciTech Connect (OSTI)

    Yuri Alexeev

    2002-12-31

    The computational part of the thesis is the investigation of titanium(II) chloride as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the reaction to be studied at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic, barrierless reaction. The TiCl{sub 2} catalyst removes a 50 kcal/mol activation energy barrier required for the reaction without the catalyst. The first step is interaction of TiCl{sub 2} with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants. This is the driving force for the entire reaction. Dynamic correlation plays a significant role because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. The author concludes that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, parallelization of different quantum chemistry methods is presented. The parallelization of code is becoming an important aspect of quantum chemistry code development. Two trends contribute to this: the overall desire to study large chemical systems and the desire to employ highly correlated methods, which are usually computationally and memory expensive. In the distributed data algorithms presented, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered. The SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is also

  15. Development of Computer-Aided Design Tools for Automotive Batteries...

    Broader source: Energy.gov (indexed) [DOE]

    Progress of Computer-Aided Engineering of Batteries (CAEBAT) Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries ...

  16. Baseline design/economics for advanced Fischer-Tropsch technology

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  17. Computational Tools to Accelerate Commercial Development

    SciTech Connect (OSTI)

    Miller, David C.

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  18. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect (OSTI)

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
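
    Steps (3)-(5) of the procedure amount to drawing each uncertain parameter from its assigned distribution and propagating the samples through the deterministic model. The sketch below illustrates that pattern with a toy dose equation and hypothetical distributions; it is not the RESRAD formulation.

    ```python
    # Monte Carlo propagation of parameter uncertainty through a toy dose model.
    import random

    def dose(ingestion_rate, concentration, dose_factor):
        """Toy model: dose = intake rate x concentration x dose coefficient."""
        return ingestion_rate * concentration * dose_factor

    random.seed(1)
    samples = []
    for _ in range(10_000):
        ing  = random.triangular(50, 200, 100)   # g/yr soil ingestion (hypothetical)
        conc = random.lognormvariate(0.0, 0.5)   # pCi/g, median 1.0 (hypothetical)
        dcf  = 1.2e-4                            # mrem per pCi, held fixed here
        samples.append(dose(ing, conc, dcf))

    samples.sort()
    print("median dose:", samples[len(samples) // 2])
    print("95th percentile:", samples[int(0.95 * len(samples))])
    ```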

  19. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution of the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and the effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single-crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
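
    As a minimal illustration of the dynamics being solved, the sketch below integrates the Landau-Lifshitz-Gilbert equation for a single macrospin in a constant effective field. Parameters are hypothetical; a real phase-field model couples many such cells through a microstructure-dependent effective field.

    ```python
    # Explicit Euler integration of the LLG equation for one unit magnetization:
    # dm/dt = -gamma/(1+alpha^2) * [ m x H + alpha * m x (m x H) ]
    import numpy as np

    GAMMA = 2.211e5   # gyromagnetic ratio, m/(A*s)
    ALPHA = 0.02      # Gilbert damping (dimensionless, hypothetical)

    def llg_step(m, h_eff, dt):
        pre = -GAMMA / (1.0 + ALPHA ** 2)
        mxh = np.cross(m, h_eff)
        m = m + dt * pre * (mxh + ALPHA * np.cross(m, mxh))
        return m / np.linalg.norm(m)   # keep |m| = 1

    m = np.array([1.0, 0.0, 0.1]); m /= np.linalg.norm(m)
    h = np.array([0.0, 0.0, 8.0e4])   # A/m, field along z
    for _ in range(20_000):
        m = llg_step(m, h, dt=1e-13)
    print("magnetization precessing/relaxing toward the field:", m)
    ```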

  20. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  1. NASA technical baseline

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Tag: NASA technical baseline. Curiosity's multi-mission ...

  2. Development of Computer-Aided Design Tools for Automotive Batteries | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    9_han_2012_o.pdf (3.61 MB). More Documents & Publications: Progress of Computer-Aided Engineering of Batteries (CAEBAT); Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT); Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

  3. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    SciTech Connect (OSTI)

    Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo

    2014-04-01

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC updated the treatment of the labor market, and tests were performed with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.
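
    The impact estimate is read off as the difference between the paired runs. A toy sketch with hypothetical state outputs (NCGEM itself solves a full general-equilibrium system):

    ```python
    # Baseline vs. alternative (20% manufacturing-productivity cut) comparison.
    baseline    = {"Michigan": 512.4, "Ohio": 489.1}   # hypothetical output, $B
    alternative = {"Michigan": 474.8, "Ohio": 460.7}   # hypothetical output, $B

    for state in baseline:
        delta = alternative[state] - baseline[state]
        print(f"{state}: {delta:+.1f} $B ({100 * delta / baseline[state]:+.1f}%)")
    ```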

  4. TWRS baseline system description

    SciTech Connect (OSTI)

    Lee, A.K.

    1995-03-28

    This document provides a description of the baseline system conceptualized for remediating the tank waste stored within the Hanford Site. Remediation of the tank waste will be performed by the Tank Waste Remediation System (TWRS). This baseline system description (BSD) document has been prepared to describe the current planning basis for the TWRS for accomplishing the tank waste remediation functions. The BSD document is not intended to prescribe firm program management strategies for implementing the TWRS. The scope of the TWRS Program includes managing existing facilities; developing technology for new systems; building, testing, and operating new facilities; and maintaining the system. The TWRS Program will manage the system used for receiving, safely storing, maintaining, treating, and disposing onsite, or packaging for offsite disposal, all tank waste. The scope of the TWRS Program encompasses existing facilities such as waste storage tanks, evaporators, pipelines, and low-level radioactive waste treatment and disposal facilities. It includes support facilities that comprise the total TWRS infrastructure, including upgrades to existing facilities or equipment and the addition of new facilities.

  5. Synthesis and Comparison of Baseline Avian and Bat Use, Raptor Nesting and Mortality Information from Proposed and Existing Wind Developments: Final Report.

    SciTech Connect (OSTI)

    Erickson, Wallace P.

    2002-12-01

    Primarily due to concerns generated from observed raptor mortality at the Altamont Pass (CA) wind plant, one of the first commercial electricity generating wind plants in the U.S., new proposed wind projects both within and outside of California have received a great deal of scrutiny and environmental review. A large amount of baseline and operational monitoring data have been collected at proposed and existing U.S. wind plants. The primary use of the avian baseline data collected at wind developments has been to estimate the overall project impacts (e.g., very low, low, moderate, and high relative mortality) on birds, especially raptors and sensitive species (e.g., state and federally listed species). In a few cases, these data have also been used for guiding placement of turbines within a project boundary. This new information has strengthened our ability to accurately predict and mitigate impacts from new projects. This report should assist various stakeholders in the interpretation and use of this large information source in evaluating new projects. This report also suggests that the level of baseline data (e.g., avian use data) required to adequately assess expected impacts of some projects may be reduced. This report provides an evaluation of the ability to predict direct impacts on avian resources (primarily raptors and waterfowl/waterbirds) using less than an entire year of baseline avian use data (one season, two seasons, etc.). This evaluation is important because pre-construction wildlife surveys can be one of the most time-consuming aspects of permitting wind power projects. For baseline data, this study focuses primarily on standardized avian use data usually collected using point count survey methodology and raptor nest survey data. In addition to avian use and raptor nest survey data, other baseline data is usually collected at a proposed project to further quantify potential impacts. These surveys often include vegetation mapping and state or

  6. Hazard baseline documentation

    SciTech Connect (OSTI)

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  7. Hazard Baseline Documentation

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1995-12-04

    This standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and non-radiological hazards for all EM facilities.

  8. Develop baseline computational model for proactive welding stress management to suppress helium induced cracking during weld repair

    Broader source: Energy.gov [DOE]

    There are over 100 nuclear power plants operating in the U.S., which generate approximately 20% of the nation’s electricity. These plants range from 15 to 40 years old. Extending the service lives...

  9. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David; Sahinidis, N.V.; Cozad, A.; Lee, A.; Kim, H.; Morinelly, J.; Eslick, J.; Yuan, Z.

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. The tools are used to develop an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  10. The role of customized computational tools in product development.

    SciTech Connect (OSTI)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  11. Climate-Science Computational End Station Development and Grand Challenge

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Team | Argonne Leadership Computing Facility. Figure caption: Total precipitable water, a measure of how much moisture is in the air, from a single moment in time in a global simulation of the atmosphere at a resolution of half a degree of latitude (figure provided by Mark Taylor, Sandia National Laboratories). PI Name: Warren Washington, Tom Bettge. PI Email:

  12. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1991

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  13. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect (OSTI)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction. 10 refs., 7 figs.

  14. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this report provides a summary overview of DOE's projected quantities of waste and materials for transportation. Data presented in this report were gathered as part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS), which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  15. Accelerating technology development through integrated computation and experimentation

    SciTech Connect (OSTI)

    Shekhawat, Dushyant; Srivastava, Rameshwar

    2013-01-01

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference Accelerating Technology Development through Integrated Computation and Experimentation, sponsored and organized by the United States Department of Energy's National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct. 28-Nov. 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO{sub 2}, H{sub 2}, and O{sub 2} production), (2) CO{sub 2} utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H{sub 2} production for fuel cells).

  16. Baseline LAW Glass Formulation Testing

    SciTech Connect (OSTI)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  17. Development of a Very Dense Liquid Cooled Compute Platform

    SciTech Connect (OSTI)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.
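
    For context, a DCIE figure of this kind is the ratio of IT equipment power to total facility power (the inverse of PUE); the sketch below uses hypothetical power draws that reproduce a value near 0.93.

    ```python
    # DCIE = IT power / total facility power; the overhead here is hypothetical.
    it_power_kw = 100.0                    # IT load
    overhead_kw = 7.5                      # cooling + power-delivery losses
    dcie = it_power_kw / (it_power_kw + overhead_kw)
    print(f"DCIE = {dcie:.2f}  (PUE = {1 / dcie:.2f})")
    ```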

  18. Present status of computational tools for maglev development

    SciTech Connect (OSTI)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  19. Computer security plan development using an expert system

    SciTech Connect (OSTI)

    Hunteman, W.J. ); Evans, R.; Brownstein, M.; Chapman, L. )

    1990-01-01

    The Computer Security Plan Assistant (SPA) is an expert system for reviewing Department of Energy (DOE) Automated Data Processing (ADP) Security Plans. DOE computer security policies require ADP security plans to be periodically reviewed and updated by all DOE sites. SPA is written in XI-Plus, an expert system shell. SPA was developed by BDM International, Inc., under sponsorship by the DOE Center for Computer Security at Los Alamos National Laboratory. SPA runs on an IBM or compatible personal computer. It presents a series of questions about the ADP security plan being reviewed. The SPA user references the ADP Security Plan and answers the questions. The SPA user reviews each section of the security plan, in any order, until all sections have been reviewed. The SPA user can stop the review process after any section and restart later. A Security Plan Review Report is available after the review of each section of the Security Plan. The Security Plan Review Report gives the user a written assessment of the completeness of the ADP Security Plan. SPA is being tested at Los Alamos and will soon be available to the DOE community.

  20. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1992

    SciTech Connect (OSTI)

    Not Available

    1992-12-31

    Bechtel, with Amoco as the main subcontractor, initiated a study on September 26, 1991, for the US Department of Energy's (DOE's) Pittsburgh Energy Technology Center (PETC) to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology. This 24-month study, with an approved budget of $2.3 million, is being performed under DOE Contract Number AC22-91PC90027. (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  1. Annual Technology Baseline

    Broader source: Energy.gov [DOE]

    The National Renewable Energy Laboratory is conducting a study sponsored by the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs) and a diverse set of potential futures (standard scenarios), initially for electric sector analysis. The primary product of the Annual Technology Baseline (ATB) project component includes detailed cost and performance data (both current and projected) for both renewable and conventional technologies. This data is presented in MS Excel.

  2. Development of Computer-Aided Design Tools for Automotive Batteries...

    Broader source: Energy.gov (indexed) [DOE]

    More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) Vehicle ...

  3. FED baseline engineering studies report

    SciTech Connect (OSTI)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  4. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  5. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  6. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  7. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.
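
    As a generic illustration of the dispersion calculations such models perform, the sketch below evaluates a textbook ground-reflected Gaussian plume; the coefficients are hypothetical and this is not the IBS code itself.

    ```python
    import math

    def plume_concentration(q, u, y, z, h, sy, sz):
        """Steady-state Gaussian plume with ground reflection.
        q: emission rate (g/s); u: wind speed (m/s); y, z: crosswind and
        vertical receptor coordinates (m); h: effective release height (m);
        sy, sz: dispersion sigmas (m), which in practice depend on downwind
        distance and atmospheric stability class."""
        lateral = math.exp(-y ** 2 / (2 * sy ** 2))
        vertical = (math.exp(-(z - h) ** 2 / (2 * sz ** 2))
                    + math.exp(-(z + h) ** 2 / (2 * sz ** 2)))
        return q / (2 * math.pi * u * sy * sz) * lateral * vertical

    # Hypothetical: 10 g/s release, 3 m/s wind, ground-level receptor ~500 m
    # downwind (sy, sz chosen for that distance under neutral stability)
    print(plume_concentration(q=10, u=3, y=0, z=0, h=20, sy=36, sz=19), "g/m^3")
    ```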

  8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation (PFS) model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use. Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary. Task 1--Baseline Design and Alternatives. Task 4--Process Flowsheet Simulation (PFS) Model, and Project Management and Staffing Report.

  9. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case, and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline and alternative economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; and Task 7, perform project management, technical coordination and other miscellaneous support functions. This report covers work done during the period and consists of four sections: Introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; and project management and staffing report.

  10. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1994

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of the study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks. Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use. Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary; Task 1--Baseline Design and Alternatives; Task 4--Process Flowsheet Simulation Model; and Project Management and Staffing Report.

  11. Energy Intensity Baselining and Tracking Guidance | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Technical Assistance » Better Plants » Energy Intensity Baselining and Tracking Guidance Energy Intensity Baselining and Tracking Guidance The Energy Intensity Baselining and Tracking Guidance for the Better Buildings, Better Plants Program helps companies meet the program's reporting requirements by describing the steps necessary to develop an energy consumption and energy intensity baseline and calculating consumption and intensity changes over time. Most of the calculation steps described
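    A minimal sketch of the baselining arithmetic the guidance describes, assuming illustrative plant data (the years, energy figures, and production figures below are invented for the example; Python is used for this and the other sketches in this document):

        # Energy intensity baseline sketch (illustrative data, not from the guidance).
        baseline_year = 2010
        records = {
            # year: (site energy use in MMBtu, production in tons)
            2010: (120_000, 50_000),
            2011: (118_500, 52_000),
            2012: (115_000, 54_500),
        }

        base_energy, base_output = records[baseline_year]
        base_intensity = base_energy / base_output   # MMBtu per ton in baseline year

        for year in sorted(records):
            energy, output = records[year]
            intensity = energy / output
            improvement = (base_intensity - intensity) / base_intensity * 100.0
            print(f"{year}: {intensity:.3f} MMBtu/ton, "
                  f"{improvement:.1f}% better than baseline")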

  12. Single ion implantation for solid state quantum computer development

    SciTech Connect (OSTI)

    Schenkel, Thomas; Meijers, Jan; Persaud, Arun; McDonald, Joseph W.; Holder, Joseph P.; Schneider, Dieter H.

    2001-12-18

    Several solid state quantum computer schemes are based on the manipulation of electron and nuclear spins of single donor atoms in a solid matrix. The fabrication of qubit arrays requires the placement of individual atoms with nanometer precision and high efficiency. In this article we describe first results from low dose, low energy implantations and our development of a low energy (<10 keV), single ion implantation scheme for ³¹Pq⁺ ions. When ³¹Pq⁺ ions impinge on a wafer surface, their potential energy (9.3 keV for P¹⁵⁺) is released, and about 20 secondary electrons are emitted. The emission of multiple secondary electrons allows detection of each ion impact with 100% efficiency. The beam spot on target is controlled by beam focusing and collimation. Exactly one ion is implanted into a selected area avoiding a Poissonian distribution of implanted ions.
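    The benefit of counting individual impacts, rather than relying on a fixed average dose, can be illustrated with a small simulation. In the sketch below the detector is assumed ideal (the 100% detection efficiency described above), and the 10,000 sites are illustrative:

        import math
        import random

        # Why impact counting matters: with a fixed average dose, the number of
        # ions landing on a site is Poisson-distributed, so "one ion on average"
        # rarely means exactly one ion. Counting secondary-electron bursts and
        # blanking the beam after one detected impact gives exactly one ion per
        # site (assuming ideal, 100%-efficient detection).

        def poisson_sample(mean):
            # Knuth's algorithm; keeps the sketch dependency-free.
            limit, k, p = math.exp(-mean), 0, 1.0
            while True:
                k += 1
                p *= random.random()
                if p <= limit:
                    return k - 1

        sites = 10_000
        counts = [poisson_sample(1.0) for _ in range(sites)]   # mean 1 ion/site
        exactly_one = sum(1 for n in counts if n == 1) / sites
        print(f"dose-based implantation: {exactly_one:.1%} of sites get exactly 1 ion")
        print("counted implantation:   100% of sites get exactly 1 ion")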

  13. Baseline Test Specimen Machining Report

    SciTech Connect (OSTI)

    Mark Carroll

    2009-08-01

    The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as nuclear grade, an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et al., 2007). One of the key components in the evaluation of these graphite types will be mechanical testing on

  14. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1994

    SciTech Connect (OSTI)

    1994-01-01

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. During the reporting period, work progressed on Tasks 1, 4, 5, 6 and 7. This report covers work done during the period and consists of six sections: introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; and project management and staffing report.

  15. Pinellas Plant Environmental Baseline Report

    SciTech Connect (OSTI)

    Not Available

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  16. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1992

    SciTech Connect (OSTI)

    1992-12-31

    The objectives of this study are to: Develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; Task 7, perform project management, technical coordination and other miscellaneous support functions. During the reporting period work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of five sections: introduction and summary; preliminary design for syngas production (Task 1); preliminary F-T reaction loop design (Task 1); development of a process simulation model (Task 4); and key personnel staffing report (Task 7).

  17. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  18. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  19. New DOE Office of Science support for CAMERA to develop computational...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    New DOE Office of Science support for CAMERA to develop computational mathematics for experimental facilities research ...

  20. Hanford Site technical baseline database

    SciTech Connect (OSTI)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  1. HEV America Baseline Test Sequence

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    BASELINE TEST SEQUENCE Revision 1 September 1, 2006 Prepared by Electric Transportation Applications Prepared by: Roberta Brayer Approved by: Donald B. Karner ©2005 Electric Transportation Applications All Rights Reserved HEV America Baseline Test Sequence HEV PERFORMANCE TEST PROCEDURE SEQUENCE The following test sequence shall be used for conduct of HEV America

  2. U.S. Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2008-09-12

    The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

  3. Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios...

    Open Energy Info (EERE)

    National Greenhouse Gas Emissions Baseline Scenarios: Learning from Experiences in Developing Countries Name Ethiopia-National Greenhouse Gas Emissions...

  4. Baseline Graphite Characterization: First Billet

    SciTech Connect (OSTI)

    Mark C. Carroll; Joe Lords; David Rohrbaugh

    2010-09-01

    The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments that are designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, only a limited amount of data can be generated based upon the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that will characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. The thorough understanding of this variability will provide added detail to the irradiated property data, and provide a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the
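    The within-billet versus between-billet variability the program sets out to quantify can be summarized with a simple variance decomposition. The sketch below uses invented strength values for three hypothetical billets; it illustrates the statistical idea only, not the program's actual analysis:

        import statistics as stats

        # Variance decomposition sketch: how much of the scatter in a property
        # lies within billets versus between billets. All values are invented.
        billets = {
            "billet_A": [23.1, 24.0, 22.8, 23.6],
            "billet_B": [25.2, 24.8, 25.9, 25.1],
            "billet_C": [22.5, 23.0, 22.2, 22.9],
        }

        all_values = [x for values in billets.values() for x in values]
        within = stats.mean(stats.variance(v) for v in billets.values())
        between = stats.variance([stats.mean(v) for v in billets.values()])

        print(f"grand mean strength     = {stats.mean(all_values):.2f} MPa")
        print(f"within-billet variance  = {within:.3f} MPa^2")
        print(f"between-billet variance = {between:.3f} MPa^2")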

  5. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  6. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.

  7. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing and Storage Requirements for FES J. Candy General Atomics, San Diego, CA Presented at DOE Technical Program Review Hilton Washington DC/Rockville Rockville, MD 19-20 March 2013 Drift waves and tokamak plasma turbulence Role in the context of fusion research * Plasma performance: In tokamak plasmas, performance is limited by turbulent radial transport of both energy and particles. * Gradient-driven: This turbulent

  8. U.S. Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-09-23

    This guide identifies key Performance Baseline (PB) elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development.

  9. computers

    National Nuclear Security Administration (NNSA)

    California.

    Retired computers used for cybersecurity research at Sandia National...

  10. Mid-Atlantic Baseline Studies Project | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Mid-Atlantic Baseline Studies Project Mid-Atlantic Baseline Studies Project Funded by the Department of Energy, along with a number of partners, the collaborative Mid-Atlantic Baseline Studies Project, led by the Biodiversity Research Institute (BRI), helps improve understanding of species composition and use of the Mid-Atlantic marine environment in order to promote more sustainable offshore wind development. This first-of-its-kind study along the Eastern Seaboard of the United States delivers

  11. Firm develops own EMS built on Apple computer

    SciTech Connect (OSTI)

    Pospisil, R.

    1982-04-05

    Firestone Fibers and Textile Co. programmed a $2000 desktop Apple II computer and special electronic panels designed by the engineering staff to perform process control and other energy-management functions. The system should reduce natural gas consumption 40% and save the company up to $75,000 a year by reducing the amount of hot air exhausted from fabric-treating ovens. The system can be expanded to control lights and space-conditioning equipment. The company is willing to negotiate with other firms to market the panels. The Apple II was chosen because it has a high capacity for data acquisition and testing and because of the available software. (DCK)

  12. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    SciTech Connect (OSTI)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
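    The regression-baseline idea can be made concrete with a small example. The sketch below uses invented data and a single-regressor temperature model as a stand-in for the paper's richer baseline models; it predicts a counterfactual load for a DR day and reports the residual error that limits how precisely the DR parameter is known:

        import statistics as stats

        # Counterfactual baseline sketch: fit load ~ a + b*temperature on non-DR
        # days, predict the DR day, and take observed-minus-predicted as the DR
        # parameter. All numbers are invented for illustration.
        temps = [28.0, 30.5, 26.0, 31.0, 29.5, 27.5]        # non-DR days, deg C
        loads = [410.0, 455.0, 380.0, 462.0, 440.0, 398.0]  # non-DR days, kW

        n = len(temps)
        t_bar, l_bar = stats.mean(temps), stats.mean(loads)
        b = sum((t - t_bar) * (l - l_bar) for t, l in zip(temps, loads)) \
            / sum((t - t_bar) ** 2 for t in temps)
        a = l_bar - b * t_bar

        residuals = [l - (a + b * t) for t, l in zip(temps, loads)]
        sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # model error, kW

        dr_temp, dr_load = 30.0, 405.0   # observed on the DR event day
        baseline = a + b * dr_temp       # counterfactual consumption estimate
        dr_param = baseline - dr_load    # estimated demand reduction, kW
        print(f"demand reduction = {dr_param:.1f} kW (baseline error ~ {sigma:.1f} kW)")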

  13. Tank waste remediation system technical baseline summary description

    SciTech Connect (OSTI)

    Raymond, R.E.

    1998-01-08

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations.

  14. Baseline Wind Energy Facility | Open Energy Information

    Open Energy Info (EERE)

    Wind Energy Facility Name Baseline Wind Energy Facility Facility Baseline Wind Energy Facility Sector Wind energy Facility Type Commercial Scale Wind...

  15. Engineering task plan TWRS technical baseline completion

    SciTech Connect (OSTI)

    Moore, T.L.

    1996-03-08

    The Tank Waste Remediation System (TWRS) includes many activities required to remediate the radioactive waste stored in underground waste storage tanks. These activities include routine monitoring of the waste, facilities maintenance, upgrades to existing equipment, and installation of new equipment necessary to manage, retrieve, process, and dispose of the waste. In order to ensure that these multiple activities are integrated, cost effective, and necessary, a sound technical baseline is required from which all activities can be traced and measured. The process by which this technical baseline is developed will consist of the identification of functions, requirements, architecture, and test (FRAT) methodology. This process must be completed for TWRS to a level that provides the technical basis for all facility/system/component maintenance, upgrades, or new equipment installation.

  16. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J. Intended for: Public Purpose: This poster was prepared for the June 2013 Individual Permit for Storm Water (IP) public meeting. The purpose of the meeting was to update the public on implementation of the permit as required under Part 1.I (7) of the IP (National Pollutant Discharge Elimination System Permit No.

  17. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy

  18. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, Configuration, and Problems on Bassi. Richard Gerber, Lawrence Berkeley National Laboratory, NERSC User Services. ScicomP 13, Garching bei München, Germany, July 17, 2007. Overview * About Bassi * Memory on Bassi * Large Page Memory (It's Great!) * System Configuration * Large Page

  19. Development of Computer-Aided Design Tools for Automotive Batteries |

    Broader source: Energy.gov (indexed) [DOE]

    Monitoring O2 and NOx in Combustion Environments | Department of Energy Compact sensors have been developed to allow for real-time monitoring of O2 and NOx during combustion. deer08_singh.pdf (396.99 KB)

  20. Open source development experience with a computational gas-solids flow code

    SciTech Connect (OSTI)

    Syamlal, M; O'Brien, T. J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study on the use of open source (OS) software development in chemical engineering research and education is presented here. The multiphase computational fluid dynamics software MFIX is the object of the case study. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow and the dissemination of information to other areas such as geotechnical and volcanology research are demonstrated. It is shown that the advantages of OS development methodology were realized: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

    1. Computational and Experimental Development of Novel High Temperature Alloys

      SciTech Connect (OSTI)

      Kramer, M.J.; Ray, P.K.; and Akinc, M.

      2010-06-29

      The work done in this paper is based on our earlier work on developing an extended Miedema model and then using it to downselect potential alloy systems. Our approach is to closely couple the semi-empirical methodologies to more accurate ab initio methods to identify the best candidates for ternary alloying additions. The architectural framework for our materials design is a refractory base metal with a high temperature intermetallic which provides both high temperature creep strength and a source of oxidatively stable elements. Potential refractory base metals are groups IIIA, IVA and VA. For fossil applications, Ni-Al appears to be the best choice to provide the source of oxidatively stable elements, but this system requires a 'boost' in melting temperatures to be a viable candidate in the ultra-high temperature regime (>1200°C). Some late transition metals and noble elements are known to increase the melting temperature of Ni-Al phases. Such an approach suggested that a Mo-Ni-Al system would be a good base alloy system that could be further improved upon by adding platinum group metals (PGMs). In this paper, we demonstrate the variety of microstructures that can be synthesized for the base alloy system, its oxidation behavior, as well as the oxidation behavior of the PGM-substituted, oxidation-resistant B2 NiAl phase.

    2. Vehicle Technologies Office Merit Review 2015: Development of Computer-Aided Design Tools for Automotive Batteries

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    3. Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

      Broader source: Energy.gov [DOE]

      Presentation given by CD-Adapco at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    4. Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    5. NREL Supports Industry to Develop Computer-Aided Engineering Tools for Car

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Batteries July 7, 2011 The U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) recently awarded three industry teams, after a competitive procurement process, a total of $7 million for the development of computer-aided software design tools to help produce the next generation of electric drive vehicle (EDV) batteries. These projects support DOE's

    6. New DOE Office of Science support for CAMERA to develop computational

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      mathematics for experimental facilities research September 22, 2015 Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov Experimental science is evolving. With the advent of new technology, scientific facilities are collecting data at

    7. Laboratory Directed Research & Development Page National Energy Research Scientific Computing Center

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Directed Research & Development Page National Energy Research Scientific Computing Center T3E Individual Node Optimization Michael Stewart, SGI/Cray, 4/9/98 * Introduction * T3E Processor * T3E Local Memory * Cache Structure * Optimizing Codes for Cache Usage * Loop Unrolling * Other Useful Optimization Options * References Introduction * Primary topic will be single processor

    8. TWRS privatization process technical baseline

      SciTech Connect (OSTI)

      Orme, R.M.

      1996-09-13

      The U.S. Department of Energy (DOE) is planning a two-phased program for the remediation of Hanford tank waste. Phase 1 is a pilot program to demonstrate the procurement of treatment services. The volume of waste treated during the Phase 1 is a small percentage of the tank waste. During Phase 2, DOE intends to procure treatment services for the balance of the waste. The TWRS Privatization Process Technical Baseline (PPTB) provides a summary level flowsheet/mass balance of tank waste treatment operations which is consistent with the tank inventory information, waste feed staging studies, and privatization guidelines currently available. The PPTB will be revised periodically as privatized processing concepts are crystallized.

    9. Computer

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      simulation of three-dimensional heavy ion beam trajectory imaging techniques used for magnetic field estimation C. Ling, K. A. Connor, D. R. Demers, R. J. Radke, and P. M. Schoch, ECSE Department, Rensselaer Polytechnic Institute, Troy, New York, 12180, USA (Received 28 August 2007; accepted 6 October 2007; published online 26 November 2007) A magnetic field mapping technique via heavy ion beam trajectory imaging is being developed on the Madison Symmetric Torus reversed field pinch. This

    10. Direct coal liquefaction baseline design and system analysis. Quarterly report, April--June 1991

      SciTech Connect (OSTI)

      Not Available

      1991-07-01

      The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

    11. Toward developing a computational capability for PEM fuel cell design and optimization.

      SciTech Connect (OSTI)

      Wang, Chao Yang; Luo, Gang; Jiang, Fangming; Carnes, Brian; Chen, Ken Shuang

      2010-05-01

      In this paper, we report the progress made in our project recently funded by the US Department of Energy (DOE) toward developing a computational capability, which includes a two-phase, three-dimensional PEM (polymer electrolyte membrane) fuel cell model and its coupling with DAKOTA (a design and optimization toolkit developed and being enhanced by Sandia National Laboratories). We first present a brief literature survey in which the prominent/notable PEM fuel cell models developed by various researchers or groups are reviewed. Next, we describe the two-phase, three-dimensional PEM fuel cell model being developed, tested, and later validated by experimental data. Results from case studies are presented to illustrate the utility of our comprehensive, integrated cell model. The coupling between the PEM fuel cell model and DAKOTA is briefly discussed. Our efforts in this DOE-funded project are focused on developing a validated computational capability that can be employed for PEM fuel cell design and optimization.
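      Couplings of this kind commonly treat the cell model as a black box behind a file-based interface: the optimization toolkit writes a parameters file, invokes an analysis driver, and reads back a results file. The sketch below shows such a driver under stated assumptions: cell_voltage() is an invented surrogate, and the loose value-name parsing stands in for the toolkit's exact file format, which varies by toolkit and version:

          import sys

          # Hypothetical black-box analysis driver: the toolkit writes a
          # parameters file, runs this script, then reads the results file.
          # Both the surrogate model and the parsing are invented stand-ins.

          def cell_voltage(current_density, membrane_thickness):
              # Invented surrogate: open-circuit voltage minus a crude ohmic loss.
              return 1.0 - 0.5 * current_density * membrane_thickness / 25e-6

          def main(params_path, results_path):
              params = {}
              with open(params_path) as f:
                  for line in f:
                      parts = line.split()
                      if len(parts) == 2:
                          try:
                              params[parts[1]] = float(parts[0])
                          except ValueError:
                              continue  # skip non-numeric header lines
              volts = cell_voltage(params["current_density"],
                                   params["membrane_thickness"])
              with open(results_path, "w") as f:
                  f.write(f"{-volts} obj_fn\n")  # minimize -V to maximize voltage

          if __name__ == "__main__":
              main(sys.argv[1], sys.argv[2])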

    12. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

      SciTech Connect (OSTI)

      Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

      2012-12-01

      The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.
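      The modular idea can be suggested with a toy example: cell-level submodels implement a common interface, so the pack-level code is unchanged when a finer submodel is swapped in. All class names and physics below are illustrative stand-ins, not the CAEBAT/MSMD API:

          # Toy modular cell/pack structure; names and physics are invented.
          class CellModel:
              def step(self, current_amps, dt_s):
                  raise NotImplementedError

          class SimpleThermalCell(CellModel):
              def __init__(self, resistance_ohm=0.002, heat_capacity_j_per_k=800.0):
                  self.r = resistance_ohm
                  self.c = heat_capacity_j_per_k
                  self.temp_c = 25.0
              def step(self, current_amps, dt_s):
                  # Ohmic heating only; a finer electrochemical submodel could
                  # be swapped in here without touching the pack-level code.
                  self.temp_c += (current_amps ** 2) * self.r * dt_s / self.c
                  return self.temp_c

          class Pack:
              def __init__(self, cells):
                  self.cells = cells
              def step(self, pack_current_amps, dt_s):
                  # Series string: the same current flows through every cell.
                  return [c.step(pack_current_amps, dt_s) for c in self.cells]

          pack = Pack([SimpleThermalCell() for _ in range(4)])
          for _ in range(600):                 # 10 minutes at 1 s steps
              temps = pack.step(pack_current_amps=50.0, dt_s=1.0)
          print([f"{t:.1f}" for t in temps])   # ~28.8 C after the discharge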

    13. High-Performance Computing for Alloy Development | netl.doe.gov

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      High-Performance Computing for Alloy Development Tomorrow's fossil-fuel based power plants will achieve higher efficiencies by operating at higher pressures and temperatures and under harsher and more corrosive conditions. Unfortunately, conventional metals simply cannot withstand these extreme environments, so advanced alloys must be designed and fabricated to meet the needs of these advanced systems. The properties of metal alloys, which are mixtures of metallic elements,

    14. Global Nuclear Energy Partnership Waste Treatment Baseline

      SciTech Connect (OSTI)

      Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

      2008-05-01

      The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

    15. Baselines for Greenhouse Gas Reductions: Problems, Precedents...

      Open Energy Info (EERE)

      Baseline projection, GHG inventory, Pathways analysis Resource Type: Publications, Lessons learned/best practices Website: www.p2pays.orgref2221739.pdf References:...

    16. Tank waste remediation systems technical baseline database

      SciTech Connect (OSTI)

      Porter, P.E.

      1996-10-16

      This document includes a cassette tape that contains Hanford generated data for the Tank Waste Remediation Systems Technical Baseline Database as of October 09, 1996.

    17. ARM: Baseline Solar Radiation Network (BSRN): solar irradiances...

      Office of Scientific and Technical Information (OSTI)

      Title: ARM: Baseline Solar Radiation Network (BSRN): solar irradiances ...

    18. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE) ...

    19. LEDSGP/Transportation Toolkit/Key Actions/Create a Baseline ...

      Open Energy Info (EERE)

      a Baseline) Transportation Toolkit Key Actions for Low-Emission Development in Transportation...

    20. TWRS technical baseline database manager definition document

      SciTech Connect (OSTI)

      Acree, C.D.

      1997-08-13

      This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

    1. Management of the baseline shift using a new and simple method for respiratory-gated radiation therapy: Detectability and effectiveness of a flexible monitoring system

      SciTech Connect (OSTI)

      Tachibana, Hidenobu; Kitamura, Nozomi; Ito, Yasushi; Kawai, Daisuke; Nakajima, Masaru; Tsuda, Akihisa; Shiizuka, Hisao

      2011-07-15

      Purpose: In respiratory-gated radiation therapy, a baseline shift decreases the accuracy of target coverage and organs at risk (OAR) sparing. The effectiveness of audio-feedback and audio-visual feedback in correcting the baseline shift in the breathing pattern of the patient has been demonstrated previously. However, the baseline shift derived from the intrafraction motion of the patient's body cannot be corrected by these methods. In the present study, the authors designed and developed a simple and flexible system. Methods: The system consisted of a web camera and a computer running our in-house software. The in-house software was adapted to template matching and also to no preimage processing. The system was capable of monitoring the baseline shift in the intrafraction motion of the patient's body. Another marker box was used to monitor the baseline shift due to the flexible setups required of a marker box for gated signals. The system accuracy was evaluated by employing a respiratory motion phantom and was found to be within AAPM Task Group 142 tolerance (positional accuracy <2 mm and temporal accuracy <100 ms) for respiratory-gated radiation therapy. Additionally, the effectiveness of this flexible and independent system in gated treatment was investigated in healthy volunteers, in terms of the results from the differences in the baseline shift detectable between the marker positions, which the authors evaluated statistically. Results: The movement of the marker on the sternum [1.599 ± 0.622 mm (1 SD)] was substantially decreased as compared with the abdomen [6.547 ± 0.962 mm (1 SD)]. Additionally, in all of the volunteers, the baseline shifts for the sternum [-0.136 ± 0.868 (2 SD)] were in better agreement with the nominal baseline shifts than was the case for the abdomen [-0.722 ± 1.56 mm (2 SD)]. The baseline shifts could be accurately measured and detected using the monitoring system, which could acquire the movement of the marker on the
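      Marker tracking by template matching, of the kind the in-house software performs, can be sketched in a few lines with OpenCV. The crop window, match threshold, and pixel tolerance below are illustrative placeholders; a clinical system would also calibrate pixels to millimeters and meet the timing tolerances cited above:

          import cv2

          # Track a marker by template matching and flag a baseline shift.
          # The crop window, threshold, and tolerance are illustrative only.
          cap = cv2.VideoCapture(0)                  # web camera
          ok, frame = cap.read()
          if not ok:
              raise RuntimeError("camera not available")
          template = frame[200:260, 300:380].copy()  # one-time crop around marker
          baseline_y = None

          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
              _, score, _, max_loc = cv2.minMaxLoc(result)
              if score < 0.7:
                  continue                           # marker not matched this frame
              y = max_loc[1]
              if baseline_y is None:
                  baseline_y = y                     # reference (baseline) position
              elif abs(y - baseline_y) > 10:         # illustrative pixel tolerance
                  print(f"baseline shift detected: {y - baseline_y:+} px")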

    2. Computing Videos

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Videos

    3. Development of a three-phase reacting flow computer model for analysis of petroleum cracking

      SciTech Connect (OSTI)

      Chang, S.L.; Lottes, S.A.; Petrick, M.

      1995-07-01

      A general computational fluid dynamics computer code (ICRKFLO) has been developed for the simulation of the multi-phase reacting flow in a petroleum fluid catalytic cracker riser. ICRKFLO has several unique features. A new integral reaction submodel couples calculations of hydrodynamics and cracking kinetics by making the calculations more efficient in achieving stable convergence while still preserving the major physical effects of reaction processes. A new coke transport submodel handles the process of coke formation in gas phase reactions and the subsequent deposition on the surface of adjacent particles. The code was validated by comparing with experimental results of a pilot scale fluid cracker unit. The code can predict the flow characteristics of gas, liquid, and particulate solid phases, vaporization of the oil droplets, and subsequent cracking of the oil in a riser reactor, which may lead to a better understanding of the internal processes of the riser and the impact of riser geometry and operating parameters on the riser performance.
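      The coupling idea, iterating hydrodynamics and cracking kinetics to a self-consistent state, can be shown with a toy fixed-point loop. Both one-line "models" below are invented stand-ins, not ICRKFLO's submodels; only the iterative structure is the point:

          import math

          # Toy coupled-solution loop: a stand-in "hydrodynamics" update (gas
          # made by cracking speeds the flow, shortening residence time)
          # alternates with a stand-in first-order cracking update until the
          # coupled solution stops changing.

          def residence_time(oil_fraction):
              return 2.0 / (2.0 - oil_fraction)   # seconds; invented relation

          def unconverted_oil(tau, k=0.8):
              return math.exp(-k * tau)           # first-order cracking

          oil = 1.0
          for iteration in range(100):
              tau = residence_time(oil)
              new_oil = unconverted_oil(tau)
              if abs(new_oil - oil) < 1e-9:
                  break
              oil = new_oil

          print(f"converged in {iteration} iterations: "
                f"oil fraction {oil:.4f} at tau = {tau:.3f} s")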

    4. Hanford Site technical baseline database. Revision 1

      SciTech Connect (OSTI)

      Porter, P.E.

      1995-01-27

      This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

    5. Integrated Baseline System (IBS) Version 1.03: Models guide

      SciTech Connect (OSTI)

      Not Available

      1993-01-01

      The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. This document provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

    6. Development of a computer wellbore simulator for coiled-tube operations

      SciTech Connect (OSTI)

      Gu, H.; Walton, I.C.; Dowell, S.

      1994-12-31

      This paper describes a computer wellbore simulator developed for coiled tubing operations of fill cleanout and unloading of oil and gas wells. The simulator models the transient, multiphase fluid flow and mass transport process that occur in these operations. Unique features of the simulator include a sand bed that may form during fill cleanout in deviated and horizontal wells, particle transport with multiphase compressible fluids, and the transient unloading process of oil and gas wells. The requirements for a computer wellbore simulator for coiled tubing operations are discussed and it is demonstrated that the developed simulator is suitable for modeling these operations. The simulator structure and the incorporation of submodules for gas/liquid two-phase flow, reservoir and choke models, and coiled tubing movement are addressed. Simulation examples are presented to show the sand bed formed in cleanout in a deviated well and the transient unloading results of oil and gas wells. The wellbore simulator developed in this work can assist a field engineer with the design of coiled tubing operations. By using the simulator to predict the pressure, flow rates, sand concentration and bed depth, the engineer will be able to select the coiled tubing, fluid and schedule of an optimum design for particular well and reservoir conditions.

    7. GridPACK Toolkit for Developing Power Grid Simulations on High Performance Computing Platforms

      SciTech Connect (OSTI)

      Palmer, Bruce J.; Perkins, William A.; Glass, Kevin A.; Chen, Yousu; Jin, Shuangshuang; Callahan, Charles D.

      2013-11-30

      This paper describes the GridPACK framework, which is designed to help power grid engineers develop modeling software capable of running on today's high performance computers. The framework contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors, using parallel linear and non-linear solvers to solve algebraic equations, and mapping functionality that creates matrices and vectors from properties of the network. The framework also contains functionality to support I/O and to manage errors.
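      The mapping concept, building matrices from properties of the network, can be illustrated independently of the framework: the toy three-bus example below assembles a network admittance (Y-bus) matrix from per-branch impedances. The numbers are invented, and this is plain Python rather than GridPACK's parallel C++ API:

          # Build a Y-bus matrix from per-branch properties (toy 3-bus network).
          branches = [
              # (from_bus, to_bus, series impedance in per-unit)
              (0, 1, 0.01 + 0.05j),
              (1, 2, 0.02 + 0.06j),
              (0, 2, 0.015 + 0.04j),
          ]

          n = 3
          ybus = [[0j] * n for _ in range(n)]
          for i, j, z in branches:
              y = 1 / z                 # branch admittance
              ybus[i][i] += y           # self terms on the diagonal
              ybus[j][j] += y
              ybus[i][j] -= y           # mutual terms off the diagonal
              ybus[j][i] -= y

          for row in ybus:
              print("  ".join(f"{v.real:7.2f}{v.imag:+7.2f}j" for v in row))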

    8. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

      SciTech Connect (OSTI)

      Sottille, Matthew

      2013-09-12

      This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

    9. Development of high performance scientific components for interoperability of computing packages

      SciTech Connect (OSTI)

      Gulabani, Teena Pratap

      2008-12-01

      Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Chemistry algorithms are difficult and time-consuming to develop; integration of large quantum chemistry packages allows resource sharing and thus avoids reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

    10. Revised SRC-I project baseline. Volume 1

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

    11. Spent nuclear fuel technical baseline description, Fiscal Year 1996: Volume II, supporting data

      SciTech Connect (OSTI)

      Womack, J.C.

      1995-11-01

      The Technical Baseline Description documents the Project-Level functions and requirements, along with associated enabling assumptions, issues, trade studies, interfaces, and products. It is a snapshot in time of the baseline at the beginning of September 1995. It supports the individual subprojects in the development of lower-tier functions, requirements, and specifications in FY 1996. It also supports the need for Hanford site planning to be based on an integrated Hanford site systems engineering technical baseline, and is traceable to that baseline. This document replaces and supersedes WHC-SD-SNF-SD-003.

    12. TWRS privatization phase I - site characterization and environmental baseline work plan

      SciTech Connect (OSTI)

      Reidel, S.P.; Hodges, F.N., Westinghouse Hanford

      1996-08-27

      This work plan defines the steps necessary to develop a Site Characterization Plan and Environmental Baseline for the TWRS Privatization Phase I area. The Data Quality Objectives Process will be the primary tool used to develop these plans.

    13. Solid Waste Program technical baseline description

      SciTech Connect (OSTI)

      Carlson, A.B.

      1994-07-01

      The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

    14. Waste management project technical baseline description

      SciTech Connect (OSTI)

      Sederburg, J.P.

      1997-08-13

      A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

    15. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

      SciTech Connect (OSTI)

      Karbach, Carsten; Frings, Wolfgang

      2013-02-20

      This document is the final scientific report of the project DE-SC000120 (A scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to work for high latency. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide a transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as abstraction layer between parallel application developer and compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. E.g. applications are built remotely and performance tools are attached to job submissions and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real-time. The client server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. A set of statistics, a list of running and queued jobs as well as a node display mapping running jobs to their compute resources form
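      A minimal sketch of the client side of such a client-server monitoring design is given below. The JSON status message, port, and host name are hypothetical; the actual PTP/LLview protocol differs and, as noted above, must tolerate high latency (a real deployment would also typically tunnel over SSH):

          import json, socket, time

          # Poll a (hypothetical) remote monitoring daemon for status data.
          def fetch_status(host, port=9099, timeout_s=10.0):
              with socket.create_connection((host, port), timeout=timeout_s) as sock:
                  sock.sendall(b'{"request": "status"}\n')
                  raw = sock.makefile().readline()
              # e.g. {"running": 412, "queued": 87, "nodes_up": 9216}
              return json.loads(raw)

          while True:
              try:
                  status = fetch_status("login.example-supercomputer.org")
                  print(f"running={status['running']} queued={status['queued']}")
              except OSError as err:
                  print(f"monitoring update failed: {err}")
              time.sleep(60)   # poll at a rate the target system can tolerate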

    16. India-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Danish Government Baseline Workstream Name India-Danish Government Baseline Workstream Agency/Company/Organization Danish Government Partner Danish...

    17. Baseline and Target Values for PV Forecasts: Toward Improved...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting. Jie ...

    18. EA-1943: Construction and Operation of the Long Baseline Neutrino...

      Energy Savers [EERE]

      EA-1943: Construction and Operation of the Long Baseline Neutrino Facility and Deep Underground ...

    19. South Africa-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: South Africa-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish Ministry for...

    20. Brazil-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Brazil-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    1. China-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: China-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    2. NREL: Energy Analysis - Annual Technology Baseline and Standard...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Annual Technology Baseline and Standard Scenarios - Legacy Versions This section contains earlier versions of NREL's Annual Technology Baseline and Standard Scenarios products. ...

    3. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

      Open Energy Info (EERE)

      Tool Summary. Name: UNFCCC-Consolidated baseline and monitoring methodology for landfill gas project activities...

    4. Indonesia-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: Indonesia-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government...

    5. Mexico-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Mexico-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    6. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and...

    7. TWRS phase I privatization site environmental baseline and characterization plan

      SciTech Connect (OSTI)

      Shade, J.W.

      1997-09-01

      This document provides a plan to characterize and develop an environmental baseline for the TWRS Phase I Privatization Site before construction begins. A site evaluation study selected the former Grout Disposal Area of the Grout Treatment Facility (GTF) in the 200 East Area as the TWRS Phase I Demonstration Site. The site is generally clean and has not been used for activities other than the GTF. A DQO process was used to develop a Sampling and Analysis Plan that allows site conditions during operations and after Phase I ends to be compared with presently existing conditions, and that provides data for the development of a preoperational monitoring plan.

    8. Baseline Microstructural Characterization of Outer 3013 Containers

      SciTech Connect (OSTI)

      Zapp, Phillip E.; Dunn, Kerry A

      2005-07-31

      Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

    9. The Fermilab long-baseline neutrino program

      SciTech Connect (OSTI)

      Goodman, M.; MINOS Collaboration

      1997-10-01

      Fermilab is embarking upon a neutrino oscillation program which includes a long-baseline neutrino experiment, MINOS. MINOS will be a 10 kiloton detector located 730 km northwest of Fermilab in the Soudan underground laboratory. It will be sensitive to neutrino oscillations with parameters above $\Delta m^2 \approx 3 \times 10^{-3}\,\mathrm{eV}^2$ and $\sin^2(2\theta) \approx 0.02$.
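
      For orientation (this is standard two-flavor oscillation physics, not part of the record above), the two quoted parameters enter through the oscillation probability

        $$P(\nu_\mu \to \nu_\tau) = \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV^2}]\,L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)$$

      With $L = 730$ km and $\Delta m^2 = 3\times10^{-3}\,\mathrm{eV^2}$, the first oscillation maximum falls near $E \approx 1.8$ GeV, which sets the relevant neutrino beam energy scale.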

    10. Systematic errors in long baseline oscillation experiments

      SciTech Connect (OSTI)

      Harris, Deborah A.; /Fermilab

      2006-02-01

      This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

    11. Module 7 - Integrated Baseline Review and Change Control | Department of

      Office of Environmental Management (EM)

      This module focuses on integrated baseline reviews (IBR) and change control. It outlines the objective and responsibility of an integrated baseline review. Additionally, this module discusses the change control process required for implementing earned value.

    12. Long-Baseline Neutrino Experiment (LBNE) Conceptual Design Report: The LBNE Water Cherenkov Detector, April 13, 2012

      SciTech Connect (OSTI)

      Kettell, S.H.; Bishai, M.; Brown, R.; Chen, H.; Diwan, M.; Dolph, J.; Geronimo, G.; Gill, R.; Hackenburg, R.; Hahn, R.; Hans, S.; Isvan, Z.; Jaffe, D.; Junnarkar, S.; Lanni, F.; Li, Y.; Ling, J.; Littenberg, L.; Makowiecki, D.; Marciano, W.; Morse, W.; Parsa, Z.; Radeka, V.; Rescia, S.; Samios, N.; Sharma, R.; Simos, N.; Sondericker, J.; Stewart, J.; Tanaka, H.; Themann, H.; Thorn, C.; Viren, B.; White, S.; Worcester, E.; Yeh, M.; Yu, B.; Zhang, C.

      2012-04-13

      Conceptual Design Report (CDR) developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long-Baseline Neutrino Experiment (LBNE).

    13. Process Simulation Role in the Development of New Alloys Based on Integrated Computational Material Science and Engineering

      SciTech Connect (OSTI)

      Sabau, Adrian S; Porter, Wallace D; Roy, Shibayan; Shyam, Amit

      2014-01-01

      To accelerate the introduction of new materials and components, the development of metal casting processes requires teaming among different disciplines, as multiple physical phenomena have to be considered simultaneously in process design and in the optimization of mechanical properties. The required models for these physical phenomena, as well as their validation status for metal casting, are reviewed. Data on materials properties, model validation, and the microstructural features relevant to materials properties are highlighted. One vehicle for accelerating the development of new materials is combined experimental-computational effort. Integrated computational/experimental practices are reviewed, and strengths and weaknesses are identified with respect to metal casting processes. Specific examples are given from the knowledge base established at Oak Ridge National Laboratory and from computer models for predicting casting defects and microstructure distribution in aluminum alloy components.

    14. Integrated Baseline System (IBS) Version 1.03: Utilities guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

      1993-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

    15. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

      1994-03-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

    16. Baseline requirements of the proposed action for the Transportation Management Division routing models

      SciTech Connect (OSTI)

      Johnson, P.E.; Joy, D.S.

      1995-02-01

      The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations, particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements of the models and databases to expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report; the discussions pertaining to the different models are contained in separate sections.
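
      The HIGHWAY and INTERLINE implementations are not described in this record; at their core, route-prediction models of this kind run a shortest-path search over a weighted transportation network. The sketch below shows that core computation on an invented toy network; the node names and costs are illustrative only.

        # Illustrative only: a generic shortest-path routine of the kind that
        # underlies route-prediction models such as HIGHWAY and INTERLINE. The
        # graph and weights here are hypothetical toy data.
        import heapq

        def shortest_route(graph, origin, destination):
            """Dijkstra's algorithm; graph maps node -> list of (neighbor, cost)."""
            dist = {origin: 0.0}
            prev = {}
            heap = [(0.0, origin)]
            while heap:
                d, node = heapq.heappop(heap)
                if node == destination:
                    break
                if d > dist.get(node, float("inf")):
                    continue  # stale heap entry
                for nbr, cost in graph.get(node, []):
                    nd = d + cost
                    if nd < dist.get(nbr, float("inf")):
                        dist[nbr], prev[nbr] = nd, node
                        heapq.heappush(heap, (nd, nbr))
            # walk the predecessor chain back to the origin
            path, node = [destination], destination
            while node != origin:
                node = prev[node]
                path.append(node)
            return path[::-1], dist[destination]

        # toy network: distances in miles between interchanges
        net = {"ORNL": [("Knoxville", 25)], "Knoxville": [("Nashville", 180)],
               "Nashville": []}
        print(shortest_route(net, "ORNL", "Nashville"))

      Real routing models layer many refinements on this (road classes, regulatory routing constraints, transfer penalties), but the underlying graph search looks like the above.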

    17. ABB SCADA/EMS System INEEL Baseline Summary Test Report (November 2004) |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This document covers the security evaluation of the "baseline" or "as delivered" ABB SCADA/EMS system performed in the Idaho National Engineering and Environmental Laboratory (INEEL) SCADA test bed as part of the Critical Infrastructure Test Range Development Program, which is funded by the U.S. Department of Energy Office of...

    18. System design and algorithmic development for computational steering in distributed environments

      SciTech Connect (OSTI)

      Wu, Qishi; Zhu, Mengxia; Gu, Yi; Rao, Nageswara S

      2010-03-01

      Supporting visualization pipelines over wide-area networks is critical to enabling large-scale scientific applications that require visual feedback to interactively steer online computations. We propose a remote computational steering system that employs analytical models to estimate the cost of computing and communication components and optimizes the overall system performance in distributed environments with heterogeneous resources. We formulate and categorize the visualization pipeline configuration problems for maximum frame rate into three classes according to the constraints on node reuse or resource sharing, namely no, contiguous, and arbitrary reuse. We prove all three problems to be NP-complete and present heuristic approaches based on a dynamic programming strategy. The superior performance of the proposed solution is demonstrated with extensive simulation results in comparison with existing algorithms and is further evidenced by experimental results collected on a prototype implementation deployed over the Internet.
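
      The paper's exact formulation is not reproduced in this record. As a rough illustration of a dynamic-programming approach to the contiguous-reuse case, the sketch below partitions a linear pipeline of modules over an ordered chain of nodes so that the bottleneck stage time is minimized (and the frame rate maximized); the module costs, node speeds, and transfer times are invented inputs.

        # Sketch of a dynamic program for mapping a linear visualization pipeline
        # onto an ordered chain of nodes (the "contiguous reuse" flavor discussed
        # above). Module costs, node speeds, and transfer times are invented.
        from functools import lru_cache

        work = [4.0, 2.0, 6.0, 1.0]   # compute demand of each pipeline module
        speed = [2.0, 1.0, 4.0]       # processing speed of each node in the chain
        xfer = [0.5, 0.8]             # transfer time on the link from node i to i+1

        def max_frame_rate():
            m, n = len(work), len(speed)

            @lru_cache(maxsize=None)
            def best(i, j):
                """Minimal bottleneck time for modules i.. mapped onto nodes j.. ."""
                if j == n - 1:                 # last node takes all remaining modules
                    return sum(work[i:]) / speed[j]
                out = float("inf")
                for k in range(i + 1, m):      # node j runs modules i..k-1
                    stage = sum(work[i:k]) / speed[j] + xfer[j]
                    out = min(out, max(stage, best(k, j + 1)))
                return out

            # the steady-state frame rate is the reciprocal of the bottleneck time
            return 1.0 / best(0, 0)

        print(f"estimated frame rate: {max_frame_rate():.3f} frames per unit time")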

    19. Integrated Baseline System (IBS). Version 1.03, System Management Guide

      SciTech Connect (OSTI)

      Williams, J.R.; Bailey, S.; Bower, J.C.

      1993-01-01

      This IBS System Management Guide explains how to install or upgrade the Integrated Baseline System (IBS) software package. The IBS is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This guide includes detailed instructions for installing the IBS software package on a Digital Equipment Corporation (DEC) VAX computer from the IBS distribution tapes. The installation instructions include procedures for both first-time installations and upgrades to existing IBS installations. To ensure that the system manager has the background necessary for successful installation of the IBS package, this guide also includes information on IBS computer requirements, software organization, and the generation of IBS distribution tapes. When special utility programs are used during IBS installation and setups, this guide refers you to the IBS Utilities Guide for specific instructions. This guide also refers you to the IBS Data Management Guide for detailed descriptions of some IBS data files and structures. Any special requirements for installation are not documented here but should be included in a set of installation notes that come with the distribution tapes.

    20. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1993

      SciTech Connect (OSTI)

      1993-12-31

      The objectives of this study are to: develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology (the baseline design uses Illinois No. 6 Eastern coal and conventional refining; there is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin); prepare the capital and operating costs for the baseline design and the alternatives (individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case); and develop a process flowsheet simulation (PFS) model. During the period of this report, a Topical Report summarizing the Baseline Case design was drafted and issued to DOE/PETC for review and release approval. Major effort was spent on the Alternate Upgrading and Refining Case: its design specifications were finalized, and material and utility balances were completed. Initial capital cost estimates were developed. A Topical Report summarizing the Alternative (ZSM-5) Upgrading and Refining Case design is being drafted. Under Task 4, some of the individual plant models were expanded and enhanced. An overall ASPEN/SP process simulation model was developed for the Baseline Design Case by combining the individual models of Areas 100, 200 and 300. In addition, a separate model for the simplified product refining area, Area 300, of the Alternate Upgrading and Refining Case was developed. Under Task 7, cost and schedule control was the primary activity. A technical paper entitled "Baseline Design/Economics for Advanced Fischer-Tropsch Technology" was presented at DOE/PETC's Annual Contractors Review Conference, held at Pittsburgh, Pennsylvania, on September 27-29, 1993. A contract amendment was submitted to include the Kerr McGee ROSE unit in the Baseline design case and to convert the PFS models from the ASPEN/SP to the ASPEN/Plus software code.

    1. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

      SciTech Connect (OSTI)

      Deru, M.

      2011-02-01

      This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

    2. Grocery 2009 TSD Miami Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Miami Baseline. Building Type: Food Sales. Model Type: Baseline Model. Target Type: ASHRAE 90.1-2004. Model Year: 2009. IDF file...

    3. Grocery 2009 TSD Chicago Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Chicago Baseline. Building Type: Food Sales. Model Type: Baseline Model. Target Type: ASHRAE 90.1-2004. Model Year: 2009. IDF file...

    4. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-10-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.

    5. Moving baseline for evaluation of advanced coal-extraction systems

      SciTech Connect (OSTI)

      Bickerton, C.R.; Westerfield, M.D.

      1981-04-15

      This document reports results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000. Systems used in this study were selected from contemporary coal mining technology and from conservative conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness. To be more beneficial to the program, the effort should be extended to other seam thicknesses. This document is one of a series which describe systems level requirements for advanced underground coal mining equipment. Five areas of performance are discussed: production cost, miner safety, miner health, environmental impact, and recovery efficiency. The projections for cost and production capability comprise a so-called moving baseline which will be used to assess compliance with the systems requirement for production cost. Separate projections were prepared for room and pillar, longwall, and shortwall technology all operating under comparable sets of mining conditions. This work is part of an effort to define and develop innovative coal extraction systems suitable for the significant resources remaining in the year 2000.

    6. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-01-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines (HATTs). First, an HATT blade was designed using the blade element momentum method in conjunction with a genetic optimization algorithm. Several unstructured computational grids were generated using this blade geometry and steady CFD simulations were used to perform a grid resolution study. Transient simulations were then performed to determine the effect of time-dependent flow phenomena and the size of the computational timestep on the numerical solution. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.
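
      The report's grid-study numbers are not reproduced in this record. As an illustration of how a grid-resolution study of this kind is commonly quantified, the sketch below applies Richardson extrapolation and a grid convergence index (GCI) to three invented force values from coarse, medium, and fine grids.

        # Illustration of quantifying a grid-resolution study via Richardson
        # extrapolation and the grid convergence index (GCI). The three force
        # values below are invented for the example, not taken from the report.
        import math

        f_coarse, f_medium, f_fine = 950.0, 1005.0, 1020.0  # N, hypothetical
        r = 2.0    # uniform grid refinement ratio
        Fs = 1.25  # safety factor commonly used for three-grid studies

        # observed order of accuracy from the three solutions
        p = math.log((f_medium - f_coarse) / (f_fine - f_medium)) / math.log(r)

        # Richardson-extrapolated (grid-independent) estimate
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)

        # GCI on the fine grid: a conservative relative error band
        gci_fine = Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)

        print(f"observed order p = {p:.2f}")            # ~1.87 for these inputs
        print(f"extrapolated value = {f_exact:.1f}")    # ~1025.6
        print(f"fine-grid GCI = {100 * gci_fine:.2f}%") # ~0.69%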

    7. Scope Management Baseline Development (FPM 208), Idaho | Department...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This is a Level 2 core course for certification in the Project Management Career ... credit. Course Format: 3-day instructor-led classroom delivery. Equivalent Courses: None.

    8. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

      SciTech Connect (OSTI)

      M.F. Simpson; K.-R. Kim

      2010-12-01

      In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL), and the Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in troubleshooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.

    9. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1994

      SciTech Connect (OSTI)

      1994-12-31

      The objectives of the study are to: develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology (the baseline design uses Illinois No. 6 Eastern coal and conventional refining; there is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin); prepare the capital and operating costs for the baseline design and the alternatives (individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case); and develop a process flowsheet simulation (PFS) model. During the reporting period, work progressed on Tasks 1, 2, 4, 6 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary; Task 1, Baseline Design and Alternatives; Task 2, Evaluate Baseline and Alternative Economics; Task 4, Process Flowsheet Simulation (PFS) Model; and Task 6, Document the PFS Model and Develop a DOE Training Session on Its Use, plus the Project Management and Staffing Report.

    10. Climate-Science Computational Development Team: The Climate End Station II

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Argonne Leadership Computing Facility. Model: Community Atmosphere Model (CAM5) with spectral element (SE) dynamics at 1/8th degree resolution. Physics options include full prognostic aerosols. Fixed annual-cycle sea surface temperatures and sea ice extent, interactive land surface (CLM). Run within the CESM1.0 coupled system. INCITE PI: Warren Washington, National Center for Atmospheric Research. Setup and Integration: Mark Taylor, Sandia National Laboratory. Visualization: Joseph A. ...

    11. Climate-Science Computational Development Team: The Climate End Station II

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Argonne Leadership Computing Facility. Model: Community Atmosphere Model (CAM5) with spectral element (SE) dynamics at 1/8th degree resolution. Physics options include full prognostic aerosols. Fixed annual-cycle sea surface temperatures and sea ice extent, interactive land surface (CLM). Run within the CESM1.0 coupled system. INCITE PI: Warren Washington, National Center for Atmospheric Research. Setup and Integration: Mark Taylor, Sandia National Laboratory. Visualization: Joseph ...

    12. C-018H Pre-Operational Baseline Sampling Plan

      SciTech Connect (OSTI)

      Guzek, S.J.

      1993-08-20

      The objective of this task is to field-characterize and sample the soil at selected locations along the proposed effluent line routes for Project C-018H. The overall purpose of this effort is to meet the proposed plan to discontinue the disposal of contaminated liquids into the Hanford soil column, as described by DOE (1987). Detailed information describing the proposed transport pipeline route, and the associated Kaiser Engineers Hanford Company (KEH) preliminary drawings (H288746...755) inclusive, has been prepared by KEH (1992). The information developed from field monitoring and sampling will be used to characterize surface and subsurface soil along the proposed C-018H effluent pipeline and its associated facilities. Because existing contamination may be encountered, soil characterization will provide a preoperational baseline reference for construction, support development of personnel safety requirements, and determine the need for any changes to the proposed routes prior to construction of the pipeline.

    13. LTC vacuum blasting machine (concrete): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. The machine incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the demonstration took place outdoors, which may make the results unrepresentative; it is plausible that dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues identified were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    14. LTC vacuum blasting machine (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with the evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consists of several hand tools, a Roto Peen scaler, and a needle gun, designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as the coating is removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place; it is plausible that dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    15. Pentek metal coating removal system: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consists of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®, designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters, while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as the coating is removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place; it is plausible that dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    16. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, X.

      1996-12-17

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

    17. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, Xucheng

      1996-01-01

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.
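
      A digital analogue of the scheme described in both patent records, sampling the baseline as the gate opens and integrating the offset-corrected signal only inside the gate, can be sketched as follows; the waveform and gate times are invented for the demonstration.

        # Digital sketch of gated integration with baseline subtraction, mirroring
        # the analog scheme described above: hold the DC offset sampled at gate
        # opening, then integrate (signal - offset) only while the gate is active.
        import numpy as np

        def gated_integral(signal, t, gate_start, gate_stop):
            """Integrate signal over [gate_start, gate_stop) minus the held baseline."""
            dt = t[1] - t[0]
            i0 = np.searchsorted(t, gate_start)
            i1 = np.searchsorted(t, gate_stop)
            baseline = signal[i0]   # sample-and-hold at the start of the gate
            return np.sum(signal[i0:i1] - baseline) * dt

        # demo: a Gaussian pulse riding on a 0.2 V DC offset
        t = np.linspace(0.0, 1e-6, 1001)              # 1 microsecond window
        pulse = np.exp(-((t - 5e-7) / 5e-8) ** 2)     # 1 V peak at 500 ns
        signal = pulse + 0.2                          # add DC offset
        area = gated_integral(signal, t, 3e-7, 7e-7)
        print(f"baseline-corrected area = {area:.3e} V*s")

      Without the held baseline sample, the 0.2 V offset would contribute a spurious area proportional to the gate width, which is exactly the error the patents' sample-and-hold eliminates.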

    18. Arc melter demonstration baseline test results

      SciTech Connect (OSTI)

      Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O`Connor, W.K.; Turner, P.C.

      1994-07-01

      This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration program (BWID). Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites, are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

    19. Baseline air quality study at Fermilab

      SciTech Connect (OSTI)

      Dave, M.J.; Charboneau, R.

      1980-10-01

      Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies are also presented.

    20. MP Salsa: a finite element computer program for reacting flow problems. Part 1--theoretical development

      SciTech Connect (OSTI)

      Shadid, J.N.; Moffat, H.K.; Hutchinson, S.A.; Hennigan, G.L.; Devine, K.D.; Salinger, A.G.

      1996-05-01

      The theoretical background for the finite element computer program MPSalsa is presented in detail. MPSalsa is designed to solve laminar, low Mach number, two- or three-dimensional incompressible and variable density reacting fluid flows on massively parallel computers, using a Petrov-Galerkin finite element formulation. The code has the capability to solve coupled fluid flow, heat transport, multicomponent species transport, and finite-rate chemical reactions, and to solve coupled multiple Poisson or advection-diffusion-reaction equations. The program employs the CHEMKIN library to provide a rigorous treatment of multicomponent ideal gas kinetics and transport. Chemical reactions occurring in the gas phase and on surfaces are treated by calls to CHEMKIN and SURFACE CHEMKIN, respectively. The code employs unstructured meshes, using the EXODUS II finite element database suite of programs for its input and output files. MPSalsa solves both transient and steady flows by using fully implicit time integration, an inexact Newton method, and iterative solvers based on preconditioned Krylov methods as implemented in the Aztec solver library.
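
      MPSalsa's actual solver stack (inexact Newton with preconditioned Krylov iterations from Aztec) is far richer than can be shown here. As a minimal sketch of the fully implicit time integration pattern the abstract describes, the code below applies backward Euler with a Newton iteration per step to an invented two-species reaction system, with a dense solve standing in for the Krylov machinery.

        # Minimal illustration of fully implicit (backward Euler) time stepping
        # with a Newton iteration at each step. The 2-species reaction system is
        # invented; a dense linear solve replaces the preconditioned Krylov solver.
        import numpy as np

        def rhs(u):
            a, b = u
            return np.array([-10.0 * a * b, 10.0 * a * b - 0.5 * b])

        def jac(u):
            a, b = u
            return np.array([[-10.0 * b, -10.0 * a],
                             [ 10.0 * b,  10.0 * a - 0.5]])

        def backward_euler_step(u_old, dt, tol=1e-10, max_newton=20):
            u = u_old.copy()                    # initial Newton guess
            for _ in range(max_newton):
                residual = u - u_old - dt * rhs(u)
                if np.linalg.norm(residual) < tol:
                    break
                J = np.eye(2) - dt * jac(u)     # Jacobian of the residual
                u = u - np.linalg.solve(J, residual)
            return u

        u = np.array([1.0, 0.1])
        for step in range(100):
            u = backward_euler_step(u, dt=0.05)
        print("state after 100 implicit steps:", u)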

    1. NEW DEVELOPMENTS ON INVERSE POLYGON MAPPING TO CALCULATE GRAVITATIONAL LENSING MAGNIFICATION MAPS: OPTIMIZED COMPUTATIONS

      SciTech Connect (OSTI)

      Mediavilla, E.; Lopez, P.; Gonzalez-Morcillo, C.; Jimenez-Vicente, J.

      2011-11-01

      We derive an exact solution (in the form of a series expansion) to compute gravitational lensing magnification maps. It is based on the backward gravitational lens mapping of a partition of the image plane in polygonal cells (inverse polygon mapping, IPM), not including critical points (except perhaps at the cell boundaries). The zeroth-order term of the series expansion leads to the method described by Mediavilla et al. The first-order term is used to study the error induced by the truncation of the series at zeroth order, explaining the high accuracy of the IPM even at this low order of approximation. Interpreting the Inverse Ray Shooting (IRS) method in terms of IPM, we explain the previously reported $N^{-3/4}$ dependence of the IRS error with the number of collected rays per pixel. Cells intersected by critical curves (critical cells) transform to non-simply connected regions with topological pathologies like auto-overlapping or non-preservation of the boundary under the transformation. To define a non-critical partition, we use a linear approximation of the critical curve to divide each critical cell into two non-critical subcells. The optimal choice of the cell size depends basically on the curvature of the critical curves. For typical applications in which the pixel of the magnification map is a small fraction of the Einstein radius, a one-to-one relationship between the cell and pixel sizes in the absence of lensing guarantees both the consistence of the method and a very high accuracy. This prescription is simple but very conservative. We show that substantially larger cells can be used to obtain magnification maps with huge savings in computation time.
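
      As a point of reference for the IPM discussion, plain inverse ray shooting is easy to sketch: map a regular grid of image-plane rays through the point-mass lens equation and count arrivals per source-plane pixel, with counts proportional to magnification. The lens positions, masses, and grid sizes below are invented; IPM replaces the rays with mapped polygonal cells.

        # Sketch of plain inverse ray shooting (IRS), the method whose N^(-3/4)
        # error scaling the paper reinterprets. Positions are complex numbers in
        # units of Einstein radii; all lens parameters here are made up.
        import numpy as np

        rng = np.random.default_rng(1)
        n_lenses = 40
        zl = rng.uniform(-10, 10, n_lenses) + 1j * rng.uniform(-10, 10, n_lenses)
        masses = np.ones(n_lenses)

        def trace_to_source(theta):
            """Point-mass lens equation: beta = theta - sum_i m_i / conj(theta - z_i)."""
            beta = theta.copy()
            for z, mi in zip(zl, masses):
                beta -= mi / np.conj(theta - z)
            return beta

        # regular grid of rays in the image plane
        n = 800
        x = np.linspace(-5.0, 5.0, n)
        theta = (x[None, :] + 1j * x[:, None]).ravel()
        beta = trace_to_source(theta)

        # ray counts per source-plane pixel are proportional to magnification
        edges = np.linspace(-2.0, 2.0, 201)
        mag_map, _, _ = np.histogram2d(beta.real, beta.imag, bins=[edges, edges])
        print("map shape:", mag_map.shape, "mean counts/pixel:", mag_map.mean())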

    2. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

      SciTech Connect (OSTI)

      Catechis, Christopher Spyros

      2013-10-01

      Sandia National Laboratories Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area comprises several parcels of land within SNL/CA, County of Alameda, California, located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to a specific property.

    3. Statistical Analysis of Baseline Load Models for Non-Residential Buildings

      SciTech Connect (OSTI)

      Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila

      2008-11-10

      Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
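
      The paper's model definitions are not reproduced in this record, but the general recipe it evaluates, averaging the same hours over recent non-event days and then scaling by a morning adjustment factor computed on the event day, can be sketched with synthetic data as follows.

        # Illustration of a simple averaging baseline with a morning adjustment
        # factor, the general recipe evaluated in the paper (exact model details
        # differ). Loads here are synthetic hourly kW for one building.
        import numpy as np

        rng = np.random.default_rng(0)
        n_days, hours = 11, 24
        shape = 500 + 100 * np.sin(np.pi * (np.arange(hours) - 6) / 12).clip(0)
        history = shape + rng.normal(0, 20, size=(n_days, hours))
        event_day, proxy_days = history[-1], history[:-1]  # 10 proxy days + event day

        # 1) raw baseline: hour-by-hour mean over the proxy (non-event) days
        baseline = proxy_days.mean(axis=0)

        # 2) morning adjustment: actual vs. predicted load in pre-event hours
        morning = slice(8, 11)            # e.g. hours 08:00-10:00, event at noon
        adj = event_day[morning].mean() / baseline[morning].mean()
        adjusted_baseline = baseline * adj

        # 3) estimated demand-response load impact during the event window
        event_window = slice(12, 18)
        impact = adjusted_baseline[event_window] - event_day[event_window]
        print(f"adjustment = {adj:.3f}; mean estimated impact = {impact.mean():.1f} kW")

      The adjustment step is what the paper finds most valuable: it corrects for day-of weather and occupancy conditions that a plain historical average misses.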

    4. NREL: Climate Neutral Research Campuses - Determine Baseline Energy

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Determine Baseline Energy Consumption: To create a climate action plan for your research campus, begin by determining current energy consumption and the resulting greenhouse gas emissions. You can then break down emissions by sector. It is important to understand the following at the beginning. The Importance of a Baseline: "The baseline inventory also provides a common data set for establishing benchmarks and priorities during the strategic planning stage and a means for

    5. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE). Chris Mossey, Deputy Lab Director (Fermi) and Project Director for LBNF-DUNE. March 23, 2016. Presentation (5.94 MB).

    6. 2016 Annual Technology Baseline (ATB) (Conference) | SciTech Connect

      Office of Scientific and Technical Information (OSTI)

      Conference: 2016 Annual Technology Baseline (ATB). Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information...

    7. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

      SciTech Connect (OSTI)

      Womack, J.C.; Cramond, R.; Paedon, R.J.

      1995-03-13

      This document is a revision to WHC-SD-SNF-SD-002 and is issued to support the individual projects that make up the Spent Nuclear Fuel Project (SNFP) by defining lower-tier functions, requirements, interfaces, and technical baseline items. It presents results of engineering analyses performed since September 1994. The mission of the SNFP on the Hanford site is to provide safe, economic, environmentally sound management of Hanford SNF in a manner that stages it for final disposition. This particularly involves K Basin fuel, although other SNF is involved as well.

    8. ENERGY STAR PortfolioManager Baseline Year Instructions

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Select the "Baseline Year" time frame. Select "Multiple Properties". Using filters, choose the properties to include in the report. Check the box to select all filtered properties. Select these reporting ...

    9. CD-2, Approve Performance Baseline | Department of Energy

      Office of Environmental Management (EM)

      ... External Independent Review (EIR), Independent Project Review (IPR), Independent Cost Estimate (ICE) Perform a Performance Baseline External Independent Review (EIR) or an ...

    10. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

      Broader source: Energy.gov (indexed) [DOE]

      May 27, 2015. EA-1943: Draft Environmental Assessment, Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois and the...

    11. Cost and Performance Comparison Baseline for Fossil Energy Power...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective To establish baseline performance and cost estimates for today's...

    12. Updates to the International Linear Collider Damping Rings Baseline...

      Office of Scientific and Technical Information (OSTI)

      Updates to the International Linear Collider Damping Rings Baseline Design...

    13. Sandia Energy - Scaled Wind Farm Technology Facility Baselining...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scaled Wind Farm Technology Facility Baselining Project Accelerates Work...

    14. South Africa - Greenhouse Gas Emission Baselines and Reduction...

      Open Energy Info (EERE)

      ...from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    15. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

      Open Energy Info (EERE)

      ...from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    16. Chile-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Kenya, Mexico, South Africa, Thailand and Vietnam), to share practices on setting national greenhouse gas emissions baseline scenarios. The aim of the workstream is to...

    17. NREL: Energy Analysis - Annual Technology Baseline and Standard...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Annual Technology Baseline and Standard Scenarios Discussion Draft of NREL 2016 Annual ... and a diverse set of potential futures (standard scenarios) to inform electric sector ...

    18. NETL - Bituminous Baseline Performance and Cost Interactive Tool...

      Open Energy Info (EERE)

      from the Cost and Performance Baseline for Fossil Energy Plants - Bituminous Coal and Natural Gas to Electricity report. The tool provides an interactive summary of the full...

    19. River Corridor Baseline Risk Assessment (RCBRA) Human Health...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      12, 2011 River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2) * RCBRA Human Health Risk Assessment is final - Response provided to HAB ...

    20. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

      SciTech Connect (OSTI)

      Maranas, Costas D

      2012-05-21

      An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism not just in skeletal pathways and a handful of microorganisms but in truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
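
      Of the three challenge areas, flux elucidation is the most compact to illustrate: flux balance analysis reduces to a linear program over the steady-state stoichiometric constraint S v = 0. The toy network below is invented, not one of the project's reconstructions, but a genome-scale problem has exactly this shape with thousands of reactions.

        # Toy flux balance analysis (FBA): maximize a "biomass" flux subject to
        # steady-state stoichiometry S v = 0 and flux bounds. The 3-metabolite
        # network is invented for illustration.
        import numpy as np
        from scipy.optimize import linprog

        # columns: uptake, r1 (A->B), r2 (B->C), biomass (consumes C)
        S = np.array([
            [ 1, -1,  0,  0],   # metabolite A
            [ 0,  1, -1,  0],   # metabolite B
            [ 0,  0,  1, -1],   # metabolite C
        ])
        bounds = [(0, 10), (0, None), (0, None), (0, None)]  # uptake capped at 10

        # linprog minimizes, so negate the biomass flux (last column) to maximize
        c = np.array([0, 0, 0, -1.0])
        res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
        print("optimal biomass flux:", res.x[-1])  # expect 10, the uptake limit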

    1. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1994

      SciTech Connect (OSTI)

      1994-12-31

      This report is Bechtel's twelfth quarterly technical progress report and covers the period of July through September 1994. All major tasks associated with the contract study have essentially been completed, and effort is under way on various topical reports for publication. The objectives of this study are to: develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology (the baseline design uses Illinois No. 6 Eastern coal and conventional refining; there is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin); prepare the capital and operating costs for the baseline design and the alternatives (individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case); develop a process flowsheet simulation (PFS) model; establish the baseline design and alternatives; evaluate baseline and alternative economics; develop engineering design criteria; perform sensitivity studies using the PFS model; document the PFS model and develop a DOE training session on its use; and perform project management, technical coordination, and other miscellaneous support functions. Tasks 1, 2, 3 and 5 have essentially been completed. During the current reporting period, work progressed on Tasks 4, 6 and 7. This report covers work done during this period and consists of four sections: Introduction and Summary; Task 4, Process Flowsheet Simulation (PFS) Model and Conversion to ASPEN PLUS; Task 6, Document the PFS Model and Develop a DOE Training Session on Its Use; and the Project Management and Staffing Report.

    2. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

      SciTech Connect (OSTI)

      Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

      2012-07-31

      This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

    3. MIVS Mark V; The development of a computer assisted safeguards review station

      SciTech Connect (OSTI)

      Kadner, S.

      1991-01-01

      This paper reports on the development and fielding of video recorder based surveillance systems, with associated high capacity scene storage capability, that have increased inspector visual review requirements. A clear need for decreasing inspector review participation while assuring no resulting degradation of technical and safeguards analyses has been demonstrated. A development utilizing a general purpose microcomputer based platform with video recorder control capabilities and analog to digital conversion of visual scene content is described. Such a system may be used to safeguard nuclear facilities.

    4. Advanced Hydrogen Turbine Development

      SciTech Connect (OSTI)

      Marra, John

      2015-09-30

      Under the sponsorship of the U.S. Department of Energy (DOE) National Energy Technology Laboratories, Siemens has completed the Advanced Hydrogen Turbine Development Program to develop an advanced gas turbine for incorporation into future coal-based Integrated Gasification Combined Cycle (IGCC) plants. All the scheduled DOE milestones were completed and significant technical progress was made in the development of new technologies and concepts. Advanced computer simulations and modeling, as well as subscale and full-scale laboratory, rig and engine testing, were utilized to evaluate and select concepts for further development. The program requirements were all met: a 3 to 5 percentage point improvement in overall plant combined cycle efficiency compared to the reference baseline plant; a 20 to 30 percent reduction in overall plant capital cost compared to the reference baseline plant; and NOx emissions of 2 ppm out of the stack. The program was completed on schedule and within the allotted budget.

    5. Tank Waste Remediation System (TWRS) Technical Baseline Summary Description

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This revision notes the supersedure of the subject document by concurrent issuance of HNF-1901 ''Technical Baseline Summary Description for the Tank Farm Contractor'', Revision 2. Safe storage mission technical baseline information was absorbed by the new revision of HNF-1901.

    6. Development of computer program ENMASK for prediction of residual environmental masking-noise spectra, from any three independent environmental parameters

      SciTech Connect (OSTI)

      Chang, Y.-S.; Liebich, R. E.; Chun, K. C.

      2000-03-31

      Residual environmental sound can mask intrusive (unwanted) sound. It is a factor that can affect noise impacts and must be considered both in noise-impact studies and in noise-mitigation designs. Models for quantitative prediction of sensation level (audibility) and psychological effects of intrusive noise require an input with 1/3 octave-band spectral resolution of environmental masking noise. However, the majority of published residual environmental masking-noise data are given with either octave-band frequency resolution or only single A-weighted decibel values. A model has been developed that enables estimation of 1/3 octave-band residual environmental masking-noise spectra and relates certain environmental parameters to A-weighted sound level. This model provides a correlation among three environmental conditions: measured residual A-weighted sound-pressure level, proximity to a major roadway, and population density. Cited field-study data were used to compute the most probable 1/3 octave-band sound-pressure spectrum corresponding to any selected one of these three inputs. In turn, such spectra can be used as an input to models for prediction of noise impacts. This paper discusses specific algorithms included in the newly developed computer program ENMASK. In addition, the relative audibility of the environmental masking-noise spectra at different A-weighted sound levels is discussed, which is determined by using the methodology of program ENAUDIBL.
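
      The forward relation that ENMASK effectively inverts is the standard combination of 1/3 octave-band levels into an overall A-weighted level. The sketch below shows that relation in Python; the band levels and weighting corrections passed in are placeholders, not values from the paper.

          import numpy as np

          def overall_a_weighted_level(band_levels_db, a_weights_db):
              # Add the A-weighting correction to each 1/3 octave-band SPL,
              # then combine the bands on an energy basis.
              weighted = np.asarray(band_levels_db) + np.asarray(a_weights_db)
              return 10.0 * np.log10(np.sum(10.0 ** (weighted / 10.0)))

          # e.g. three bands at 70, 68, 65 dB with corrections -3.2, 0.0, +1.2 dB
          print(overall_a_weighted_level([70, 68, 65], [-3.2, 0.0, 1.2]))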

    7. Automation of ORIGEN2 calculations for the transuranic waste baseline inventory database using a pre-processor and a post-processor

      SciTech Connect (OSTI)

      Liscum-Powell, J.

      1997-06-01

      The purpose of the work described in this report was to automate ORIGEN2 calculations for the Waste Isolation Pilot Plant (WIPP) Transuranic Waste Baseline Inventory Database (WTWBID); this was done by developing a pre-processor to generate ORIGEN2 input files from WTWBID inventory files and a post-processor to remove excess information from the ORIGEN2 output files. The calculations performed with ORIGEN2 estimate the radioactive decay and buildup of various radionuclides in the waste streams identified in the WTWBID. The resulting radionuclide inventories are needed for performance assessment calculations for the WIPP site. The work resulted in the development of PreORG, which requires interaction with the user to generate ORIGEN2 input files on a site-by-site basis, and PostORG, which processes ORIGEN2 output into more manageable files. Both programs are written in the FORTRAN 77 computer language. After running PreORG, the user will run ORIGEN2 to generate the desired data; upon completion of ORIGEN2 calculations, the user can run PostORG to process the output to make it more manageable. All the programs run on a 386 PC or higher with a math co-processor, or on a computer platform running under the VMS operating system. The pre- and post-processors for ORIGEN2 were generated for use with Rev. 1 data of the WTWBID and can also be used with Rev. 2 and 3 data of the TWBID (Transuranic Waste Baseline Inventory Database).
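
      The division of labor described above (a pre-processor that writes input decks, a post-processor that trims output) is easy to picture in miniature. The sketch below, in Python rather than the report's FORTRAN 77, shows a hypothetical PostORG-style pass that keeps only the rows for nuclides of interest; the file layout and nuclide list are invented for illustration.

          KEEP = {"PU239", "PU240", "AM241", "CS137", "SR90"}  # illustrative set

          def trim_output(src_path, dst_path):
              # Copy through only rows whose first token names a kept nuclide;
              # everything else in the ORIGEN2-style output is dropped.
              with open(src_path) as src, open(dst_path, "w") as dst:
                  for line in src:
                      parts = line.split()
                      if parts and parts[0].upper() in KEEP:
                          dst.write(line)

          trim_output("origen2.out", "origen2_trimmed.out")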

    8. Long-Baseline Neutrino Experiment (LBNE) Water Cherenkov Detector Basis of Estimate Forms and Backup Documentation LBNE Far Site Internal Review (December 6-9, 2011)

      SciTech Connect (OSTI)

      Stewart, J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Basis of Estimate (BOE) forms and backup documentation developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long-Baseline Neutrino Experiment (LBNE).

    9. Long-Baseline Neutrino Experiment (LBNE) Water Cherenkov Detector Schedule and Cost Books LBNE Far Site Internal Review (December 6-9, 2011)

      SciTech Connect (OSTI)

      Stewart, J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Schedule and Cost Books developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long-Baseline Neutrino Experiment (LBNE).

    10. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      6 Computational Earth Science We develop and apply a range of high-performance computational methods and software tools to Earth science projects in support of environmental ...

    11. Current and planned numerical development for improving computing performance for long duration and/or low pressure transients

      SciTech Connect (OSTI)

      Faydide, B.

      1997-07-01

      This paper presents the current and planned numerical developments for improving computing performance for Cathare applications that require real-time execution, such as simulators. Cathare is a thermal-hydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the general characteristics of the code are presented, covering physical models, numerical topics, and validation strategy. Then, the current and planned applications of Cathare in the field of simulators are discussed. Some of these applications were made in the past using a simplified, fast-running version of Cathare (Cathare-Simu); the status of the numerical improvements obtained with Cathare-Simu is presented. The planned developments concern mainly the Simulator Cathare Release (SCAR) project, which deals with the use of the most recent version of Cathare inside simulators. In this frame, the numerical developments relate to speeding up the calculation process using parallel processing and to improving code reliability on a large set of NPP transients.
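
      The gain available from the parallel-processing route mentioned above is bounded by the standard Amdahl relation (a general result, not a claim from this paper): if a fraction p of the run time parallelizes perfectly over N processors, the speedup is

          S(N) = \frac{1}{(1 - p) + p/N}

      so real-time targets depend on keeping the serial fraction (1 - p) small, not merely on adding processors.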

    12. Scientific Opportunities with the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Adams, C.; et al.

      2013-07-28

      In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained 'near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well-developed, world-class experiment whose relevance, importance, and probability of unearthing critical and exciting physics have increased with time.

    13. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

      SciTech Connect (OSTI)

      Not Available

      1992-09-01

      The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbons (C₃/C₄s) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing and in coping with the low H₂/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

    14. Notice of Intent to Revise DOE G 413.3-5A, Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2016-02-03

      The proposed revision to this Department of Energy Guide focuses on updating the current guide with the latest terminology and references regarding the performance baseline development process. This update also incorporates the latest Secretarial memoranda on project management issued since the last update to DOE O 413.3B, Program and Project Management for the Acquisition of Capital Assets.

    15. Development of a computer code to predict a ventilation requirement for an underground radioactive waste storage tank

      SciTech Connect (OSTI)

      Lee, Y.J.; Dalpiaz, E.L.

      1997-08-01

      A computer code, WTVFE (Waste Tank Ventilation Flow Evaluation), has been developed to evaluate the ventilation requirement for an underground storage tank for radioactive waste. Heat generated by the radioactive waste and mixing pumps in the tank is removed mainly through the ventilation system. The heat removal process by the ventilation system includes the evaporation of water from the waste and heat transfer by natural convection from the waste surface. Also, a portion of the heat will be removed through the soil and by the air circulating through the gap between the primary and secondary tanks. The heat loss caused by evaporation is modeled based on recent evaporation test results by the Westinghouse Hanford Company using a simulated small-scale waste tank. Other heat transfer phenomena are evaluated based on well-established conduction and convection heat transfer relationships. 10 refs., 3 tabs.
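
      The abstract implies a steady-state energy balance of roughly the following form (a reconstruction for illustration, not the code's documented equations):

          Q_{gen} = \dot{m}_{evap}\, h_{fg} + hA\,(T_s - T_a) + Q_{soil} + Q_{annulus}

      where \dot{m}_{evap} is the evaporation rate, h_{fg} the latent heat of vaporization, and hA(T_s - T_a) the natural-convection loss from the waste surface at temperature T_s to air at T_a; the required ventilation flow is the one that closes this balance.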

    16. OSTIblog Articles in the Long Baseline Neutrino Experiment Topic...

      Office of Scientific and Technical Information (OSTI)

      Long Baseline Neutrino Experiment Topic Mining for Gold, Neutrinos and the Neutrinoless ... The site of the former Homestake Mine was once one of the largest and deepest gold mines ...

    17. Cost and Performance Baseline for Fossil Energy Plants Volume...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2), www.netl.doe.gov ...

    18. Biodiversity Research Institute Mid-Atlantic Baseline Study Webinar

      Broader source: Energy.gov [DOE]

      Carried out by the Biodiversity Research Institute (BRI) and funded by the U.S. Department of Energy Wind and Water Power Technology Office and other partners, the goal of the Mid-Atlantic Baseline...

    19. Computing Frontier: Distributed Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Frontier: Distributed Computing and Facility Infrastructures Conveners: Kenneth Bloom 1 , Richard Gerber 2 1 Department of Physics and Astronomy, University of Nebraska-Lincoln 2 National Energy Research Scientific Computing Center (NERSC), Lawrence Berkeley National Laboratory 1.1 Introduction The field of particle physics has become increasingly reliant on large-scale computing resources to address the challenges of analyzing large datasets, completing specialized computations and

    20. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

      1994-01-01

      This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

    1. Future Long-Baseline Neutrino Oscillations: View from North America

      SciTech Connect (OSTI)

      Wilson, R. J.

      2015-06-01

      In late 2012 the US Department of Energy gave approval for the first phase of the Long-Baseline Neutrino Experiment (LBNE), that will conduct a broad scientific program including neutrino oscillations, neutrino scattering physics, search for baryon violation, supernova burst neutrinos and other related astrophysical phenomena. The project is now being reformulated as an international facility hosted by the United States. The facility will consist of an intense neutrino beam produced at Fermi National Accelerator Laboratory (Fermilab), a highly capable set of neutrino detectors on the Fermilab campus, and a large underground liquid argon time projection chamber at Sanford Underground Research Facility (SURF) in South Dakota 1300 km from Fermilab. With an intense beam and massive far detector, the experimental program at the facility will make detailed studies of neutrino oscillations, including measurements of the neutrino mass hierarchy and Charge-Parity symmetry violation, by measuring neutrino and anti-neutrino mixing separately. At the near site, the high-statistics neutrino scattering data will allow for many cross section measurements and precision tests of the Standard Model. This presentation will describe the configuration developed by the LBNE collaboration, the broad physics program, and the status of the formation of the international facility.

    2. The benzene⋯naphthalene complex: A more challenging system than the benzene dimer for newly developed computational methods

      SciTech Connect (OSTI)

      Wang, Weizhou E-mail: ybw@gzu.edu.cn; Zhang, Yu; Sun, Tao; Wang, Yi-Bo E-mail: ybw@gzu.edu.cn

      2015-09-21

      High-level coupled cluster singles, doubles, and perturbative triples [CCSD(T)] computations with up to the aug-cc-pVQZ basis set (1924 basis functions) and various extrapolations toward the complete basis set (CBS) limit are presented for the sandwich, T-shaped, and parallel-displaced benzene⋯naphthalene complex. Using the CCSD(T)/CBS interaction energies as a benchmark, the performance of some newly developed wave function and density functional theory methods has been evaluated. The best performing methods were found to be the dispersion-corrected PBE0 functional (PBE0-D3) and spin-component scaled zeroth-order symmetry-adapted perturbation theory (SCS-SAPT0). The success of SCS-SAPT0 is very encouraging because it provides one method for energy component analysis of π-stacked complexes with 200 atoms or more. Most newly developed methods do, however, overestimate the interaction energies. The results of energy component analysis show that interaction energies are overestimated mainly due to the overestimation of dispersion energy.

    3. Spent Nuclear Fuel Project technical baseline document. Volumes 1--4, FY 1994

      SciTech Connect (OSTI)

      Womack, J.C.

      1995-04-03

      The Technical Baseline document presents the results of the SNFP systems engineering analyses to date. These analyses establish the baseline functions, requirements, and interfaces to a level necessary to complete the SNFP mission as defined in the SNFP Mission Analysis Report (Volume II). It is a summary-level description of the activities necessary to successfully complete the SNFP mission. The time relationship of these activities is depicted in a two-page Activities Logic Diagram (Figure 1.0-1). This document lacks detail in some areas because a clear programmatic and technical path forward is currently under development, and several key trade studies are yet to be completed. A narrative baseline description has been provided as a point of departure for individual trade studies and further SNFP definition and development. This document is composed of four volumes. Volume I presents the SNFP baseline description. Volume II presents an updated version of the Mission Analysis Report. Volume III presents the results of the functions and requirements analyses, which is an update of the draft of the SNFP Functions and Requirements Document. The initial alternatives analyses are also included in this volume. Volume IV presents the supporting data for the first three volumes.

    4. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      SciTech Connect (OSTI)

      Iovenitti, Joe

      2013-05-15

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

    5. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

    6. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

      2015-11-10

      Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.
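
      Of the baseline models named above, persistence is the simplest to state: the forecast for every step in the horizon repeats the latest observation. The sketch below shows that model and a plain RMSE score in Python; the function names are ours, and the paper's metric suite is considerably broader.

          import numpy as np

          def persistence_forecast(history, horizon):
              # Repeat the most recent observed PV power for each step ahead.
              return np.full(horizon, history[-1], dtype=float)

          def rmse(forecast, actual):
              forecast, actual = np.asarray(forecast), np.asarray(actual)
              return float(np.sqrt(np.mean((forecast - actual) ** 2)))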

    7. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

      2015-08-05

      Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target values for solar forecasting metrics at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.

    8. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

      2015-11-10

      Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

    9. Baselines For Land-Use Change In The Tropics: Application ToAvoided Deforestation Projects

      SciTech Connect (OSTI)

      Brown, Sandra; Hall, Myrna; Andrasko, Ken; Ruiz, Fernando; Marzoli, Walter; Guerrero, Gabriela; Masera, Omar; Dushku, Aaron; Dejong, Ben; Cornell, Joseph

      2007-06-01

      Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in the applicable pools such as vegetation, detritus, products and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models of the latter category used in the analysis at regional scale are the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Parana State in Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacan, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six) and the LUCS model the least amount of loss (four out of five). Based on simulations of GEOMOD, we found that readily observable physical and biological factors as well as distance to areas of past disturbance were each about twice as important as either sociological/demographic or economic factors.

    10. Computational Science and Engineering

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Science and Engineering NETL's Computational Science and Engineering competency consists of conducting applied scientific research and developing physics-based simulation models, methods, and tools to support the development and deployment of novel process and equipment designs. Research includes advanced computations to generate information beyond the reach of experiments alone by integrating experimental and computational sciences across different length and time scales. Specific

    11. India's baseline plan for nuclear energy self-sufficiency.

      SciTech Connect (OSTI)

      Bucher, R. G.; Nuclear Engineering Division

      2009-01-01

      India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits; and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to insure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second United Nations International Conference on the Peaceful Uses of Atomic Energy in 1958.

    12. Long-Term Stewardship Baseline Report and Transition Guidance

      SciTech Connect (OSTI)

      Kristofferson, Keith

      2001-11-01

      Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy's (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE's long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complex-wide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE's cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these documents were the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

    13. Vehicle Technologies Office Merit Review 2015: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    14. Vehicle Technologies Office Merit Review 2014: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    15. Multiproject baselines for evaluation of electric power projects

      SciTech Connect (OSTI)

      Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

      2003-03-12

      Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
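
      As a numerical illustration of the multiproject emissions rate (kg C/kWh) discussed above, the sketch below computes a generation-weighted rate over the plants selected for a baseline; the plant figures are invented, and a real baseline must also settle the vintage and stringency questions the paper raises.

          # Hypothetical baseline plants: annual generation and carbon emitted.
          plants = [
              {"name": "coal A", "gen_kwh": 4.0e9, "kg_c": 1.0e9},
              {"name": "gas B",  "gen_kwh": 2.0e9, "kg_c": 0.3e9},
          ]
          rate = sum(p["kg_c"] for p in plants) / sum(p["gen_kwh"] for p in plants)
          print(f"baseline emission rate: {rate:.3f} kg C/kWh")  # -> 0.217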

    16. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

      SciTech Connect (OSTI)

      Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

      2013-09-06

      This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
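
      The flavor of prediction-quality check such a protocol applies can be seen in two widely used goodness-of-fit statistics, shown below in simplified form (the report's exact metrics and acceptance thresholds may differ):

          import numpy as np

          def cv_rmse(predicted, measured):
              # Coefficient of variation of the RMSE, relative to mean use.
              predicted, measured = np.asarray(predicted), np.asarray(measured)
              return float(np.sqrt(np.mean((predicted - measured) ** 2)) / measured.mean())

          def nmbe(predicted, measured):
              # Normalized mean bias error: net over- or under-prediction.
              predicted, measured = np.asarray(predicted), np.asarray(measured)
              return float((predicted - measured).sum() / (len(measured) * measured.mean()))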

    17. DEVELOPMENT OF A COMPUTATIONAL MULTIPHASE FLOW MODEL FOR FISCHER TROPSCH SYNTHESIS IN A SLURRY BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Donna Post Guillen; Tami Grimmett; Anastasia M. Gribik; Steven P. Antal

      2010-09-01

      The Hybrid Energy Systems Testing (HYTEST) Laboratory is being established at the Idaho National Laboratory to develop and test hybrid energy systems with the principal objective to safeguard U.S. energy security by reducing dependence on foreign petroleum. A central component of the HYTEST is the slurry bubble column reactor (SBCR) in which the gas-to-liquid reactions will be performed to synthesize transportation fuels using the Fischer Tropsch (FT) process. SBCRs are cylindrical vessels in which gaseous reactants (for example, synthesis gas or syngas) are sparged into a slurry of liquid reaction products and finely dispersed catalyst particles. The catalyst particles are suspended in the slurry by the rising gas bubbles and serve to promote the chemical reaction that converts syngas to a spectrum of longer chain hydrocarbon products, which can be upgraded to gasoline, diesel or jet fuel. These SBCRs operate in the churn-turbulent flow regime, which is characterized by complex hydrodynamics, coupled with reacting flow chemistry and heat transfer, that affect reactor performance. The purpose of this work is to develop a computational multiphase fluid dynamic (CMFD) model to aid in understanding the physico-chemical processes occurring in the SBCR. Our team is developing a robust methodology to couple reaction kinetics and mass transfer into a four-field model (consisting of the bulk liquid, small bubbles, large bubbles and solid catalyst particles) that includes twelve species: (1) CO reactant, (2) H₂ reactant, (3) hydrocarbon product, and (4) H₂O product in small bubbles, large bubbles, and the bulk fluid. Properties of the hydrocarbon product were specified by vapor liquid equilibrium calculations. The absorption and kinetic models, specifically changes in species concentrations, have been incorporated into the mass continuity equation. The reaction rate is determined based on the macrokinetic model for a cobalt catalyst developed by Yates and Satterfield [1].
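
      The macrokinetic model cited above [1] is commonly quoted in the Fischer-Tropsch literature in the form

          -R_{CO} = \frac{a\, P_{CO}\, P_{H_2}}{(1 + b\, P_{CO})^{2}}

      where P_{CO} and P_{H_2} are partial pressures and a and b are temperature-dependent constants fitted to cobalt-catalyst data; the report's exact parameterization may differ.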

    18. Fort Drum integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

      1992-12-01

      The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

    19. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

      1993-02-01

      The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

    20. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      SciTech Connect (OSTI)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    1. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    2. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    3. Computer, Computational, and Statistical Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCS Computer, Computational, and Statistical Sciences Computational physics, computer science, applied mathematics, statistics and the integration of large data streams are central ...

    4. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

      SciTech Connect (OSTI)

      Katya Le Blanc; Johanna Oxstrand

      2012-04-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

    5. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    6. Technical Baseline Summary Description for the Tank Farm Contractor

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

    7. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

      SciTech Connect (OSTI)

      Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

      2013-05-15

      This report provides the results of the comprehensive annulus visual inspection for tanks 241-AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

    8. Revised SRC-I project baseline. Volume 2

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      The SRC Process Area Design Baseline consists of six volumes. The first four were submitted to DOE on 9 September 1981. The fifth volume, summarizing the Category A Engineering Change Proposals (ECPs), was not submitted. The sixth volume, containing proprietary information on Kerr-McGee's Critical Solvent Deashing System, was forwarded to BRHG Synthetic Fuels, Inc. for custody, according to past instructions from DOE, and is available for perusal by authorized DOE representatives. DOE formally accepted the Design Baseline under ICRC Release ECP 4-1001, at the Project Configuration Control Board meeting in Oak Ridge, Tennessee on 5 November 1981. The documentation was then revised by Catalytic, Inc. to incorporate the Category B and C and Post-Baseline Engineering Change Proposals. Volumes I through V of the Revised Design Baseline, dated 22 October 1982, are nonproprietary and they were issued to the DOE via Engineering Change Notice (ECN) 4-1 on 23 February 1983. Volume VI again contains proprietary information on the Kerr-McGee Critical Solvent Deashing System; it was issued to Burns and Roe Synthetic Fuels, Inc. Subsequently, updated process descriptions, utility summaries, and errata sheets were issued to the DOE and Burns and Roe Synthetic Fuels, Inc. on nonproprietary Engineering Change Notices 4-2 and 4-3 on 24 May 1983.

    9. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

      SciTech Connect (OSTI)

      Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

      2012-08-15

      The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and by using automated data analysis techniques were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz⁻¹ on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
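
      The detection rule described above (a signal must exceed five times the noise on all baselines) is simple to express. The sketch below is a toy Python version of that cut; the array layout and threshold handling are illustrative, not the study's actual pipeline.

          import numpy as np

          def candidate_channels(spectra, noise_sigma, threshold=5.0):
              # spectra: (n_baselines, n_channels) cross-power amplitudes.
              # A channel qualifies only if it clears threshold*sigma on
              # every baseline, which suppresses single-station RFI.
              above = np.asarray(spectra) > threshold * noise_sigma
              return np.where(above.all(axis=0))[0]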

    10. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    11. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process...

      Energy Savers [EERE]

      This EVMS Training Snippet on the Integrated Baseline Review (IBR) process is sponsored by the Office of Project ...

    12. Microsoft PowerPoint - Snippet 4.6 Baseline Control Methods 20140723...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      to an approved performance baseline, including impacts on the project scope, schedule, design, methods, and cost baselines. The BCP represents a change to one or more of the...

    13. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

    14. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

      SciTech Connect (OSTI)

      Saffer, Shelley I.

      2014-12-01

      This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

    15. Recent developments in large-scale finite-element Lagrangian hydrocode technology. [DYNA2D/DYNA3D computer code]

      SciTech Connect (OSTI)

      Goudreau, G.L.; Hallquist, J.O.

      1981-10-01

      The state of Lagrangian hydrocodes for computing the large deformation dynamic response of inelastic continua is reviewed in the context of engineering computation at the Lawrence Livermore National Laboratory, USA, and the DYNA2D/DYNA3D finite element codes. The emphasis is on efficiency and computational cost, favoring the simplest elements with explicit time integration: the two-dimensional four-node quadrilateral and the three-dimensional hexahedron with one-point quadrature are advocated as superior to other, more expensive choices. Important auxiliary capabilities are a cheap but effective hourglass control, slidelines/planes with void opening/closure, and rezoning. Both strain measures and material formulation are treated as a homogeneous stress-point problem, and a flexible material subroutine interface admits both incremental and total strain formulations, dependent on internal energy or an arbitrary set of other internal variables. Vectorization on Class VI computers such as the CRAY-1 is a simple exercise for optimally organized primitive element formulations. Some examples of large-scale computation are illustrated, including continuous tone graphic representation.
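      The explicit time integration advocated here is, at its core, a central-difference update of nodal velocities and displacements from lumped-mass accelerations. The following is a minimal sketch of that update in the kick-drift-kick form, with a linear spring standing in for the element internal-force computation; the force model and all parameters are illustrative assumptions, not the DYNA formulation.

```python
import numpy as np

def central_difference(masses, internal_force, u0, v0, dt, n_steps):
    """Explicit central-difference time integration with a lumped mass matrix.

    masses         : (n,) nodal masses (lumped, so no matrix solve is needed)
    internal_force : callable u -> (n,) internal nodal forces
    Stability requires dt below 2/omega_max for the highest element frequency.
    """
    u, v = u0.copy(), v0.copy()
    a = -internal_force(u) / masses          # initial accelerations
    for _ in range(n_steps):
        v += 0.5 * dt * a                    # half-step velocity update
        u += dt * v                          # full-step displacement update
        a = -internal_force(u) / masses      # accelerations from new geometry
        v += 0.5 * dt * a                    # complete the velocity update
    return u, v

# Illustrative use: a single node on a linear spring (k = 4.0, m = 1.0),
# integrated over roughly half an oscillation period.
k = 4.0
u, v = central_difference(np.array([1.0]), lambda u: k * u,
                          np.array([1.0]), np.array([0.0]),
                          dt=0.01, n_steps=157)
print(u, v)   # u should be close to -1, v close to 0
```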

    16. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

      SciTech Connect (OSTI)

      Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

      2015-07-07

      Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

    17. Level 3 Baseline Risk Assessment for Building 3515 at Oak Ridge National Lab., Oak Ridge, TN

      SciTech Connect (OSTI)

      Wollert, D.A.; Cretella, F.M.; Golden, K.M.

      1995-08-01

      The baseline risk assessment for the Fission Product Pilot Plant (Building 3515) at the Oak Ridge National Laboratory (ORNL) provides the Decontamination and Decommissioning (D&D) Program at ORNL and Building 3515 project managers with information concerning the results of the Level 3 baseline risk assessment performed for this building. The document was prepared under Work Breakdown Structure 1.4.12.6.2.01 (Activity Data Sheet 3701, Facilities D&D) and includes information on the potential long-term impacts to human health and the environment if no action is taken to remediate Building 3515. Information provided in this document forms the basis for the development of remedial alternatives and the no-action risk portion of the Engineering Evaluation/Cost Analysis report.

    18. Special Issue On Estimation Of Baselines And Leakage In Carbon Mitigation Forestry Projects

      SciTech Connect (OSTI)

      Sathaye, Jayant A.; Andrasko, Kenneth

      2006-06-01

      There is a growing acceptance that the environmental benefits of forests extend beyond traditional ecological benefits and include the mitigation of climate change. Interest in forestry mitigation activities has led to the inclusion of forestry practices at the project level in international agreements. Climate change activities place new demands on participating institutions to set baselines, establish additionality, determine leakage, ensure permanence, and monitor and verify a project's greenhouse gas benefits. These issues are common to both forestry and other types of mitigation projects. They demand empirical evidence to establish conditions under which such projects can provide sustained long-term global benefits. This Special Issue reports on papers that experiment with a range of approaches based on empirical evidence for the setting of baselines and estimation of leakage in projects in developing Asia and Latin America.

    19. High-Level software requirements specification for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      This Software Requirements Specification (SRS) is an as-built document that presents the Tank Waste Remediation System (TWRS) Controlled Baseline Database (TCBD) in its current state. It was originally known as the Performance Measurement Control System (PMCS). Conversion to the new system name has not occurred within the current production system. Therefore, for simplicity, all references to TCBD are equivalent to PMCS references. This SRS will reference the PMCS designator from this point forward to capture the as-built SRS. This SRS is written at a high level and is intended to provide the design basis for the PMCS. The PMCS was first released as the electronic data repository for cost, schedule, and technical administrative baseline information for the TWRS Program. During its initial development, the PMCS was accepted by the customer, TWRS Business Management, with no formal documentation to capture the initial requirements.

    20. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

      SciTech Connect (OSTI)

      1995-04-01

      The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

    1. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

      SciTech Connect (OSTI)

      Wollenberg, H.A.; Smith, A.R.

      1991-10-01

      A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12-16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.
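      The conversion from radioelement concentrations to exposure rates can be illustrated with the widely cited UNSCEAR-style dose-rate coefficients; the coefficients and soil concentrations below are assumptions for illustration, as the survey's actual conversion factors are not given in the abstract.

```python
# Convert soil radioelement activity concentrations (Bq/kg) to an absorbed
# dose rate in air at 1 m above ground. The coefficients follow the commonly
# cited UNSCEAR form; coefficients and sample values are assumptions here.
COEFF_NGY_PER_BQKG = {"K-40": 0.0417, "U-238": 0.462, "Th-232": 0.604}

def exposure_rate_ngy_per_h(activities_bq_per_kg):
    return sum(COEFF_NGY_PER_BQKG[nuc] * a
               for nuc, a in activities_bq_per_kg.items())

# Example: typical-magnitude soil concentrations (illustrative values).
sample = {"K-40": 400.0, "U-238": 35.0, "Th-232": 30.0}
print(f"{exposure_rate_ngy_per_h(sample):.1f} nGy/h")
```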

    2. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

      SciTech Connect (OSTI)

      J. Francfort; D. Karner

      2006-04-01

      The U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEV). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed track testing to document the HEV’s fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events and fuel use are recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.
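      A minimal sketch of how per-mile life-cycle costs might be compiled from fleet records is shown below; the record fields and dollar figures are hypothetical, and the AVTA's actual accounting is more detailed.

```python
# Compile a simple life-cycle operating cost per mile from fleet records.
# All record values are hypothetical illustrations.
records = [
    {"type": "fuel",        "cost": 5200.00},
    {"type": "maintenance", "cost": 1450.00},
    {"type": "repair",      "cost": 800.00},
]
miles_accumulated = 160_000  # fleet testing target per vehicle

total_cost = sum(r["cost"] for r in records)
print(f"Operating cost: {100 * total_cost / miles_accumulated:.2f} cents/mile")
```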

    3. Cognitive Computing for Security.

      SciTech Connect (OSTI)

      Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

      2015-12-01

      Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

    4. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

      SciTech Connect (OSTI)

      Muller, Isabelle S.; Pegg, Ian L.; Gan, Hao; Buechele, Andrew; Rielley, Elizabeth; Bazemore, Gina; Cecil, Richard; Hight, Kenneth; Mooers, Cavin; Lai, Shan-Tao T.; Kruger, Albert A.

      2015-06-18

      The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

    5. Opportunities for Russian Nuclear Weapons Institute developing computer-aided design programs for pharmaceutical drug discovery. Final report

      SciTech Connect (OSTI)

      1996-09-23

      The goal of this study is to determine whether physicists at the Russian Nuclear Weapons Institute can profitably service the need for computer-aided drug design (CADD) programs. The Russian physicists' primary competitive advantages are their ability to write particularly efficient code able to work with limited computing power, a history of working with very large, complex modeling systems, an extensive knowledge of physics and mathematics, and price competitiveness. Their primary competitive disadvantages are their lack of biology expertise and cultural and geographic issues. The first phase of the study focused on defining the competitive landscape, primarily through interviews with and literature searches on the key providers of CADD software. The second phase focused on users of CADD technology to determine deficiencies in the current product offerings, to understand what product they most desired, and to define the potential demand for such a product.

    6. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

      SciTech Connect (OSTI)

      Swita, W.R.

      1998-01-09

      This document provides a summary of the Tank Waste Remediation System (TWRS) Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost), developed to demonstrate Readiness-to-Proceed (RTP) in support of the TWRS Phase 1B mission. This Updated Baseline is the proposed TWRS plan to execute and measure the mission work scope. This document and other supporting data demonstrate that the TWRS Project Hanford Management Contract (PHMC) team is prepared to fully support Phase 1B by executing the following scope, schedule, and cost baseline activities: Deliver the specified initial low-activity waste (LAW) and high-level waste (HLW) feed batches in a consistent, safe, and reliable manner to support private contractors' operations starting in June 2002; Deliver specified subsequent LAW and HLW feed batches during Phase 1B in a consistent, safe, and reliable manner; Provide for the interim storage of immobilized HLW (IHLW) products and the disposal of immobilized LAW (ILAW) products generated by the private contractors; Provide for disposal of byproduct wastes generated by the private contractors; and Provide the infrastructure to support construction and operations of the private contractors' facilities.

    7. NREL: MIDC/SRRL Baseline Measurement System (39.74 N, 105.18 W, 1829 m, GMT-7)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Solar Radiation Research Laboratory Baseline Measurement System

    8. CLAMR (Compute Language Adaptive Mesh Refinement)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CLAMR (Compute Language Adaptive Mesh Refinement) is being developed as a DOE...

    9. Baseline review of the U.S. LHC Accelerator project

      SciTech Connect (OSTI)

      1998-02-01

      The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23-26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. US scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the US LHC Accelerator project team. Cost estimates for each subsystem of the US LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then-year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of US deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as

    10. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

      SciTech Connect (OSTI)

      Sharry, John A.

      2013-09-16

      This document is the second of a two-part analysis of the emergency response capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

    11. Verification of warhead dismantlement and the importance of baseline validation

      SciTech Connect (OSTI)

      Buonpane, L.M.; Strait, R.S.

      1991-01-01

      This paper presents an approach for evaluating verification regimes for nuclear warhead dismantlement. The approach is an adaptation of the traditional nuclear materials management model; as such, it integrates the difficulties of verifying both stockpile estimates and numbers of warheads dismantled. Both random uncertainties and systematic uncertainties are considered in this approach. By making some basic assumptions about the relative uncertainties surrounding the stockpile estimates and the numbers of warheads dismantled, the authors illustrate their relative impacts on overall verification ability. The results highlight the need for increased attention on the problem of validating baseline declarations of stockpile size.
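      The abstract does not reproduce the model, but a generic sketch of how random and systematic uncertainties might be combined into a detection threshold is shown below; all quantities are invented for illustration and do not represent the paper's actual formulation.

```python
import math

# Generic sketch: combine random and systematic uncertainties on a stockpile
# estimate and a dismantlement count into one detection threshold.
# All numbers are invented illustrations.
def combined_sigma(random_sigmas, systematic_sigmas):
    # Random components add in quadrature; systematic components are treated
    # here as fully correlated and so add linearly (a conservative choice).
    random_part = math.sqrt(sum(s**2 for s in random_sigmas))
    systematic_part = sum(systematic_sigmas)
    return math.sqrt(random_part**2 + systematic_part**2)

sigma = combined_sigma(random_sigmas=[50.0, 20.0],    # counting, sampling
                       systematic_sigmas=[100.0])     # declaration bias
print(f"3-sigma detection threshold: {3 * sigma:.0f} warheads")
```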

    12. Baseline Assessment of TREAT for Modeling and Analysis Needs

      SciTech Connect (OSTI)

      Bess, John Darrell; DeHart, Mark David

      2015-10-01

      TREAT is an air-cooled, graphite moderated, thermal, heterogeneous test facility designed to evaluate reactor fuels and structural materials under conditions simulating various types of nuclear excursions and transient undercooling situations that could occur in a nuclear reactor. After 21 years in a standby mode, TREAT is being re-activated to revive transient testing capabilities. Given the time elapsed and the concurrent loss of operating experience, current-generation and advanced computational methods are being applied to begin TREAT modeling and simulation prior to renewed at-power operations. Such methods have limited value in predicting the behavior of TREAT without proper validation. Hence, the U.S. DOE has developed a number of programs to support development of benchmarks for both critical and transient operations. Extensive effort has been expended at INL to collect detailed descriptions, drawings, and specifications for all aspects of TREAT, and to resolve conflicting data found through this process. This report provides a collection of these data, with updated figures that are significantly more readable than historic drawings and illustrations, and with compositions and dimensions based on the best available sources. This document is not, nor should it be considered to be, a benchmark report. Rather, it is intended to provide one-stop shopping, to the extent possible, for other work that seeks to prepare detailed, accurate models of the core and its components. Given the variety of historic documents available and the loss of institutional memory, the only completely accurate database of TREAT data is TREAT itself. Unfortunately, disassembly of TREAT for inspection, assay, and measurement is highly unlikely. Hence the data provided herein are intended to serve as a best-estimate substitute.

    13. Computational Combustion

      SciTech Connect (OSTI)

      Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

      2004-08-26

      Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark-ignition, diesel, and homogeneous-charge compression-ignition engines, surface and catalytic combustion, pulse combustion, and detonations is described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

    14. Energy baseline and energy efficiency resource opportunities for the Forest Products Laboratory, Madison, Wisconsin

      SciTech Connect (OSTI)

      Mazzucchi, R.P.; Richman, E.E.; Parker, G.B.

      1993-08-01

      This report provides recommendations to improve the energy use efficiency at the Forest Products Laboratory in Madison, Wisconsin. The assessment focuses upon the four largest buildings and central heating plant at the facility, comprising a total of approximately 287,000 square feet. The analysis is comprehensive in nature, intended primarily to determine what, if any, energy efficiency improvements are warranted based upon the potential for cost-effective energy savings. Because of this breadth, not all opportunities are developed in detail; however, baseline energy consumption data and energy savings concepts are described to provide a foundation for detailed investigation and project design where warranted.

    15. Data Management Guide: Integrated Baseline System (IBS). Version 2.1

      SciTech Connect (OSTI)

      Bower, J.C. [Bower Software Services, Kennewick, Washington (United States)]; Burford, M.J.; Downing, T.R.; Moise, M.C.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States)]

      1995-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

    16. Estimating baseline risks from biouptake and food ingestion at a contaminated site

      SciTech Connect (OSTI)

      MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

      1993-11-01

      Biouptake of contaminants and subsequent human exposure via food ingestion represents a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area does not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.
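      The abstract does not specify the model used, but food-ingestion exposure is commonly evaluated with a chronic-daily-intake calculation of the standard form; a sketch under that assumption, with generic rather than site-specific parameter values:

```python
# Standard-form chronic daily intake (CDI) and hazard quotient for a food
# ingestion pathway. Parameter values are generic illustrations, not the
# site-specific values used in the Missouri assessment.
def chronic_daily_intake(conc_mg_per_kg, intake_kg_per_day,
                         exposure_freq_d_per_yr, exposure_duration_yr,
                         body_weight_kg, averaging_time_d):
    """CDI in mg/(kg*day) = (C * IR * EF * ED) / (BW * AT)."""
    return (conc_mg_per_kg * intake_kg_per_day * exposure_freq_d_per_yr *
            exposure_duration_yr) / (body_weight_kg * averaging_time_d)

cdi = chronic_daily_intake(conc_mg_per_kg=0.5,      # contaminant in fish tissue
                           intake_kg_per_day=0.054,  # fish ingestion rate
                           exposure_freq_d_per_yr=350,
                           exposure_duration_yr=30,
                           body_weight_kg=70,
                           averaging_time_d=30 * 365)
reference_dose = 3e-4  # mg/(kg*day), illustrative
print(f"Hazard quotient: {cdi / reference_dose:.2f}")
```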

    17. Computed tomography and optical remote sensing: Development for the study of indoor air pollutant transport and dispersion

      SciTech Connect (OSTI)

      Drescher, A.C.

      1995-06-01

      This thesis investigates the mixing and dispersion of indoor air pollutants under a variety of conditions using standard experimental methods. It also extensively tests and improves a novel technique for measuring contaminant concentrations that has the potential for more rapid, non-intrusive measurements with higher spatial resolution than previously possible. Experiments conducted in a sealed room support the hypothesis that the mixing time of an instantaneously released tracer gas is inversely proportional to the cube root of the mechanical power transferred to the room air. One table-top and several room-scale experiments are performed to test the concept of employing optical remote sensing (ORS) and computed tomography (CT) to measure steady-state gas concentrations in a horizontal plane. Various remote sensing instruments, scanning geometries and reconstruction algorithms are employed. Reconstructed concentration distributions based on existing iterative CT techniques contain a high degree of unrealistic spatial variability and do not agree well with simultaneously gathered point-sample data.
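      The reported cube-root relationship implies a mixing time of the form t = C * P^(-1/3); the short sketch below shows how that scaling behaves, with an invented calibration constant C that would in practice be fit to the experimental data.

```python
# Mixing time scaling from the sealed-room experiments: t_mix is inversely
# proportional to the cube root of mechanical power input, t = C * P**(-1/3).
# The constant C is invented for illustration.
C = 30.0  # min * W**(1/3), hypothetical calibration constant

def mixing_time_minutes(power_watts):
    return C * power_watts ** (-1.0 / 3.0)

for p in (1.0, 8.0, 64.0):
    # Each 8-fold increase in power halves the mixing time.
    print(f"P = {p:5.1f} W -> t_mix = {mixing_time_minutes(p):.1f} min")
```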

    18. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy.

    19. Compute nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      A detailed hierarchical map of the topology of a compute node is available.

    20. Waste Assessment Baseline for the IPOC Second Floor, West Wing

      SciTech Connect (OSTI)

      McCord, Samuel A

      2015-04-01

      Following a building-wide waste assessment in September 2014, and subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015 to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately 6 months.

    1. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      An undergraduate summer institute (see http://isti.lanl.gov): the 2016 Computer System, Cluster, and Networking Summer Institute. Purpose: The Computer System,...

    2. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site


    3. System maintenance verification and validation plan for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      The TWRS Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the verification and validation approach for system documentation changes within the database system.

    4. TRIDAC host computer functional specification

      SciTech Connect (OSTI)

      Hilbert, S.M.; Hunter, S.L.

      1983-08-23

      The purpose of this document is to outline the baseline functional requirements for the Triton Data Acquisition and Control (TRIDAC) Host Computer Subsystem. The requirements presented in this document are based upon systems that currently support both the SIS and the Uranium Separator Technology Groups in the AVLIS Program at the Lawrence Livermore National Laboratory and upon the specific demands associated with the extended safe operation of the SIS Triton Facility.

    5. Comparison between the Strength Levels of Baseline Nuclear-Grade Graphite and Graphite Irradiated in AGC-2

      SciTech Connect (OSTI)

      Carroll, Mark Christopher

      2015-07-01

      This report details the initial comparison of mechanical strength properties between the cylindrical nuclear-grade graphite specimens irradiated in the second Advanced Graphite Creep (AGC-2) experiment and the established baseline (unirradiated) mechanical properties compiled in the Baseline Graphite Characterization program. The overall comparative analysis describes the development of an appropriate test protocol for irradiated specimens and the execution of the mechanical tests on the AGC-2 sample population, and further discusses how accurate irradiated property distributions can be developed from the limited amount of irradiated data by leveraging the considerably larger property datasets being captured in the Baseline Graphite Characterization program. Integrating information on the inherent variability in nuclear-grade graphite with more complete datasets is one of the goals of the VHTR Graphite Materials program. Between “sister” specimens (specimens with the same geometry machined from the same sub-block of graphite from which the irradiated AGC specimens were extracted) and the Baseline datasets, a comprehensive body of data will exist that can provide both a direct and indirect indication of the full irradiated property distributions that can be expected of irradiated nuclear-grade graphite while in service in a VHTR system. While the most critical data will remain the actual irradiated property measurements, expansion of these data into accurate distributions based on the inherent variability in graphite properties will be a crucial step in qualifying graphite for nuclear use as a structural material in a VHTR environment.
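      The abstract does not name a distributional form, but graphite strength data are often summarized with two-parameter Weibull statistics; the sketch below estimates the Weibull modulus and characteristic strength from ranked data under that assumption, using invented strength values.

```python
import numpy as np

# Estimate two-parameter Weibull constants from ranked strength data using
# median-rank plotting positions. The Weibull form is an assumption here,
# and the strength values (MPa) are invented stand-ins for measured data.
strengths = np.sort(np.array([18.2, 19.5, 20.1, 21.0, 21.4, 22.3, 23.0, 24.1]))
n = len(strengths)
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank estimator

# Linearized Weibull CDF: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0)
y = np.log(-np.log(1.0 - ranks))
x = np.log(strengths)
m, intercept = np.polyfit(x, y, 1)                # slope = Weibull modulus
sigma0 = np.exp(-intercept / m)                   # characteristic strength

print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.1f} MPa")
```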

    6. computers | National Nuclear Security Administration

      National Nuclear Security Administration (NNSA)

      NNSA Announces Procurement of Penguin Computing Clusters to Support Stockpile Stewardship at National Labs: The National Nuclear Security Administration's (NNSA's) Lawrence Livermore National Laboratory today announced the awarding of a subcontract to Penguin Computing, a leading developer of high-performance Linux cluster computing systems based in Silicon Valley, to bolster computing for stockpile... Sandia donates 242 computers to northern California schools...

    7. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2010-09-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gases generated by an institution from various emission sources. The gases of interest are those which have been identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at the INL. Additionally, the INL has a desire to see how its emissions compare with similar institutions, including other DOE-sponsored national laboratories. Executive Order 13514 requires that federally sponsored agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL’s FY08 GHG inventory was calculated according to methodologies identified in Federal recommendations and an as-yet-unpublished Technical and Support Document (TSD), using an operational control boundary. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (these occur outside INL’s organizational boundaries but are a consequence of INL’s activities). This inventory found that INL generated a total of 114,256 MT of CO2-equivalent emissions during fiscal year 2008 (FY08). The following conclusions were made from looking at the results of the individual contributors to INL

    8. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2011-06-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gases generated by an institution from various emission sources. The gases of interest are those which have been identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at INL. Additionally, INL has a desire to see how its emissions compare with similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (these occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and distribution losses) is the
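      A minimal sketch of the Scope 1/2/3 roll-up described above is given below; the quantities and global warming potential factors are illustrative assumptions, not INL's figures.

```python
# Roll up a GHG inventory into CO2-equivalent totals by scope.
# Quantities (metric tons) and the GWP factors are illustrative assumptions.
GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}  # 100-yr values, AR4-era

inventory = {
    1: [("CO2", 20_000.0), ("CH4", 15.0)],   # direct combustion, fugitives
    2: [("CO2", 85_000.0)],                  # purchased electricity share
    3: [("CO2", 8_000.0), ("N2O", 2.0)],     # outsourced/indirect activities
}

total = 0.0
for scope, entries in sorted(inventory.items()):
    scope_co2e = sum(GWP[gas] * mt for gas, mt in entries)
    total += scope_co2e
    print(f"Scope {scope}: {scope_co2e:,.0f} MT CO2e")
print(f"Total: {total:,.0f} MT CO2e")
```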

    9. Computing Information

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      From here you can find information relating to: obtaining the right computer accounts; using NIC terminals; using BooNE's computing resources, including choosing your desktop, Kerberos, AFS, printing, and recommended applications for various common tasks; running CPU- or IO-intensive programs (batch jobs); commonly encountered problems; computing support within BooNE; bringing a computer to FNAL, or purchasing a new one; laptops; and the Computer Security Program Plan for MiniBooNE.

    10. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

      SciTech Connect (OSTI)

      Singleton, Kristin M.

      2015-01-07

      The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical results from 13 shallow zone (0 to 15 ft below ground surface) sampling locations were used to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum contaminant levels was used to determine vadose zone
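      A sketch of the screening step described above, retaining an analyte only when it exceeds both its background concentration and a screening level; the analytes and all values are hypothetical.

```python
# Screen sampled concentrations against background and a screening level;
# an analyte is retained as a contaminant of potential concern (COPC) only
# if it exceeds both. Analytes and values are hypothetical.
background = {"chromium": 18.5, "nitrate": 12.0, "tc-99": 0.83}
screening_level = {"chromium": 42.0, "nitrate": 45.0, "tc-99": 1.0}
measured = {"chromium": 55.0, "nitrate": 9.5, "tc-99": 0.9}

for analyte, value in measured.items():
    exceeds = value > background[analyte] and value > screening_level[analyte]
    status = "retain as COPC" if exceeds else "screen out"
    print(f"{analyte:8s} {value:6.1f} -> {status}")
```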

    11. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters, while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    12. Ultra-high pressure water jet: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky™ pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure. These were dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems.

    13. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten-carbide-tipped bits. The bits are designed to remove concrete in 3/8-inch increments, and are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended: because the testing demonstration took place outdoors, the results may be inaccurate, and it is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    14. LTC vacuum blasting machine (metal) baseline report: Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    15. LTC vacuum blasting machine (concrete): Baseline report: Greenbook (Chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the testing demonstration took place outdoors, which may cause the results to be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    16. ATIC as a testbed for the ACCESS baseline calorimeter

      SciTech Connect (OSTI)

      Isbert, J.; Authement, J.; Coleman, J.; Guzik, T. G.; Granger, D.; Lockwood, R.; McMorris, A.; Mock, L.; Oubre, C.; Panasyuk, M.; Peck, J.; Wefel, J. P.; Adams, J. H. Jr.; Boberg, P. R.; Dion-Schwarz, C.; Kroeger, R.; Bashindzhagyan, G. B.; Khein, L.; Samsonov, G. A.; Zatsepin, V. I.

      1999-01-22

      The Advanced Thin Ionization Calorimeter (ATIC) balloon experiment is designed to measure the spectrum of individual elements from H through Fe up to a total energy >10¹⁴ eV. To accomplish this goal, ATIC incorporates a silicon matrix detector composed of more than 4,000 pixels to measure the incident particle charge in the presence of backscatter background, three plastic scintillator hodoscopes to provide an event trigger as well as a backup measurement of the particle charge and trajectory, a 3/4 interaction length carbon target, and a fully active ionization calorimeter composed of 22 radiation lengths of Bismuth Germanate (BGO) crystals. This detector complement is very similar to the baseline calorimeter for the Advanced Cosmic Ray Composition Experiment for the Space Station, ACCESS. The ATIC flights can be used to evaluate such a calorimeter in the cosmic ray 'beam.' ATIC integration is currently underway with a first flight expected during 1999. This talk will discuss ATIC as it applies to ACCESS.

    17. Neutrino Oscillation Parameter Sensitivity in Future Long-Baseline Experiments

      SciTech Connect (OSTI)

      Bass, Matthew

      2014-01-01

      The study of neutrino interactions and propagation has produced evidence for physics beyond the standard model and promises to continue to shed light on rare phenomena. Since the discovery of neutrino oscillations in the late 1990s there have been rapid advances in establishing the three-flavor paradigm of neutrino oscillations. The 2012 discovery of a large value for the last unmeasured mixing angle has opened the way for future experiments to search for charge-parity symmetry violation in the lepton sector. This thesis presents an analysis of the future sensitivity to neutrino oscillations in the three-flavor paradigm for the T2K, NOvA, LBNE, and T2HK experiments. The theory of the three-flavor paradigm is explained, and the methods to use these theoretical predictions to design long-baseline neutrino experiments are described. The sensitivity to the oscillation parameters for each experiment is presented, with a particular focus on the search for CP violation and the measurement of the neutrino mass hierarchy. The variations of these sensitivities with statistical considerations and experimental design optimizations taken into account are explored. The effects of systematic uncertainties in the neutrino flux, interaction, and detection predictions are also considered by incorporating more advanced simulation inputs from the LBNE experiment.
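      As a hedged illustration of the underlying physics, the textbook two-flavor approximation to the oscillation probability is easy to compute; this is not the full three-flavor treatment used in the thesis, and the parameter values below are round numbers of roughly the right scale.

```python
import math

def two_flavor_prob(sin2_2theta, dm2_ev2, baseline_km, energy_gev):
    """Two-flavor oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * baseline_km / energy_gev) ** 2

# Illustrative numbers of roughly LBNE-like scale: 1300 km baseline, 2.5 GeV.
print(two_flavor_prob(sin2_2theta=0.085,   # ~sin^2(2*theta13)
                      dm2_ev2=2.5e-3,      # ~|Delta m^2_31|
                      baseline_km=1300.0,
                      energy_gev=2.5))
```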

    18. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2011-09-23

      This guide identifies key PB elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development. Supersedes DOE G 413.3-5.

    19. computation | National Nuclear Security Administration

      National Nuclear Security Administration (NNSA)

      Groundbreaking Leader of Computation at LLNL Retires: Dona Crawford, Associate Director for Computation at NNSA's Lawrence Livermore National Laboratory (LLNL), announced her retirement last week after 15 years of leading Livermore's Computation Directorate. "Dona has successfully led a multidisciplinary 1000-person team that develops and deploys...

    20. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2014-01-02

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. The overall project area is 2,500 km², with the Calibration Area (Dixie Valley Geothermal Wellfield) being about 170 km². The project was subdivided into five tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area at 0.5 km intervals to identify EGS drilling targets at a scale of 5 km x 5 km; (4) collect new geophysical and geochemical data; and (5) repeat Task 3 for the enhanced (baseline + new) data. Favorability maps were based on the integrated assessment of the three critical EGS exploration parameters of interest: rock type, temperature, and stress. A complementary trust map was generated to complement the favorability maps and graphically illustrate the cumulative confidence in the data used in the favorability mapping. The Final Scientific Report (FSR) is submitted in two parts, with Part I describing the results of project Tasks 1 through 3 and Part II covering the results of project Tasks 4 through 5, plus answering nine questions posed in the proposal for the overall project. FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4
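      A hedged sketch of the layer-combination idea: normalize the rock-type, temperature, and stress grids, combine them with weights into a favorability score, and carry a companion trust (data-confidence) grid. The grids, weights, and confidence values are invented illustrations, not the project's actual data or weighting scheme.

```python
import numpy as np

# Combine normalized rock-type, temperature, and stress grids into an EGS
# favorability score with a companion trust (confidence) grid.
rng = np.random.default_rng(0)
layers = {name: rng.random((4, 4)) for name in ("rock", "temperature", "stress")}
confidence = {name: rng.uniform(0.5, 1.0, (4, 4)) for name in layers}
weights = {"rock": 0.3, "temperature": 0.4, "stress": 0.3}

def normalize(grid):
    return (grid - grid.min()) / (grid.max() - grid.min())

favorability = sum(weights[n] * normalize(g) for n, g in layers.items())
trust = sum(weights[n] * confidence[n] for n in layers)  # weighted confidence

print(favorability.round(2))
print(trust.round(2))
```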

    1. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The TRACC Computational Clusters: With the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster that has now been named Phoenix. Zephyr was acquired from Atipa Technologies, and it is a 92-node system with each node having two AMD

    2. Sampling designs for geochemical baseline studies in the Colorado oil shale region: a manual for practical application

      SciTech Connect (OSTI)

      Klusman, R. W.; Ringrose, C. D.; Candito, R. J.; Zuccaro, B.; Rutherford, D. W.; Dean, W. E.

      1980-06-01

      This manual presents a rationale for sampling designs, and results of geochemical baseline studies in the Colorado portion of the oil-shale region. The program consists of a systematic trace element study of soils, stream sediments, and plants carried out in a way to be conservative of human and financial resources and yield maximum information. Extension of this approach to other parameters, other locations, and to environmental baseline studies in general is a primary objective. A baseline for any geochemical parameter can be defined as the concentration of that parameter in a given medium such as soil, the range of its concentration, and the geographic scale of variability. In air quality studies, and to a lesser extent for plants, the temporal scale of variability must also be considered. In studies of soil, temporal variability does not become a factor until such time that a study is deemed necessary to evaluate whether or not there have been changes in baseline levels as a result of development. The manual is divided into five major parts. The first is a suggested sampling protocol, presented in outline form, for guiding baseline studies in this area. The second section is background information on the physical features of the area of study, trace elements of significance occurring in oil shale, and the sample media used in these studies. The third section is concerned primarily with sampling design and its application to the geochemical studies of the oil shale region. The last sections, in the form of appendices, provide actual data and illustrate in a systematic manner the calculations performed to obtain the various summary data. The last segment of the appendices is a more academic discussion of the geochemistry of trace elements and the parameters of importance influencing their behavior in natural systems.
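      In code, the baseline definition given above reduces to summary statistics computed across sampling areas; a minimal sketch with hypothetical concentrations:

```python
import statistics

# Summarize a geochemical baseline for one element in one medium: the
# central concentration, its range, and between-area (geographic)
# variability. Sample values (ppm) are hypothetical.
samples_by_area = {
    "area_A": [12.1, 14.3, 11.8, 13.0, 12.6],
    "area_B": [18.9, 17.2, 19.4, 18.1, 20.0],
}

all_values = [v for vals in samples_by_area.values() for v in vals]
print(f"baseline mean : {statistics.mean(all_values):.1f} ppm")
print(f"range         : {min(all_values):.1f}-{max(all_values):.1f} ppm")

area_means = {a: statistics.mean(v) for a, v in samples_by_area.items()}
print(f"area means    : {area_means}  (geographic variability)")
```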

    3. Computer Architecture Lab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The goal of the Computer Architecture Laboratory (CAL) is to engage in research and development into energy efficient and effective processor and memory architectures for DOE's Exascale program. CAL coordinates hardware architecture R&D activities across the DOE. CAL is a joint NNSA/SC activity involving Sandia National Laboratories (CAL-Sandia) and

    4. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

      SciTech Connect (OSTI)

      Joseph, Earl C.; Conway, Steve; Dekate, Chirag

      2013-09-30

      This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

    5. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Sciences and Engineering Division (CSED) is ORNL's premier source of basic and applied research in the field of data sciences and knowledge discovery. CSED's science agenda is focused on research and development related to knowledge discovery enabled by the explosive growth in the availability, size, and variability of dynamic and disparate data sources. This science agenda encompasses data sciences as well as advanced modeling and

    6. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Technology Information Technology (IT) at ORNL serves a diverse community of stakeholders and interests. From everyday operations like email and telecommunications to institutional cluster computing and high bandwidth networking, IT at ORNL is responsible for planning and executing a coordinated strategy that ensures cost-effective, state-of-the-art computing capabilities for research and development. ORNL IT delivers leading-edge products to users in a risk-managed portfolio of

    7. Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      All JLF participants must fully comply with all LLNL computer security regulations and procedures. A laptop entering or leaving B-174 for the sole use by a US citizen and so configured, and requiring no IP address, need not be registered for use in the JLF. By September 2009, it is expected that computers for use by Foreign National Investigators will have no special provisions. Notify maricle1@llnl.gov of all other computers entering, leaving, or being moved

    8. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Node Configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz Opteron processor per node; 4 cores per node (38,288 total cores); 8 GB DDR3 800 MHz memory per node. Peak Gflop rate: 9.2 Gflops/core, 36.8 Gflops/node, and 352 Tflops for the entire machine. Each core has its own L1 and L2 caches (64 KB and 512 KB, respectively); a 2 MB L3 cache is shared among the 4 cores. Compute Node Software: by default the compute nodes run a restricted low-overhead
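
      The peak-rate figures quoted above are internally consistent, as the following few lines of Python verify; the numbers are taken directly from the configuration listed.

        nodes = 9572
        cores_per_node = 4
        gflops_per_core = 9.2

        print(nodes * cores_per_node)                          # 38288 total cores
        print(cores_per_node * gflops_per_core)                # 36.8 Gflops/node
        print(nodes * cores_per_node * gflops_per_core / 1e3)  # ~352 Tflops machine-wide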

    9. COMPARISON OF THREE METHODS TO PROJECT FUTURE BASELINE CARBON EMISSIONS IN TEMPERATE RAINFOREST, CURINANCO, CHILE

      SciTech Connect (OSTI)

      Patrick Gonzalez; Antonio Lara; Jorge Gayoso; Eduardo Neira; Patricio Romero; Leonardo Sotomayor

      2005-07-14

      Deforestation of temperate rainforests in Chile has decreased the provision of ecosystem services, including watershed protection, biodiversity conservation, and carbon sequestration. Forest conservation can restore those ecosystem services. Greenhouse gas policies that offer financing for the carbon emissions avoided by preventing deforestation require a projection of future baseline carbon emissions for an area if no forest conservation occurs. For a proposed 570 km² conservation area in temperate rainforest around the rural community of Curinanco, Chile, we compared three methods to project future baseline carbon emissions: extrapolation from Landsat observations, Geomod, and Forest Restoration Carbon Analysis (FRCA). Analyses of forest inventory and Landsat remote sensing data show 1986-1999 net deforestation of 1900 ha in the analysis area, proceeding at a rate of 0.0003 y⁻¹. The gross rate of loss of closed natural forest was 0.042 y⁻¹. In the period 1986-1999, closed natural forest decreased from 20,000 ha to 11,000 ha, with timber companies clearing natural forest to establish plantations of non-native species. Analyses of previous field measurements of species-specific forest biomass, tree allometry, and the carbon content of vegetation show that the dominant native forest type, broadleaf evergreen (bosque siempreverde), contains 370 ± 170 t ha⁻¹ carbon, compared to the carbon density of non-native Pinus radiata plantations of 240 ± 60 t ha⁻¹. The 1986-1999 conversion of closed broadleaf evergreen forest to open broadleaf evergreen forest, Pinus radiata plantations, shrublands, grasslands, urban areas, and bare ground decreased the carbon density from 370 ± 170 t ha⁻¹ carbon to an average of 100 t ha⁻¹ (maximum 160 t ha⁻¹, minimum 50 t ha⁻¹). Consequently, the conversion released 1.1 million t carbon. These analyses of forest inventory and Landsat remote sensing data provided the data to
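
      The emissions bookkeeping implied by the abstract is area times the drop in carbon density. The Python sketch below uses the densities quoted above, but the converted area is an invented placeholder (chosen so the result lands near the reported 1.1 million t); it is not the study's land-cover accounting.

        carbon_before = 370.0  # t C/ha, closed broadleaf evergreen (from the abstract)
        carbon_after = 100.0   # t C/ha, average over post-conversion cover classes

        area_converted_ha = 4100.0  # hypothetical converted area, not the study's figure
        released_t = area_converted_ha * (carbon_before - carbon_after)
        print(f"{released_t / 1e6:.2f} million t carbon released")  # ~1.11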

    10. Baseline scheme for polarization preservation and control in the MEIC ion complex

      SciTech Connect (OSTI)

      Derbenev, Yaroslav S.; Lin, Fanglei; Morozov, Vasiliy; Zhang, Yuhong; Kondratenko, Anatoliy; Kondratenko, M A; Filatov, Yury

      2015-09-01

      The scheme for preservation and control of the ion polarization in the Medium-energy Electron-Ion Collider (MEIC) has been under active development in recent years. The figure-8 configuration of the ion rings provides a unique capability to control the polarization of any ion species including deuterons by means of "weak" solenoids rotating the particle spins by small angles. Insertion of "weak" solenoids into the magnetic lattices of the booster and collider rings solves the problem of polarization preservation during acceleration of the ion beam. Universal 3D spin rotators designed on the basis of "weak" solenoids allow one to obtain any polarization orientation at an interaction point of MEIC. This paper presents the baseline scheme for polarization preservation and control in the MEIC ion complex.
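
      As a rough guide to why "weak" solenoids suffice, the standard spin-rotation relation for a longitudinal field can be used; this is textbook accelerator physics stated from general knowledge, not a formula quoted from the paper:

        \[
          \varphi_{\mathrm{spin}} \;=\; (1+G)\,\frac{\int B\,\mathrm{d}l}{B\rho},
        \]

      where G is the particle's gyromagnetic anomaly and Bρ its magnetic rigidity. Because the deuteron's G is small (about -0.14) and collider rigidities are large, a solenoid produces only a small spin rotation per pass, which is exactly the "weak" control handle the scheme exploits.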

    11. Quality Assurance Baseline Assessment Report to Los Alamos National Laboratory Analytical Chemistry Operations

      SciTech Connect (OSTI)

      Jordan, R. A.

      1998-09-01

      This report summarizes observations that were made during a Quality Assurance (QA) Baseline Assessment of the Nuclear Materials Technology Analytical Chemistry Group (NMT-1). The Quality and Planning personnel for NMT-1 are spending a significant amount of time transitioning out of their roles of environmental oversight into production oversight. A team from the Idaho National Engineering and Environmental Laboratory Defense Program Environmental Surety Program performed an assessment of the current status of the QA Program. Several Los Alamos National Laboratory Analytical Chemistry procedures were reviewed, as well as Transuranic Waste Characterization Program (TWCP) QA documents. Checklists were developed and the assessment was performed according to an Implementation Work Plan, INEEL/EXT-98-00740.

    12. NREL Releases Updated Baseline of Cost and Performance Data for Electricity

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Webinar to be held on September 13. September 1, 2016. [Figure: from NREL's 2016 Annual Technology Baseline, the projected Capital Expenditure (CAPEX) for electricity generating technologies in 2030.] The Energy Department's National Renewable Energy Laboratory (NREL) has released the 2016 Annual Technology

    13. 2008 CHP Baseline Assessment and Action Plan for the California Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This 2008 report provides an updated baseline assessment and action plan for combined heat and power (CHP) in California and identifies hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC). chp_california_2008.pdf (1.41 MB)

    14. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process

      Broader source: Energy.gov [DOE]

      This EVMS Training Snippet sponsored by the Office of Project Management (PM) covers the Integrated Baseline Review (IBR) process. 

    15. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Environmental Management (EM)

      EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations This EVMS Training Snippet, sponsored by the Office of Project ...

    16. U.S. Department of Energy Performance Baseline Guide - DOE Directives...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      5A, U.S. Department of Energy Performance Baseline Guide. Functional areas: Program Management, Project Management, Work Processes. This guide identifies key PB...

    17. EVMS Training Snippet: 4.6 Baseline Control Methods | Department of Energy

      Office of Environmental Management (EM)

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), discusses baseline revisions and the different baseline control vehicles used in DOE.

    18. Microsoft PowerPoint - Snippet 3.1A IMS Initial Baseline Review...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      and risk. This snippet is recommended whenever a schedule baseline is created or revised. The Contract is the prevailing document regarding what Earned Value Management ...

    19. Software and High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Providing world-class high performance computing capability that enables unsurpassed solutions to complex problems of strategic national interest. Computational physics, computer science, applied mathematics, statistics and the

    20. Kenya-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    1. Vietnam-Danish Government Baseline Workstream | Open Energy Informatio...

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    2. Thailand-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    3. Cost and Performance Comparison Baseline for Fossil Energy Plants...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      energy security. A broad portfolio of technologies is being developed within the Clean Coal Program to accomplish this objective. Ever increasing technological enhancements...

    4. Development of an Extensible Computational Framework for Centralized Storage and Distributed Curation and Analysis of Genomic Data and Genome-scale Metabolic Models

      SciTech Connect (OSTI)

      Stevens, Rick

      2010-08-01

      The DOE-funded KBase project of the Stevens group at the University of Chicago was focused on four high-level goals: (i) improve extensibility, accessibility, and scalability of the SEED framework for genome annotation, curation, and analysis; (ii) extend the SEED infrastructure to support transcription regulatory network reconstructions (2.1), metabolic model reconstruction and analysis (2.2), assertions linked to data (2.3), eukaryotic annotation (2.4), and growth phenotype prediction (2.5); (iii) develop a web-API for programmatic remote access to SEED data and services; and (iv) application of all tools to bioenergy-related genomes and organisms. In response to these goals, we enhanced and improved the ModelSEED resource within the SEED to enable new modeling analyses, including improved model reconstruction and phenotype simulation. We also constructed a new website and web-API for the ModelSEED. Further, we constructed a comprehensive web-API for the SEED as a whole. We also made significant strides in building infrastructure in the SEED to support the reconstruction of transcriptional regulatory networks by developing a pipeline to identify sets of consistently expressed genes based on gene expression data. We applied this pipeline to 29 organisms, computing regulons which were subsequently stored in the SEED database and made available on the SEED website (http://pubseed.theseed.org). We developed a new pipeline and database for the use of kmers, or short 8-residue oligomer sequences, to annotate genomes at high speed. Finally, we developed the PlantSEED, or a new pipeline for annotating primary metabolism in plant genomes. All of the work performed within this project formed the early building blocks for the current DOE Knowledgebase system, and the kmer annotation pipeline, plant annotation pipeline, and modeling tools are all still in use in KBase today.
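
      The kmer idea is easy to sketch: signature oligopeptides map directly to functional roles, so annotation reduces to dictionary lookups over a sliding window. The Python below is an illustrative stand-in; the signature table and role names are invented, not the SEED's actual kmer database.

        K = 8  # the pipeline above uses 8-residue oligomers

        signatures = {"MKTAYIAK": "hypothetical role A"}  # made-up kmer -> function table

        def annotate(protein_seq):
            """Tally signature hits over every K-residue window of a protein."""
            hits = {}
            for i in range(len(protein_seq) - K + 1):
                role = signatures.get(protein_seq[i:i + K])
                if role is not None:
                    hits[role] = hits.get(role, 0) + 1
            return hits

        print(annotate("MKTAYIAKQRQISFVKSHFSRQLEERLG"))  # {'hypothetical role A': 1}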

    5. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Moving forward into the exascale era, NERSC users will place increased demands on NERSC computational facilities. Users will be facing increased complexity in the memory subsystem and node architecture. System designs and programming models will have to evolve to face these new challenges. NERSC staff are active in current initiatives addressing

    6. Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cite Seer Department of Energy provided open access science research citations in chemistry, physics, materials, engineering, and computer science IEEE Xplore Full text...

    7. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      low-overhead operating system optimized for high performance computing called "Cray Linux Environment" (CLE). This OS supports only a limited number of system calls and UNIX...

    8. Computational Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Advanced Materials Laboratory Center for Integrated Nanotechnologies Combustion Research Facility Computational Science Research Institute Joint BioEnergy Institute About EC News ...

    9. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2006-11-01

      Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

    10. An introduction to computer viruses

      SciTech Connect (OSTI)

      Brown, D.R.

      1992-03-01

      This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

    11. Computing and Computational Sciences Directorate - Contacts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Jeff Nichols, Associate Laboratory Director, Computing and Computational Sciences; Becky Verastegui, Directorate Operations Manager, Computing and...

    12. Computational procedures for determining parameters in Ramberg...

      Office of Scientific and Technical Information (OSTI)

      A computer code, RAMBO, is developed for obtaining the values of parameters in the ... DAMPING; HYSTERESIS; SHEAR; STRAINS; COMPUTER CODES; MECHANICAL PROPERTIES; TENSILE ...

    13. Computing and Computational Sciences Directorate - Divisions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCSD Divisions: Computational Sciences and Engineering, Computer Sciences and Mathematics, Information Technology Services, Joint Institute for Computational Sciences, National Center for Computational Sciences

    14. System maintenance test plan for the TWRS controlled baseline database system

      SciTech Connect (OSTI)

      Spencer, S.G.

      1998-09-23

      TWRS [Tank Waste Remediation System] Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCR/PRs are implemented.

    15. Vehicle Technologies Office Merit Review 2015: Computational Design and Development of a New, Lightweight Cast Alloy for Advanced Cylinder Heads in High-Efficiency, Light-Duty Engines

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    16. Lung Texture in Serial Thoracic Computed Tomography Scans: Correlation of Radiomics-based Features With Radiation Therapy Dose and Radiation Pneumonitis Development

      SciTech Connect (OSTI)

      Cunliffe, Alexandra; Armato, Samuel G.; Castillo, Richard; Pham, Ngoc; Guerrero, Thomas; Al-Hallaq, Hania A.

      2015-04-01

      Purpose: To assess the relationship between radiation dose and change in a set of mathematical intensity- and texture-based features and to determine the ability of texture analysis to identify patients who develop radiation pneumonitis (RP). Methods and Materials: A total of 106 patients who received radiation therapy (RT) for esophageal cancer were retrospectively identified under institutional review board approval. For each patient, diagnostic computed tomography (CT) scans were acquired before (0-168 days) and after (5-120 days) RT, and a treatment planning CT scan with an associated dose map was obtained. 32- × 32-pixel regions of interest (ROIs) were randomly identified in the lungs of each pre-RT scan. ROIs were subsequently mapped to the post-RT scan and the planning scan dose map by using deformable image registration. The changes in 20 feature values (ΔFV) between pre- and post-RT scan ROIs were calculated. Regression modeling and analysis of variance were used to test the relationships between ΔFV, mean ROI dose, and development of grade ≥2 RP. Area under the receiver operating characteristic curve (AUC) was calculated to determine each feature's ability to distinguish between patients with and those without RP. A classifier was constructed to determine whether 2- or 3-feature combinations could improve RP distinction. Results: For all 20 features, a significant ΔFV was observed with increasing radiation dose. Twelve features changed significantly for patients with RP. Individual texture features could discriminate between patients with and those without RP with moderate performance (AUCs from 0.49 to 0.78). Using multiple features in a classifier, AUC increased significantly (0.59-0.84). Conclusions: A relationship between dose and change in a set of image-based features was observed. For 12 features, ΔFV was significantly related to RP development. This study demonstrated the ability of radiomics to provide a quantitative, individualized
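
      The discrimination test described above can be reproduced in miniature: score how well the change in one feature separates patients with and without RP. The feature deltas and labels in this Python sketch are fabricated for illustration; only the AUC mechanics mirror the study.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        delta_fv = np.array([0.8, 1.4, 0.2, 2.1, 0.1, 1.9])  # invented per-patient feature changes
        rp = np.array([0, 1, 0, 1, 0, 1])                    # 1 = developed grade >= 2 RP

        # Single-feature discrimination, as in the reported per-feature AUCs
        print("AUC:", roc_auc_score(rp, delta_fv))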

    17. Baseline blood levels of manganese, lead, cadmium, copper, and zinc in residents of Beijing suburb

      SciTech Connect (OSTI)

      Zhang, Long-Lian; Lu, Ling; Pan, Ya-Juan; Ding, Chun-Guang; Xu, Da-Yong; Huang, Chuan-Feng; Pan, Xing-Fu; Zheng, Wei

      2015-07-15

      Baseline blood concentrations of metals are important references for monitoring metal exposure in environmental and occupational settings. The purpose of this study was to determine the blood levels of manganese (Mn), copper (Cu), zinc (Zn), lead (Pb), and cadmium (Cd) among the residents (aged 12–60 years old) living in the suburb southwest of Beijing in China and to compare the outcomes with reported values in various developed countries. Blood samples were collected from 648 subjects from March 2009 to February 2010. Metal concentrations in the whole blood were determined by ICP-MS. The geometric means of blood levels of Mn, Cu, Zn, Pb and Cd were 11.4, 802.4, 4665, 42.6, and 0.68 µg/L, respectively. Male subjects had higher blood Pb than the females, while the females had higher blood Mn and Cu than the males. There was no gender difference for blood Cd and Zn. Smokers had higher blood Cu, Zn, and Cd than nonsmokers. There were significant age-related differences in blood levels of all metals studied; subjects in the 17–30 age group had higher blood levels of Mn, Pb, Cu, and Zn, while those in the 46–60 age group had higher Cd than the other age groups. A remarkably lower blood level of Cu and Zn in this population as compared with residents of other developed countries was noticed. Based on the current study, the normal reference ranges for the blood Mn were estimated to be 5.80–25.2 μg/L; for blood Cu, 541–1475 μg/L; for blood Zn, 2349–9492 μg/L; for blood Pb, <100 μg/L; and for blood Cd, <5.30 μg/L in the general population living in Beijing suburbs. - Highlights: • Baseline blood levels of metals in residents of Beijing suburb are investigated. • BMn and BPb in this cohort are higher than those in other developed countries. • Remarkably lower blood levels of Cu and Zn in this Chinese cohort are noticed. • The reference values for blood levels of Mn, Cu, Zn, Pb, and Cd are established.
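
      One common convention behind reference ranges like those above is to summarize roughly lognormal blood levels with a geometric mean and take the central 95% on the log scale as the interval. The Python sketch below illustrates that convention with invented measurements; it is not the study's exact estimator.

        import numpy as np

        blood_mn = np.array([9.5, 12.1, 11.0, 14.2, 10.3, 13.6])  # hypothetical Mn, ug/L

        logs = np.log(blood_mn)
        gm = np.exp(logs.mean())
        lo = np.exp(logs.mean() - 1.96 * logs.std(ddof=1))
        hi = np.exp(logs.mean() + 1.96 * logs.std(ddof=1))
        print(f"geometric mean {gm:.1f} ug/L, reference range {lo:.1f}-{hi:.1f} ug/L")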

    18. Hanford Site baseline risk assessment methodology. Revision 2

      SciTech Connect (OSTI)

      Not Available

      1993-03-01

      This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

    19. Mixed waste focus area technical baseline report. Volume 2

      SciTech Connect (OSTI)

      1997-04-01

      As part of its overall program, the MWFA uses a national mixed waste data set to develop approaches for treating mixed waste that cannot be treated using existing capabilities at DOE or commercial facilities. The current data set was originally compiled under the auspices of the 1995 Mixed Waste Inventory Report. The data set has been updated over the past two years based on Site Treatment Plan revisions and clarifications provided by individual sites. The current data set is maintained by the MWFA staff and is known as MWFA97. In 1996, the MWFA developed waste groupings, process flow diagrams, and treatment train diagrams to systematically model the treatment of all mixed waste in the DOE complex. The purpose of the modeling process was to identify treatment gaps and corresponding technology development needs for the DOE complex. Each diagram provides the general steps needed to treat a specific type of waste. The MWFA categorized each MWFA97 waste stream by waste group, treatment train, and process flow. Appendices B through F provide the complete listing of waste streams by waste group, treatment train, and process flow. The MWFA97 waste stream information provided in the appendices is defined in Table A-1.

    20. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Quad-core AMD Opteron processor Compute Node Configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB...

    1. Comprehensive Baseline Environmental Audit of the Inhalation Toxicology Research Institute, Albuquerque, New Mexico

      SciTech Connect (OSTI)

      Not Available

      1993-06-01

      This report documents the results of the Comprehensive Baseline Environmental Audit conducted at the Inhalation Toxicology Research Institute (ITRI) in Albuquerque, New Mexico. The scope of the audit at the ITRI was comprehensive, addressing environmental activities in the technical areas of air; soils, sediments, and biota; surface water/drinking water; groundwater; waste management; toxic and chemical materials; quality assurance; radiation; inactive waste sites; environmental management; and environmental monitoring programs. Specifically assessed was the compliance of ITRI operations and activities with Federal, state, and local regulations; DOE Orders; internal operating standards; and best management practices. Onsite activities included inspection of ITRI facilities and operations; review of site documents; interviews with DOE and contractor personnel, as well as representatives from state regulatory agencies; and reviews of previous appraisals. Using these sources of information, the environmental audit team developed findings, which fell into two general categories: compliance findings and best management practice findings. Each finding also identifies apparent causal factor(s) that contributed to the finding and will assist line management in developing "root causes" for implementing corrective actions.

    2. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CoDEx Project: A Hardware/Software Codesign Environment for the Exascale Era. The next decade will see a rapid evolution of HPC node architectures as power and cooling constraints are limiting increases in microprocessor clock speeds and constraining data movement. Applications and algorithms will need to change and adapt as node architectures evolve. A key element of the strategy as we move forward is the co-design of applications, architectures and programming

    3. LHC Computing

      SciTech Connect (OSTI)

      Lincoln, Don

      2015-07-28

      The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

    4. GTA (ground test accelerator) Phase 1: Baseline design report

      SciTech Connect (OSTI)

      Not Available

      1986-08-01

      The national Neutral Particle Beam (NPB) program has two objectives: to provide the necessary basis for a discriminator/weapon decision by 1992, and to develop the technology in stages that lead ultimately to a neutral particle beam weapon. The ground test accelerator (GTA) is the test bed that permits the advancement of the state-of-the-art under experimental conditions in an integrated automated system mode. An intermediate goal of the GTA program is to support the Integrated Space Experiments, while the ultimate goal is to support the 1992 decision. The GTA system and each of its major subsystems are described, and project schedules and resource requirements are provided. (LEW)

    5. Bioinformatics Computing Consultant Position Available

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      You can read more about the positions and apply at jobs.lbl.gov: Bioinformatics High Performance Computing Consultant (job number: 73194) and Software Developer for High...

    6. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Creates next-generation leaders in HPC research and applications development.

    7. Insertion Devices for NSLS-II Baseline and Future

      SciTech Connect (OSTI)

      Tanabe,T.

      2008-06-23

      NSLS-II is going to employ Damping Wigglers (DWs) not only for emittance reduction but also as a broad band hard X-ray source. In-Vacuum Undulators (IVUs) with the minimum RMS phase error (< 2 degrees) and possible cryo-capability are planned as the X-ray planar device. Elliptically Polarized Undulators (EPUs) are envisioned for polarization control. Due to the lack of hard X-ray flux from the weak dipole magnet field (0.4 Tesla), three-pole wigglers (3PWs) with a peak field over 1 Tesla will mainly be used by NSLS bending magnet beam line users. Magnetic designs and kick maps for dynamic aperture surveys were created using the latest version of Radia [1] for Mathematica 6, whose development we supported. There are other devices planned for the later stage of the project, such as a quasi-periodic EPU, a superconducting wiggler/undulator, and a Cryo-Permanent Magnet Undulator (CPMU) with Praseodymium Iron Boron (PrFeB) magnets and textured Dysprosium poles. For R&D, hybrid PrFeB arrays were planned to be assembled and field-measured at room temperature and at liquid nitrogen and liquid helium temperatures using our vertical test facility. We have also developed a specialized power supply for pulsed wire measurement.

    8. Baseline System Costs for 50.0 MW Enhanced Geothermal System--A Function of: Working Fluid, Technology, and Location, Location, Location

      Broader source: Energy.gov [DOE]

      Project objectives: Develop a baseline cost model of a 50.0 MW Enhanced Geothermal System, including all aspects of the project, from finding the resource through to operation, for a particularly challenging scenario: the deep, radioactively decaying granitic rock of the Pioneer Valley in Western Massachusetts.

    9. Convergence: Computing and communications

      SciTech Connect (OSTI)

      Catlett, C.

      1996-12-31

      This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in the computing capacity, personal computer performance, and Internet and WorldWide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

    10. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      We develop and apply a range of high-performance computational methods and software tools to Earth science projects in support of environmental health, cleaner energy, and national security. [Photo: Hari Viswanathan inspects a microfluidic cell used to study the extraction of hydrocarbon fuels from a complex fracture network.]

    11. High Hydrogen Concentrations Detected In The Underground Vaults For RH-TRU Waste At INEEL Compared With Calculated Values Using The INEEL-Developed Computer Code

      SciTech Connect (OSTI)

      Rajiv Bhatt; Soli Khericha

      2005-02-01

      About 700 remote-handled transuranic (RH-TRU) waste drums are stored in about 144 underground vaults at the Intermediate-Level Transuranic Storage Facility at the Idaho National Environmental and Engineering Laboratory's (INEEL's) Radioactive Waste Management Complex (RWMC). These drums were shipped to the INEEL from 1976 through 1996. During recent monitoring, concentrations of hydrogen were found to be in excess of lower explosive limits. The hydrogen concentration in one vault was detected to be as high as 18% (by volume). This condition required evaluation of the safety basis for the facility. The INEEL has developed a computer program to estimate the hydrogen gas generation as a function of time and diffusion through a series of layers (volumes), with a maximum of five layers plus a sink/environment. The program solves the first-order diffusion equations as a function of time. The current version of the code is more flexible in terms of user input. The program allows the user to estimate hydrogen concentrations in the different layers of a configuration and then change the configuration after a given time, e.g., installation of a filter on an unvented drum or placement of a drum in a vault or in a shipping cask. The code has been used to predict vault concentrations and to identify potential problems during retrieval and aboveground storage. The code has generally predicted higher hydrogen concentrations than the measured values, particularly for drums older than 20 years, which could be due to uncertainty and conservative assumptions in drum age, heat generation rate, hydrogen generation rate, Geff, and diffusion rates through the layers.
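
      A minimal sketch of the layered model described above, under stated assumptions: constant hydrogen generation in the drum and first-order transfer between adjacent volumes, integrated in time with SciPy. The rate constants and volumes are invented, and this is not the INEEL code itself.

        import numpy as np
        from scipy.integrate import solve_ivp

        gen = 1.0e-6          # mol H2/s generated inside the drum (hypothetical)
        k = [2e-6, 1e-6]      # transfer conductances between layers, m^3/s (hypothetical)
        V = [0.2, 1.0, 50.0]  # volumes: drum, vault headspace, environment sink (m^3)

        def rhs(t, n):
            """First-order exchange driven by concentration differences."""
            c = n / np.array(V)
            f01 = k[0] * (c[0] - c[1])  # drum -> vault
            f12 = k[1] * (c[1] - c[2])  # vault -> environment
            return [gen - f01, f01 - f12, f12]

        sol = solve_ivp(rhs, (0.0, 3.15e8), [0.0, 0.0, 0.0])  # ~10 years
        print("vault H2 concentration (mol/m^3):", sol.y[1, -1] / V[1])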

    12. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Hans Gougar

      2014-05-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650 °C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 °C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 °C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the

    13. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Gougar, Hans D.

      2014-10-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650 °C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 °C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 °C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the

    14. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Hans Gougar

      2014-05-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650 °C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 °C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 °C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV

    15. BASELINE MEMBRANE SELECTION AND CHARACTERIZATION FOR AN SDE

      SciTech Connect (OSTI)

      Colon-Mercado, H; David Hobbs, D

      2007-04-03

      Thermochemical processes are being developed to provide global-scale quantities of hydrogen. A variant on sulfur-based thermochemical cycles is the Hybrid Sulfur (HyS) Process, which uses a sulfur dioxide depolarized electrolyzer (SDE) to produce the hydrogen. In FY05 and FY06, testing at the Savannah River National Laboratory (SRNL) explored a low temperature fuel cell design concept for the SDE. The advantages of this design concept include high electrochemical efficiency and small footprint that are crucial for successful implementation on a commercial scale. A key component of the SDE is the ion conductive membrane through which protons produced at the anode migrate to the cathode and react to produce hydrogen. An ideal membrane for the SDE should have both low ionic resistivity and low sulfur dioxide transport. These features allow the electrolyzer to perform at high currents with low potentials, along with preventing contamination of the hydrogen output and poisoning of the catalysts involved. Another key component is the electrocatalyst material used for the anode and cathode. Good electrocatalysts should be chemically stable and have a low overpotential for the desired electrochemical reactions. This report summarizes results from activities to evaluate commercial and experimental membranes for the SDE. Several different types of commercially-available membranes were analyzed for sulfur dioxide transport as a function of acid strength, including perfluorinated sulfonic acid (PFSA), sulfonated poly-etherketone-ketone, and poly-benzimidazole (PBI) membranes. Experimental membranes from the sulfonated Diels-Alder polyphenylenes (SDAPP) and modified Nafion® 117 were evaluated for SO2 transport as well. These membranes exhibited reduced SO2 transport coefficients without loss of ionic conductivity. The use of Nafion® with EW 1100 is recommended for the present SDE testing due to the limited data regarding chemical

    16. An evaluation of baseline conditions at lease tract C-a, Rio Blanco County, Colorado

      SciTech Connect (OSTI)

      Barteaux, W.L.; Biezugbe, G.

      1987-09-01

      An analysis was made of baseline groundwater quality data from oil shale lease tract C-a, managed by Rio Blanco Oil Shale Company. The data are limited in several respects. All conclusions drawn from the data must be qualified with these limitations. Baseline conditions were determined by analyzing data from wells in the upper bedrock and lower bedrock aquifers and from the alluvial wells. All data collected before mining operations began were considered baseline data. The water quality was then evaluated using the 1987 Colorado State Basic Standards for Ground Water as a basis. The maximum baseline values for several parameters in each aquifer exceed the standard values. The water quality of the upper and lower bedrock aquifers varies from region to region within the site. Data on the lower bedrock aquifer are insufficient for speculation on the cause of the variations. Variations in the upper bedrock aquifer are possibly caused by leakage of the lower bedrock aquifer. 16 refs., 9 figs., 9 tabs.
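
      The exceedance screening performed in the evaluation reduces to comparing baseline maxima against the standard values. Below is a tiny Python sketch with invented numbers; the actual standards and maxima are in the report's tables.

        standards = {"TDS": 400.0, "F": 2.0, "SO4": 250.0}      # hypothetical mg/L limits
        baseline_max = {"TDS": 1250.0, "F": 4.1, "SO4": 180.0}  # hypothetical observed maxima

        for param, limit in standards.items():
            if baseline_max[param] > limit:
                print(f"{param}: baseline maximum {baseline_max[param]} mg/L exceeds standard {limit} mg/L")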

    17. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy product included in the tool, with each option differentiated based on specific detail level of process or plant, i.e., 1) plant level; 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon the comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free downloads from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy from the LBNL website. It is expected that the use of BEST-Dairy tool will advance understanding of energy and water
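
      The benchmarking arithmetic at the heart of a tool like BEST-Dairy can be illustrated simply: compare a plant's energy intensity against a best-practice reference and report the implied savings. All numbers in this Python sketch are invented placeholders, not the tool's internal reference cases.

        production_t = 12_000       # hypothetical annual fluid-milk output, tonnes
        plant_energy_gj = 9_600     # hypothetical annual site energy use, GJ
        reference_gj_per_t = 0.55   # hypothetical best-practice intensity, GJ/tonne

        intensity = plant_energy_gj / production_t
        savings_gj = max(0.0, (intensity - reference_gj_per_t) * production_t)
        print(f"intensity {intensity:.2f} GJ/t; potential savings {savings_gj:.0f} GJ/yr")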

    18. The Science and Strategy for Phasing of the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Diwan, Milind V.

      2012-05-22

      This note is about the principles behind a phased plan for realizing a Long-Baseline Neutrino Experiment (LBNE) in the U.S. The most important issue that must be resolved is the direction of the first phase of the experiment. Based on both scientific and programmatic considerations, the U.S. should pursue the best option for accelerator neutrino physics, which is the longer baseline towards Homestake with an optimized broadband intense beam.

    19. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      This page is the repository for sundry items of information relevant to general computing on BooNE. If you have a question or problem that isn't answered here, or a suggestion for improving this page or the information on it, please mail boone-computing@fnal.gov and we'll do our best to address any issues. Note about this page: some links on this page point to www.everything2.com, and are meant to give an idea about a concept or thing without necessarily wading through a whole website

    20. Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory)

      SciTech Connect (OSTI)

      Blair, Nate; Cory, Karlynn; Hand, Maureen; Parkhill, Linda; Speer, Bethany; Stehly, Tyler; Feldman, David; Lantz, Eric; Augusting, Chad; Turchi, Craig; O'Connor, Patrick

      2015-07-08

      Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel Workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

    1. Evaluation of final waste forms and recommendations for baseline alternatives to grout and glass

      SciTech Connect (OSTI)

      Bleier, A.

      1997-09-01

      An assessment of final waste forms was made as part of the Federal Facilities Compliance Agreement/Development, Demonstration, Testing, and Evaluation (FFCA/DDT&E) Program because supplemental waste-form technologies are needed for the hazardous, radioactive, and mixed wastes of concern to the Department of Energy and the problematic wastes on the Oak Ridge Reservation. The principal objective was to identify a primary waste-form candidate as an alternative to grout (cement) and glass. The effort principally comprised a literature search, the goal of which was to establish a knowledge base regarding four areas: (1) the waste-form technologies based on grout and glass, (2) candidate alternatives, (3) the wastes that need to be immobilized, and (4) the technical and regulatory constraints on the waste-form technologies. This report serves, in part, to meet this goal. Six families of materials emerged as relevant: inorganic, organic, vitrified, devitrified, ceramic, and metallic matrices. Multiple members of each family were assessed, emphasizing the materials-oriented factors and accounting for the fact that the two most prevalent types of wastes for the FFCA/DDT&E Program are aqueous liquids and inorganic sludges and solids. Presently, no individual matrix is sufficiently developed to permit its immediate implementation as a baseline alternative. Three thermoplastic materials, sulfur-polymer cement (inorganic), bitumen (organic), and polyethylene (organic), are the most technologically developed candidates. Each warrants further study, emphasizing the engineering and economic factors, but each also has limitations that relegate it to the status of a short-term alternative. The crystallinity and flexible processing of sulfur provide sulfur-polymer cement with the highest potential for short-term success via encapsulation. Long-term immobilization demands chemical stabilization, which the thermoplastic matrices do not offer. Among the properties of the remaining

    2. Computational trigonometry

      SciTech Connect (OSTI)

      Gustafson, K.

      1994-12-31

      By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
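
      For context, the standard definitions in Gustafson's antieigenvalue framework (well established in his published work, though not restated in this abstract) are sketched here. The first antieigenvalue of a symmetric positive definite operator $A$ is

      $$\mu_1(A) = \min_{x \neq 0} \frac{\langle Ax, x \rangle}{\|Ax\|\,\|x\|} = \cos\phi(A),$$

      and for extreme eigenvalues $\lambda_1 \le \lambda_n$ one has

      $$\cos\phi(A) = \frac{2\sqrt{\lambda_1\lambda_n}}{\lambda_1 + \lambda_n}, \qquad \sin\phi(A) = \frac{\lambda_n - \lambda_1}{\lambda_n + \lambda_1},$$

      so the classical Kantorovich contraction factor $(\lambda_n - \lambda_1)/(\lambda_n + \lambda_1)$ for steepest descent is exactly $\sin\phi(A)$; this is the trigonometric reading of iterative convergence the abstract refers to.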

    3. Quantum steady computation

      SciTech Connect (OSTI)

      Castagnoli, G.

      1991-08-10

      This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

    4. Evaluation of computer-aided system design tools for SDI (Strategic Defense Initiative) Battle Management/C3 (command, control and communications) architecture development. Final report

      SciTech Connect (OSTI)

      Fife, D.W.; Campbell, K.; Chludzinski, J.; Corcoran, N.; Gonzalez, C.

      1987-10-01

      This IDA paper was prepared at the request of the Strategic Defense Initiative Organization. The paper documents the findings of an evaluation of the capabilities of certain computer software/computer-aided software engineering (CASE) tools to provide computer-aided graphic design of Battle Management/C3 for the SDIO. Each of the five tools, selected as the best available at the time, was installed at IDA. After training by vendor tool staff, an IDA team used hands-on design exercises to determine the merits of the tools for SDI application. A comparative summary of the tools is given relative to envisaged SDI requirements, and an extensive questionnaire is answered for each.

    5. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, ... The DOE Office of Science's Advanced Scientific Computing Research (ASCR) program ...

    6. Theory, Simulation, and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer, Computational, and Statistical Sciences (CCS) Division is an international ... and statistics The deployment and integration of computational technology, ...

    7. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Oak Ridge Climate Change Science Institute Jim Hack Oak Ridge National Laboratory (ORNL) has formed the Oak Ridge Climate Change Science Institute (ORCCSI) that will develop and execute programs for the multi-agency, multi-disciplinary climate change research partnerships at ORNL. Led by Director Jim Hack and Deputy Director Dave Bader, the Institute will integrate scientific projects in modeling, observations, and experimentation with ORNL's powerful computational and informatics capabilities

    8. Computing at JLab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      JLab --- Accelerator Controls CAD CDEV CODA Computer Center High Performance Computing Scientific Computing JLab Computer Silo maintained by webmaster@jlab.org...

    9. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Modular baseline health surveys

      SciTech Connect (OSTI)

      Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Schmidlin, Sandro; Magassouba, Mohamed L.; Knoblauch, Astrid M.; Singer, Burton H.; Utzinger, Juerg

      2012-02-15

      The quantitative assessment of health impacts has been identified as a crucial feature for realising the full potential of health impact assessment (HIA). In settings where demographic and health data are notoriously scarce, but there is a broad range of ascertainable ecological, environmental, epidemiological and socioeconomic information, a diverse toolkit of data collection strategies becomes relevant for the mainly small-area impacts of interest. We present a modular, cross-sectional baseline health survey study design, which has been developed for HIA of industrial development projects in the humid tropics. The modular nature of our toolkit allows our methodology to be readily adapted to the prevailing eco-epidemiological characteristics of a given project setting. Central to our design is a broad set of key performance indicators, covering a multiplicity of health outcomes and determinants at different levels and scales. We present experience and key findings from our modular baseline health survey methodology employed in 14 selected sentinel sites within an iron ore mining project in the Republic of Guinea. We argue that our methodology is a generic example of rapid evidence assembly in difficult-to-reach localities, where improvement of the predictive validity of the assessment and establishment of a benchmark for longitudinal monitoring of project impacts and mitigation efforts is needed.

    10. Baseline Library

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Skip navigation links Marketing Resources Reports, Publications, and Research Agricultural Commercial Consumer Products Industrial Institutional Multi-Sector Residential...

    11. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

      SciTech Connect (OSTI)

      Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

      2013-09-01

      whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
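
      This excerpt does not define the performance metrics themselves, but two goodness-of-fit statistics widely used for whole-building baseline models (for example, in ASHRAE Guideline 14 style M&V work) are the coefficient of variation of the root-mean-square error, CV(RMSE), and the normalized mean bias error, NMBE. A minimal sketch, assuming paired series of metered and model-predicted energy use:

      ```python
      import numpy as np

      def cv_rmse(actual, predicted, n_params=1):
          """Coefficient of variation of the RMSE, in percent of mean load."""
          actual = np.asarray(actual, dtype=float)
          predicted = np.asarray(predicted, dtype=float)
          n = actual.size
          rmse = np.sqrt(np.sum((actual - predicted) ** 2) / (n - n_params))
          return 100.0 * rmse / actual.mean()

      def nmbe(actual, predicted, n_params=1):
          """Normalized mean bias error, in percent; nonzero values signal bias."""
          actual = np.asarray(actual, dtype=float)
          predicted = np.asarray(predicted, dtype=float)
          n = actual.size
          return 100.0 * np.sum(actual - predicted) / ((n - n_params) * actual.mean())
      ```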

    12. RATIO COMPUTER

      DOE Patents [OSTI]

      Post, R.F.

      1958-11-11

      An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals, each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of the input signals, depending upon the relation of the input to the fixed signal in the first-mentioned channel.
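
      One plausible reading of the feedback arrangement described above (offered purely as an interpretation for illustration, not as text from the patent): if the comparison channel servos the shared gain $A$ until its output equals the constant reference $V_{\mathrm{ref}}$, then

      $$A\,V_1 = V_{\mathrm{ref}} \;\Rightarrow\; A = \frac{V_{\mathrm{ref}}}{V_1}, \qquad V_{\mathrm{out}} = A\,V_2 = V_{\mathrm{ref}}\,\frac{V_2}{V_1},$$

      so the second channel's output is proportional to the quotient of the inputs; interchanging which signal is held fixed turns the same arrangement into a multiplier.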

    13. NERSC seeks Computational Systems Group Lead

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      seeks Computational Systems Group Lead NERSC seeks Computational Systems Group Lead January 6, 2011 by Katie Antypas Note: This position is now closed. The Computational Systems Group provides production support and advanced development for the supercomputer systems at NERSC. Manage the Computational Systems Group (CSG) which provides production support and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing Center). These systems, which

    14. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute New Mexico Consortium and Los Alamos National Laboratory HOW TO APPLY Applications will be accepted JANUARY 5 - FEBRUARY 13, 2016 Computing and Information Technology undergraduate students are encouraged to apply. Must be a U.S. citizen. * Submit a current resume; * Official University Transcript (with spring courses posted and/or a copy of spring 2016 schedule) 3.0 GPA minimum; * One Letter of Recommendation from a Faculty Member; and * Letter of

    15. Computing Events

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Events Computing Events Spotlighting the most advanced scientific and technical applications in the world! Featuring exhibits of the latest and greatest technologies from industry, academia and government research organizations; many of these technologies will be seen for the first time in Denver. Supercomputing Conference 13 Denver, Colorado November 17-22, 2013 Spotlighting the most advanced scientific and technical applications in the world, SC13 will bring together the international

    16. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

      SciTech Connect (OSTI)

      Boring, Ronald L.; Joe, Jeffrey C.

      2015-02-01

      For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

    17. Report of the DOE Review Committee on the baseline validation of the Superconducting Super Collider

      SciTech Connect (OSTI)

      Not Available

      1993-08-01

      The Secretary of Energy directed that an independent review of the current cost and schedule baseline for the SSC be conducted. The purpose of this independent review was to validate the current cost and schedule baseline and to ensure that the project status is accurate as currently reported. Through May 1993, approximately $1.5 billion of the baseline cost of $8.249 billion had been expended, with project completion forecasted on the baseline schedule as of September 1999. This report documents the findings of the SSC Baseline Validation Review Committee (the Committee). The report is organized into five parts. The first section is the Executive Summary. This introduction is followed by a discussion of the project progress/status as determined by the Committee. The next section describes the Committee's estimate of the cost at completion for the SSC project, followed by an assessment of the adequacy of the business management systems currently being used to manage the project. The final section presents the Committee's conclusions and recommendations. The main body of the report is followed by the subcommittee reports and appendices.

    18. National Nuclear Security Administration Service Center Environmental Programs Long-Term Environmental Stewardship Baseline Handbook

      SciTech Connect (OSTI)

      Griswold, D. D.; Rohde, K.

      2003-02-26

      As environmental restoration (ER) projects move toward completion, the planning, integration, and documentation of long-term environmental stewardship (LTES) activities is increasingly important for ensuring a smooth transition to LTES. The Long-Term Environmental Stewardship Baseline Handbook (Handbook) prepared by the National Nuclear Security Administration (NNSA) Service Center Environmental Programs Department (EPD) outlines approaches for integrating site-specific LTES planning and implementation into site ER baseline documentation. Since LTES will vary greatly from site to site, the Handbook also provides for flexibility in addressing LTES in ER Project life-cycle baselines, while clearly identifying Environmental Management (EM) requirements. It provides suggestions for enacting LTES principles and objectives through operational activities described in site-specific LTES plans and life-cycle ER Project baseline scope, cost, and schedule documentation, as well as tools for more thorough planning, better quantification, a broader understanding of risk and risk-management factors, and more comprehensive documentation. LTES planning applied to baselines in a phased approach will facilitate seamlessly integrating LTES into site operational activities, thereby minimizing the use of resources.

    19. MHD computations for stellarators

      SciTech Connect (OSTI)

      Johnson, J.L.

      1985-12-01

      Considerable progress has been made in the development of computational techniques for studying the magnetohydrodynamic equilibrium and stability properties of three-dimensional configurations. Several different approaches have evolved to the point where comparison of results determined with different techniques shows good agreement. 55 refs., 7 figs.

    20. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

      SciTech Connect (OSTI)

      University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

      2012-06-13

      Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and data filtration methods as well as an immediate potential for automation of methods to choose building occupied modes.
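
      To make the shed calculation concrete, here is a minimal sketch of one common implementation choice, an hour-by-hour averaging baseline over comparable non-event days; the paper's actual models, weather adjustments, and filtering steps are not reproduced, and all names are illustrative:

      ```python
      import pandas as pd

      def average_day_baseline(load, event_day, lookback_days):
          """Predict event-day load as the hourly mean over comparable prior days.

          load          -- pd.Series of demand (kW) with a DatetimeIndex
          event_day     -- datetime.date of the DR event
          lookback_days -- list of datetime.date objects (non-event days)
          """
          history = load[[d in lookback_days for d in load.index.date]]
          hourly_mean = history.groupby(history.index.hour).mean()
          event_index = load.index[load.index.date == event_day]
          return pd.Series(hourly_mean[event_index.hour].values, index=event_index)

      def shed_kw(load, baseline):
          """Load shed = counterfactual baseline prediction minus observed demand."""
          return baseline - load.loc[baseline.index]
      ```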

    1. Borehole temperatures and a baseline for 20th-century global warming estimates

      SciTech Connect (OSTI)

      Harris, R.N.; Chapman, D.S.

      1997-03-14

      Lack of a 19th-century baseline temperature against which 20th-century warming can be referenced constitutes a deficiency in understanding recent climate change. Combination of borehole temperature profiles, which contain a memory of surface temperature changes in previous centuries, with the meteorological archive of surface air temperatures can provide a 19th-century baseline temperature tied to the current observational record. A test case in Utah, where boreholes are interspersed with meteorological stations belonging to the Historical Climatological Network, yields a noise reduction in estimates of 20th-century warming and a baseline temperature that is 0.6° ± 0.1°C below the 1951 to 1970 mean temperature for the region. 22 refs., 3 figs., 1 tab.

    2. Relative astrometry of compact flaring structures in Sgr A* with polarimetric very long baseline interferometry

      SciTech Connect (OSTI)

      Johnson, Michael D.; Doeleman, Sheperd S.; Fish, Vincent L.; Broderick, Avery E.; Wardle, John F. C.; Marrone, Daniel P.

      2014-10-20

      We demonstrate that polarimetric interferometry can be used to extract precise spatial information about compact polarized flares of Sgr A*. We show that, for a faint dynamical component, a single interferometric baseline suffices to determine both its polarization and projected displacement from the quiescent intensity centroid. A second baseline enables two-dimensional reconstruction of the displacement, and additional baselines can self-calibrate using the flare, enhancing synthesis imaging of the quiescent emission. We apply this technique to simulated 1.3 mm wavelength observations of a 'hot spot' embedded in a radiatively inefficient accretion disk around Sgr A*. Our results indicate that, even with current sensitivities, polarimetric interferometry with the Event Horizon Telescope can achieve ∼5 μas relative astrometry of compact flaring structures near Sgr A* on timescales of minutes.

    3. Development and testing of FIDELE: a computer code for finite-difference solution to harmonic magnetic-dipole excitation of an azimuthally symmetric horizontally and radially layered earth

      SciTech Connect (OSTI)

      Vittitoe, C.N.

      1981-04-01

      The FORTRAN IV computer code FIDELE simulates the high-frequency electrical logging of a well in which induction and receiving coils are mounted in an instrument sonde immersed in a drilling fluid. The fluid invades layers of surrounding rock in an azimuthally symmetric pattern, superimposing radial layering upon the horizontally layered earth. Maxwell's equations are reduced to a second-order elliptic differential equation for the azimuthal electric-field intensity. The equation is solved at each spatial position where the complex dielectric constant, magnetic permeability, and electrical conductivity have been assigned. Receiver response is given as the complex open-circuit voltage on receiver coils. The logging operation is simulated by a succession of such solutions as the sonde traverses the borehole. Test problems verify consistency with available results for simple geometries. The code's main advantage is its treatment of a two-dimensional earth; its chief disadvantage is the large computer time required for typical problems. Possible code improvements are noted. Use of the computer code is outlined, and tests of most code features are presented.
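
      The abstract does not reproduce the governing equation, but for time-harmonic excitation of an azimuthally symmetric medium the standard reduction of Maxwell's equations to the azimuthal electric field $E_\varphi(r,z)$ takes, within each homogeneous region, a form like (a generic sketch; FIDELE's exact sign and time conventions may differ)

      $$\frac{\partial^2 E_\varphi}{\partial r^2} + \frac{1}{r}\frac{\partial E_\varphi}{\partial r} - \frac{E_\varphi}{r^2} + \frac{\partial^2 E_\varphi}{\partial z^2} + k^2 E_\varphi = i\omega\mu J_\varphi, \qquad k^2 = \omega^2\mu\epsilon - i\omega\mu\sigma,$$

      with continuity conditions on the tangential fields imposed across the radial and horizontal layer interfaces.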

    4. Computing and Computational Sciences Directorate - Computer Science and

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematics Division Computer Science and Mathematics Division The Computer Science and Mathematics Division (CSMD) is ORNL's premier source of basic and applied research in high-performance computing, applied mathematics, and intelligent systems. Our mission includes basic research in computational sciences and application of advanced computing systems, computational, mathematical and analysis techniques to the solution of scientific problems of national importance. We seek to work

    5. 2008 CHP Baseline Assessment and Action Plan for the Hawaii Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Department of Energy The purpose of this 2008 report is to provide an updated baseline assessment and action plan for combined heat and power (CHP) in Hawaii and to identify the hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC). chp_hawaii_2008.pdf (563.87 KB)

    6. 2010-06 "Budget Priorities for FY'12 and Baseline Change Proposal with

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Future Budgets at LANL" | Department of Energy 6 "Budget Priorities for FY'12 and Baseline Change Proposal with Future Budgets at LANL" 2010-06 "Budget Priorities for FY'12 and Baseline Change Proposal with Future Budgets at LANL" The intent of this recommendation is to provide LASO with the priorities, which the NNMCAB believes are important to the citizens of Northern New Mexico in the large program to clean up the legacy waste at LANL. Rec 2010-06 - March 31, 2010

    7. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Students Parallel Computing Summer Research Internship Creates next-generation leaders in HPC research and applications development Contacts Program Co-Lead Robert (Bob) Robey Email Program Co-Lead Gabriel Rockefeller Email Program Co-Lead Hai Ah Nam Email Professional Staff Assistant Nickole Aguilar Garcia (505) 665-3048 Email 2016: Students Peter Ahrens Peter Ahrens Electrical Engineering & Computer Science BS UC Berkeley Jenniffer Estrada Jenniffer Estrada Computer Science MS Youngstown

    8. Computational Cosmology | Argonne National Laboratory

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Cosmology Computational Cosmology To study the mysterious dark sector we use powerful simulations on state-of-the-art high-performance computers, run at Argonne and elsewhere. By developing specialized codes for these supercomputers, and by comparing their results to observations from deep surveys of the sky, we aim to answer some of the most fundamental questions in physics: What is dark matter made of? What is the nature of dark energy? Does general relativity need to be

    9. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

      SciTech Connect (OSTI)

      1996-07-01

      This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

    10. Baseline Risk Assessment for the F-Area Burning/Rubble Pits and Rubble Pit

      SciTech Connect (OSTI)

      Palmer, E.

      1996-03-01

      This document provides an overview of the Savannah River Site (SRS) and a description of the F-Area Burning/Rubble Pits (BRPs) and Rubble Pit (RP) unit. It also describes the objectives and scope of the baseline risk assessment (BRA).

    11. Waste Isolation Pilot Plant Transuranic Waste Baseline inventory report. Volume 2. Revision 1

      SciTech Connect (OSTI)

      1995-02-01

      This document is the Baseline Inventory Report for the transuranic (alpha-bearing) wastes stored at the Waste Isolation Pilot Plant (WIPP) in New Mexico. Waste stream profiles including origin, applicable EPA codes, typical isotopic composition, typical waste densities, and typical rates of waste generation for each facility are presented for wastes stored at the WIPP.

    12. Computing Resources | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Resources Mira Cetus and Vesta Visualization Cluster Data and Networking Software JLSE Computing Resources Theory and Computing Sciences Building Argonne's Theory and Computing Sciences (TCS) building houses a wide variety of computing systems including some of the most powerful supercomputers in the world. The facility has 25,000 square feet of raised computer floor space and a pair of redundant 20 megavolt amperes electrical feeds from a 90 megawatt substation. The building also

    13. Baseline geochemistry of soil and bedrock Tshirege Member of the Bandelier Tuff at MDA-P

      SciTech Connect (OSTI)

      Warren, R.G.; McDonald, E.V.; Ryti, R.T.

      1997-08-01

      This report provides baseline geochemistry for soils (including fill), and for bedrock within three specific areas that are planned for use in the remediation of Material Disposal Area P (MDA-P) at Technical Area 16 (TA-16). The baseline chemistry includes leachable element concentrations for both soils and bedrock and total element concentrations for all soil samples and for two selected bedrock samples. MDA-P operated from the early 1950s to 1984 as a landfill for rubble and debris generated by the burning of high explosives (HE) at the TA-16 Burning Ground, HE-contaminated equipment and material, barium nitrate sand, building materials, and trash. The aim of this report is to establish causes for recognizable chemical differences between the background and baseline data sets. In many cases, the authors conclude that recognizable differences represent natural enrichments. In other cases, differences are best attributed to analytical problems. But most importantly, the comparison of background and baseline geochemistry demonstrates significant contamination for several elements not only at the two remedial sites near the TA-16 Burning Ground, but also within the entire region of the background study. This contamination is highly localized very near to the surface in soil and fill, and probably also in bedrock; consequently, upper tolerance limits (UTLs) calculated as upper 95% confidence limits of the 95th percentile are of little value and thus are not provided. This report instead provides basic statistical summaries and graphical comparisons for background and baseline samples to guide strategies for remediation of the three sites to be used in the restoration of MDA-P.
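
      For reference, the statistic the report declines to compute, an upper 95% confidence limit on the 95th percentile, is for approximately normal data a classical one-sided tolerance limit. A minimal sketch, assuming normality (an assumption field geochemistry data often violate, one reason such UTLs were judged of little value here):

      ```python
      import numpy as np
      from scipy import stats

      def normal_utl(x, coverage=0.95, confidence=0.95):
          """Upper tolerance limit: upper `confidence` bound on the `coverage` quantile."""
          x = np.asarray(x, dtype=float)
          n = x.size
          z_p = stats.norm.ppf(coverage)
          # Exact one-sided tolerance factor via the noncentral t distribution.
          k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
          return x.mean() + k * x.std(ddof=1)
      ```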

    14. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Science, Computing, Applied Math » Extreme Scale Computing, Co-design Extreme Scale Computing, Co-design Computational co-design may facilitate revolutionary designs in the next generation of supercomputers. Get Expertise Tim Germann Physics and Chemistry of Materials Email Allen McPherson Energy and Infrastructure Analysis Email Turab Lookman Physics and Condensed Matter and Complex Systems Email Computational co-design involves developing the interacting components of a

    15. High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      HPC INL Logo Home High-Performance Computing INL's high-performance computing center provides general use scientific computing capabilities to support the lab's efforts in advanced...

    16. Magnetic resonance imaging and computational fluid dynamics (CFD) simulations of rabbit nasal airflows for the development of hybrid CFD/PBPK models

      SciTech Connect (OSTI)

      Corley, Richard A.; Minard, Kevin R.; Kabilan, Senthil; Einstein, Daniel R.; Kuprat, Andrew P.; harkema, J. R.; Kimbell, Julia; Gargas, M. L.; Kinzell, John H.

      2009-06-01

      The percentages of total airflows over the nasal respiratory and olfactory epithelium of female rabbits were calculated from computational fluid dynamics (CFD) simulations of steady-state inhalation. These airflow calculations, along with nasal airway geometry determinations, are critical parameters for hybrid CFD/physiologically based pharmacokinetic models that describe the nasal dosimetry of water-soluble or reactive gases and vapors in rabbits. CFD simulations were based upon three-dimensional computational meshes derived from magnetic resonance images of three adult female New Zealand White (NZW) rabbits. In the anterior portion of the nose, the maxillary turbinates of rabbits are considerably more complex than comparable regions in rats, mice, monkeys, or humans. This leads to a greater surface area to volume ratio in this region and thus the potential for increased extraction of water-soluble or reactive gases and vapors in the anterior portion of the nose compared to many other species. Although there was considerable interanimal variability in the fine structures of the nasal turbinates and airflows in the anterior portions of the nose, there was remarkable consistency between rabbits in the percentage of total inspired airflows that reached the ethmoid turbinate region (~50%) that is presumably lined with olfactory epithelium. These latter results (airflows reaching the ethmoid turbinate region) were higher than previously published estimates for the male F344 rat (19%) and human (7%). These differences in regional airflows can have significant implications in interspecies extrapolations of nasal dosimetry.

    17. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations

      Broader source: Energy.gov [DOE]

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM) covers Over Target Baseline and Over Target Schedule implementations.

    18. RCRA Facility Investigation/Remedial Investigation Report with the Baseline Risk Assessment for the 716-A Motor Shops Seepage Basin

      SciTech Connect (OSTI)

      Palmer, E.

      1997-08-25

      This document describes the RCRA Facility Investigation/Remedial Investigation/Baseline Risk Assessment of the 716-A Motor Shops Seepage Basin.

    19. Baseline point source load inventory, 1985. 1991 reevaluation report No. 2

      SciTech Connect (OSTI)

      Not Available

      1993-02-04

      The report finalizes and documents the Chesapeake Bay Agreement states' 1985 point source nutrient load estimates initially presented in the Baywide Nutrient Reduction Strategy (BNRS). The Bay Agreement states include Maryland, Virginia, Pennsylvania, and the District of Columbia. Each of the states final, annual, discharged, 1985 point source total phosphorus and total nitrogen nutrient load estimates are presented. These estimates are to serve as the point source baseline for the year 2000 40% nutrient reduction goal. Facility by facility flows, nutrient concentrations and nutrient loads for 1985 from above the fall line (AFL) and from below the fall line (BFL) are presented. The report presents the percent change in the 1985 baseline loads for each of the Bay agreement states relative to 1991. Estimates of 1991 nutrient loads are not available for non-agreement states at this time.

    20. Free-piston Stirling engine experimental program: Part 1. Baseline test summary

      SciTech Connect (OSTI)

      Berggren, R.; Moynihan, T.

      1983-06-01

      Free-Piston Stirling Engine experimental data are presented from a series of tests that establish the operating characteristics of the engine and determine performance repeatability. The operating envelope of the engine was explored to determine the maximum parameter range and repeatability. Tests were then carried out in which individual operating parameters were varied while others were maintained constant. These data establish the baseline operation of the engine as a preliminary to a series of tests in which several suspected sources of energy loss are investigated by changing the engine geometry to isolate and magnify each suspected loss mechanism. Performance with the geometry change is compared against baseline operation to quantify the magnitude of the loss mechanism under investigation. The results of the loss mechanism investigation are presented in Part 2 of this report.

    1. Announcement of Computer Software

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      F 241.4 (10-01) (Replaces ESTSC F1 and ESTSC F2) All Other Editions Are Obsolete UNITED STATES DEPARTMENT OF ENERGY ANNOUNCEMENT OF COMPUTER SOFTWARE OMB Control Number 1910-1400 (OMB Burden Disclosure Statement is on last page of Instructions) Record Status (Select One): New Package Software Revision H. Description/Abstract PART I: STI SOFTWARE DESCRIPTION A. Software Title SHORT NAME OR ACRONYM KEYWORDS IN CONTEXT (KWIC) TITLE B. Developer(s) E-MAIL ADDRESS(ES) C. Site Product Number 1. DOE

    2. Results from baseline tests of the SPRE I and comparison with code model predictions

      SciTech Connect (OSTI)

      Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

      1994-09-01

      The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high capacity space power. This paper presents results of baseline engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

    3. OSTIblog Articles in the Long Baseline Neutrino Experiment Topic | OSTI, US

      Office of Scientific and Technical Information (OSTI)

      Dept of Energy Office of Scientific and Technical Information Long Baseline Neutrino Experiment Topic Mining for Gold, Neutrinos and the Neutrinoless Double Beta Decay by Kathy Chambers 23 Sep, 2014 in Deep within the caverns of Lead, South Dakota is one of the nation's preeminent underground laboratories. The site of the former Homestake Mine was once one of the largest and deepest gold mines in North America. This famous mine was discovered during the 1876 Black Hills gold rush and

    4. Mixed waste focus area integrated technical baseline report. Phase I, Volume 2: Revision 0

      SciTech Connect (OSTI)

      1996-01-16

      This document (Volume 2) contains the Appendices A through J for the Mixed Waste Focus Area Integrated Technical Baseline Report Phase I for the Idaho National Engineering Laboratory. Included are: Waste Type Managers' Resumes, detailed information on wastewater, combustible organics, debris, unique waste, and inorganic homogeneous solids and soils, and waste data information. A detailed list of technology deficiencies and site needs identification is also provided.

    5. BASELINE RISK ASSESSMENT OF GROUND WATER CONTAMINATION AT THE URANIUM MILL TAILINGS

      Office of Legacy Management (LM)

      BASELINE RISK ASSESSMENT OF GROUND WATER CONTAMINATION AT THE URANIUM MILL TAILINGS SITE NEAR RIVERTON, WYOMING Prepared by the U.S. Department of Energy Albuquerque, New Mexico September 1995 INTENDED FOR PUBLIC RELEASE This report has been reproduced from the best available copy. Number of pages in this report: 166 DOE and DOE contractors can obtain copies of this report from: Office

    6. Some Beam Dynamics and Related Studies of Possible Changes to the ILC Baseline Design

      SciTech Connect (OSTI)

      Paterson, Ewan; /SLAC

      2012-04-03

      Since the completion of the ILC Reference Design Report (RDR) in 2007, global R and D has continued on all ILC systems in a coordinated program titled Technical Design Phase 1. This program, which is planned and coordinated by the Program Managers and the Technical Area Group Leaders, will transition to a Phase 2 in 2010, which has the goal of producing a more complete Technical Design Report in 2012. In this transition there will be a re-baseline process which will update and/or modify the RDR baseline design, taking into account progress with systems design and progress with various technologies coming from the continuing R and D programs. The RDR design was considered by some to be a conservative one, and many of the topics being studied for inclusion in a new baseline are directed towards designs with a better balance of cost versus risk. Some of these are engineering systems design modifications, both technical and civil, while others are accelerator parameters, technical system designs, and beam dynamics optimizations. A few of the latter are described here.

    7. Idaho National Engineering Laboratory (INEL) Environmental Restoration Program (ERP), Baseline Safety Analysis File (BSAF). Revision 1

      SciTech Connect (OSTI)

      Not Available

      1994-06-20

      This document was prepared to take the place of a Safety Evaluation Report since the Baseline Safety Analysis File (BSAF) and associated Baseline Technical Safety Requirements (TSR) File do not meet the requirements of a complete safety analysis documentation. Its purpose is to present in summary form the background of how the BSAF and Baseline TSR originated and a description of the process by which it was produced and approved for use in the Environmental Restoration Program. The BSAF is a facility safety reference document for INEL environmental restoration activities including environmental remediation of inactive waste sites and decontamination and decommissioning (D&D) of surplus facilities. The BSAF contains safety bases common to environmental restoration activities and guidelines for performing and documenting safety analysis. The common safety bases can be incorporated by reference into the safety analysis documentation prepared for individual environmental restoration activities with justification and any necessary revisions. The safety analysis guidelines in BSAF provide an accepted method for hazard analysis; analysis of normal, abnormal, and accident conditions; human factors analysis; and derivation of TSRs. The BSAF safety bases and guidelines are graded for environmental restoration activities.

    8. The impact of sterile neutrinos on CP measurements at long baselines

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Gandhi, Raj; Kayser, Boris; Masud, Mehedi; Prakash, Suprabh

      2015-09-01

      With the Deep Underground Neutrino Experiment (DUNE) as an example, we show that the presence of even one sterile neutrino of mass ~1 eV can significantly impact the measurements of CP violation in long baseline experiments. Using a probability level analysis and neutrino-antineutrino asymmetry calculations, we discuss the large magnitude of these effects, and show how they translate into significant event rate deviations at DUNE. These results demonstrate that measurements which, when interpreted in the context of the standard three family paradigm, indicate CP conservation at long baselines, may, in fact, hide large CP violation if there is a sterile state. Similarly, any data indicating the violation of CP cannot be properly interpreted within the standard paradigm unless the presence of sterile states of mass O(1 eV) can be conclusively ruled out. Our work underscores the need for a parallel and linked short baseline oscillation program and a highly capable near detector for DUNE, in order that its highly anticipated results on CP violation in the lepton sector may be correctly interpreted.

    9. Development of a lab-scale, high-resolution, tube-generated X-ray computed-tomography system for three-dimensional (3D) materials characterization

      SciTech Connect (OSTI)

      Mertens, J.C.E. Williams, J.J. Chawla, Nikhilesh

      2014-06-01

      The design and construction of a modular high resolution X-ray computed tomography (XCT) system is highlighted in this paper. The design approach is detailed for meeting a specified set of instrument performance goals tailored towards experimental versatility and high resolution imaging. The XCT tool is unique in the detector and X-ray source design configuration, enabling control in the balance between detection efficiency and spatial resolution. The system package is also unique: The sample manipulation approach implemented enables a wide gamut of in situ experimentation to analyze structure evolution under applied stimulus, by optimizing scan conditions through a high degree of controllability. The component selection and design process is detailed: Incorporated components are specified, custom designs are shared, and the approach for their integration into a fully functional XCT scanner is provided. Custom designs discussed include the dual-target X-ray source cradle which maintains position and trajectory of the beam between the two X-ray target configurations with respect to a scintillator mounting and positioning assembly and the imaging sensor, as well as a novel large-format X-ray detector with enhanced adaptability. The instrument is discussed from an operational point of view, including the details of data acquisition and processing implemented for 3D imaging via micro-CT. The performance of the instrument is demonstrated on a silica-glass particle/hydroxyl-terminated-polybutadiene (HTPB) matrix binder PBX simulant. Post-scan data processing, specifically segmentation of the sample's relevant microstructure from the 3D reconstruction, is provided to demonstrate the utility of the instrument. - Highlights: Custom built X-ray tomography system for microstructural characterization Detector design for maximizing polychromatic X-ray detection efficiency X-ray design offered for maximizing X-ray flux with respect to imaging resolution Novel lab

    10. Exploratory Experimentation and Computation

      SciTech Connect (OSTI)

      Bailey, David H.; Borwein, Jonathan M.

      2010-02-25

      We believe the mathematical research community is facing a great challenge to re-evaluate the role of proof in light of recent developments. On one hand, the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet, has provided marvelous resources to the research mathematician. On the other hand, the enormous complexity of many modern capstone results such as the Poincare conjecture, Fermat's last theorem, and the classification of finite simple groups has raised questions as to how we can better ensure the integrity of modern mathematics. Yet as the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished.

    11. FY12 Quarter 3 Computing Utilization Report LANL

      SciTech Connect (OSTI)

      Wampler, Cheryl L. [Los Alamos National Laboratory; McClellan, Laura Ann [Los Alamos National Laboratory

      2012-07-25

      DSW continues to dominate the capacity workload, with a focus in Q3 on common model baselining runs in preparation for the Annual Assessment Review (AAR) of the weapon systems. There remains unmet demand for higher fidelity simulations and for increased throughput of simulations. Common model baselining activities would benefit from doubling the resolution of the models and running twice as many simulations. Capacity systems were also utilized during the quarter to prepare for upcoming Level 2 milestones. Other notable DSW activities include validation of new physics models and safety studies. The safety team used the capacity resources extensively for projects involving 3D computer simulations for the Furrow series of experiments at DARHT (a Level 2 milestone), fragment impact, surety theme, PANTEX assessments, and the 120-day study. With the more than tripling of classified capacity computing resources with the addition of the Luna system and the safety team's imminent access to the Cielo system, demand has been met for current needs. The safety team has performed successful scaling studies on Luna, with linear scaling on jobs up to 16K PEs, running the large 3D simulations required for the analysis of Furrow. They will be investigating scaling studies on the Cielo system with the Lustre file system in Q4. Overall average capacity utilization was impacted by negative effects of the LANL Voluntary Separation Program (VSP) at the beginning of Q3, in which programmatic staffing was reduced by 6%, with further losses due to management backfills and attrition, resulting in about 10% fewer users. All classified systems were impacted in April by a planned 2-day red network outage. ASC capacity workload continues to focus on code development, regression testing, and verification and validation (V&V) studies. Significant capacity cycles were used in preparation for a JOWOG in May and several upcoming L2 milestones due in Q4. A network transition has been underway on the

    12. About the Advanced Computing Tech Team | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      About the Advanced Computing Tech Team The Advanced Computing Tech Team is made up of representatives from DOE and its national laboratories who are involved with developing and using advanced computing tools. The following is a list of some of those programs and how they are currently using advanced computing in pursuit of their respective missions. Advanced Scientific Computing Research (ASCR) The mission of the Advanced Scientific Computing Research

    13. Computational analysis of storage synthesis in developing Brassica napus L. (oilseed rape) embryos: Flux variability analysis in relation to 13C-metabolic flux analysis

      SciTech Connect (OSTI)

      Hay, J.; Schwender, J.

      2011-08-01

      Plant oils are an important renewable resource, and seed oil content is a key agronomical trait that is in part controlled by the metabolic processes within developing seeds. A large-scale model of cellular metabolism in developing embryos of Brassica napus (bna572) was used to predict biomass formation and to analyze metabolic steady states by flux variability analysis under different physiological conditions. Predicted flux patterns are highly correlated with results from prior 13C metabolic flux analysis of B. napus developing embryos. Minor differences from the experimental results arose because bna572 always selected only one sugar and one nitrogen source from the available alternatives, and failed to predict the use of the oxidative pentose phosphate pathway. Flux variability, indicative of alternative optimal solutions, revealed alternative pathways that can provide pyruvate and NADPH to plastidic fatty acid synthesis. The nutritional values of different medium substrates were compared based on the overall carbon conversion efficiency (CCE) for the biosynthesis of biomass. Although bna572 has a functional nitrogen assimilation pathway via glutamate synthase, the simulations predict an unexpected role of glycine decarboxylase operating in the direction of NH4+ assimilation. Analysis of the light-dependent improvement of carbon economy predicted two metabolic phases. At very low light levels small reductions in CO2 efflux can be attributed to enzymes of the tricarboxylic acid cycle (oxoglutarate dehydrogenase, isocitrate dehydrogenase) and glycine decarboxylase. At higher light levels relevant to the 13C flux studies, ribulose-1,5-bisphosphate carboxylase activity is predicted to account fully for the light-dependent changes in carbon balance.
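
      At its core, the flux variability analysis used here is a family of linear programs: the objective (e.g., biomass) is optimized first, then each reaction flux is minimized and maximized subject to steady-state mass balance with the objective held at its optimum. A toy sketch on a hypothetical three-reaction chain, not the bna572 model:

      ```python
      import numpy as np
      from scipy.optimize import linprog

      # Toy stoichiometric matrix S (metabolites x reactions) and flux bounds.
      S = np.array([[1.0, -1.0, 0.0],
                    [0.0, 1.0, -1.0]])
      bounds = [(0, 10)] * 3
      c_obj = np.array([0.0, 0.0, -1.0])  # maximize v3 (linprog minimizes)

      opt = linprog(c_obj, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
      v_best = -opt.fun

      # Flux variability: min/max each flux with the objective fixed at optimum.
      A_eq = np.vstack([S, -c_obj])
      b_eq = np.append(np.zeros(S.shape[0]), v_best)
      for j in range(S.shape[1]):
          c = np.zeros(S.shape[1])
          c[j] = 1.0
          v_min = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
          v_max = -linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
          print(f"v{j + 1} range: [{v_min:.2f}, {v_max:.2f}]")
      ```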

    14. Community Greening: How to Develop a Strategic Plan | Open Energy...

      Open Energy Info (EERE)

      Focus Area People and Policy Phase Bring the Right People Together, Create a Vision, Determine Baseline, Evaluate Options, Develop Goals, Prepare a Plan, Get Feedback,...

    15. Computational Electronics and Electromagnetics

      SciTech Connect (OSTI)

      DeFord, J.F.

      1993-03-01

      The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.
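
      For background on the time-domain field solvers mentioned above (TSAR, DSI3D), the core of the FDTD method on a structured grid fits in a few lines. A minimal one-dimensional vacuum sketch in normalized units (c = 1, dx = 1), illustrative only and unrelated to the thrust area's actual codes:

      ```python
      import numpy as np

      nx, nt = 200, 400
      dt = 0.5                # Courant number c*dt/dx = 0.5 <= 1 for stability
      ez = np.zeros(nx)       # E field sampled at integer grid points
      hy = np.zeros(nx - 1)   # H field sampled at staggered half points (Yee grid)

      for n in range(nt):
          hy += dt * (ez[1:] - ez[:-1])        # update H from the curl of E
          ez[1:-1] += dt * (hy[1:] - hy[:-1])  # update E from the curl of H
          ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source
      ```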

    16. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

      SciTech Connect (OSTI)

      Price, Lynn; Murtishaw, Scott; Worrell, Ernst

      2003-06-01

      Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities. Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to assess the availability of sectoral or industry-specific metrics and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no single metric that adequately tracks trends in GHG emissions while maintaining confidentiality of data was identified. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends. Such an index could provide an industry-specific metric for reporting and tracking GHG emissions trends to accurately
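
      The excerpt stops short of defining the recommended GHG intensity index, but the simplest form consistent with the metrics it surveys, emissions per unit of production indexed to a base year, might look like this sketch (a hypothetical form, not the report's recommendation):

      ```python
      def ghg_intensity(emissions_tco2e, production_units):
          """Emissions intensity: tonnes CO2-equivalent per unit of production."""
          return emissions_tco2e / production_units

      def intensity_index(emissions_by_year, production_by_year, base_year):
          """Index each year's intensity to the base year (base year = 100)."""
          base = ghg_intensity(emissions_by_year[base_year],
                               production_by_year[base_year])
          return {year: 100.0 * ghg_intensity(e, production_by_year[year]) / base
                  for year, e in emissions_by_year.items()}
      ```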

    17. Reference Model Development

      SciTech Connect (OSTI)

      Jepsen, Richard

      2011-11-02

      Presentation from the 2011 Water Peer Review in which the principal investigator discusses project progress to develop a representative set of Reference Models (RM) for the MHK industry to develop baseline cost of energy (COE) and evaluate key cost component/system reduction pathways.

    18. NERSC seeks Computational Systems Group Lead

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing ... workload demands within hiring and budget constraints. ...

    19. Inexpensive computer data-acquisition system

      SciTech Connect (OSTI)

      Galvin, J.E.; Brown, I.G.

      1985-10-01

      A system based on an Apple II+ personal computer is used for on-line monitoring of ion-beam characteristics in accelerator ion source development.

    20. Computer System, Cluster, and Networking Summer Institute Program Description

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute Program Description The Computer System, Cluster, and Networking Summer Institute (CSCNSI) is a focused technical enrichment program targeting third-year college undergraduate students currently engaged in a computer science, computer engineering, or similar major. The program emphasizes practical skill development in setting up, configuring, administering, testing, monitoring, and scheduling computer systems, supercomputer clusters, and computer

    1. Virtual Design Studio (VDS) - Development of an Integrated Computer Simulation Environment for Performance Based Design of Very-Low Energy and High IEQ Buildings

      SciTech Connect (OSTI)

      Chen, Yixing; Zhang, Jianshun; Pelken, Michael; Gu, Lixing; Rice, Danial; Meng, Zhaozhou; Semahegn, Shewangizaw; Feng, Wei; Ling, Francesca; Shi, Jun; Henderson, Hugh

      2013-09-01

      Executive Summary The objective of this study was to develop a “Virtual Design Studio (VDS)”: a software platform for integrated, coordinated, and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and a high level of sustainability. The VDS is intended to assist collaborating architects, engineers, and project management team members throughout the process, from the early phases to the detailed building design stages. It can be used to plan design tasks and workflow, and to evaluate the potential impacts of various green building strategies on building performance, using state-of-the-art simulation tools as well as industrial/professional standards and guidelines for green building system design. Engaged in the development of the VDS was a multi-disciplinary research team that included architects, engineers, and software developers. Based on a review and analysis of how existing professional practices in building systems design operate, particularly those used in the U.S., Germany, and the UK, a generic process for performance-based building design, construction, and operation was proposed. It divides the whole process into five distinct stages: Assess, Define, Design, Apply, and Monitoring (ADDAM). The current VDS is focused on the first three stages. The VDS considers building design as a multi-dimensional process, involving multiple design teams, design factors, and design stages. The intersection among these three dimensions defines a specific design task in terms of “who”, “what”, and “when”. It also considers building design as a multi-objective process that aims to enhance the five aspects of performance for green building systems: site sustainability, materials and resource efficiency, water utilization efficiency, energy efficiency and impacts on the atmospheric environment, and IEQ. The current VDS development has been limited to energy efficiency and IEQ performance, with particular focus

    2. The Computational Physics Program of the national MFE Computer Center

      SciTech Connect (OSTI)

      Mirin, A.A.

      1989-01-01

      Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs are included in this paper. 6 tabs.

    3. Application of Robust Design and Advanced Computer Aided Engineering Technologies: Cooperative Research and Development Final Report, CRADA Number CRD-04-143

      SciTech Connect (OSTI)

      Thornton, M.

      2013-06-01

      Oshkosh Corporation (OSK) is taking an aggressive approach to implementing advanced technologies, including hybrid electric vehicle (HEV) technology, throughout their commercial and military product lines. These technologies have important implications for OSK's commercial and military customers, including fleet fuel efficiency, quiet operational modes, additional on-board electric capabilities, and lower thermal signature operation. However, technical challenges exist with selecting the optimal HEV components and design to work within the performance and packaging constraints of specific vehicle applications. SK desires to use unique expertise developed at the Department of Energy?s (DOE) National Renewable Energy Laboratory (NREL), including HEV modeling and simulation. These tools will be used to overcome technical hurdles to implementing advanced heavy vehicle technology that meet performance requirements while improving fuel efficiency.

    4. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Recommended Reading & Resources Parallel Computing Summer Research Internship Creates next-generation leaders in HPC research and applications development Contacts Program Co-Lead Robert (Bob) Robey Email Program Co-Lead Gabriel Rockefeller Email Program Co-Lead Hai Ah Nam Email Professional Staff Assistant Nickole Aguilar Garcia (505) 665-3048 Email Recommended Reading & References The Parallel Computing Summer Research Internship covers a broad range of topics that you may not have

    5. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      LaboratoryNational Security Education Center Menu About Seminar Series Summer Schools Workshops Viz Collab IS&T Projects NSEC » Information Science and Technology Institute (ISTI) » Summer School Programs » Parallel Computing Parallel Computing Summer Research Internship Creates next-generation leaders in HPC research and applications development Contacts Program Co-Lead Robert (Bob) Robey Email Program Co-Lead Gabriel Rockefeller Email Program Co-Lead Hai Ah Nam Email Professional Staff

    6. Vesta | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Policies Documentation Feedback Please provide feedback to help guide us as we continue to build documentation for our new computing resource. [Feedback Form] Vesta Vesta is the ALCF's test and development platform, serving as a launching pad for researchers planning to use Mira. Vesta has the same architecture as Mira, but on a much smaller scale (two computer racks compared to Mira's 48 racks). This system enables researchers to debug and scale up codes for the Blue Gene/Q architecture in

    7. Secure computing for the 'Everyman'

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Secure computing for the 'Everyman' Secure computing for the 'Everyman' If implemented on a wide scale, quantum key distribution technology could ensure truly secure commerce, banking, communications and data transfer. September 2, 2014 This small device developed at Los Alamos National Laboratory uses the truly random spin of light particles as defined by laws of quantum mechanics to generate a random number for use in a cryptographic key that can be used to securely transmit information

    8. Computational Sciences and Engineering Division

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      If you have questions or comments regarding any of our research and development activities, how to work with ORNL and the Computational Sciences and Engineering (CSE) Division, or the content of this website please contact one of the following people: If you have questions regarding CSE technologies and capabilities, job opportunities, working with ORNL and the CSE Division, intellectual property, etc., contact, Shaun S. Gleason, Ph.D. Division Director, Computational Sciences and Engineering

    9. The mixed waste management facility. Project baseline revision 1.2

      SciTech Connect (OSTI)

      Streit, R.D.; Throop, A.L.

      1995-04-01

      Revision 1.2 to the Project Baseline (PB) for the Mixed Waste Management Facility (MWMF) is in response to DOE directives and verbal guidance to (1) Collocate the Decontamination and Waste Treatment Facility (DWTF) and MWMF into a single complex, integrate certain and overlapping functions as a cost-saving measure; (2) Meet certain fiscal year (FY) new-BA funding objectives ($15.3M in FY95) with lower and roughly balanced funding for out years; (3) Reduce Total Project Cost (TPC) for the MWMF Project; (4) Include costs for all appropriate permitting activities in the project TPC. This baseline revision also incorporates revisions in the technical baseline design for Molten Salt Oxidation (MSO) and Mediated Electrochemical Oxidation (MEO). Changes in the WBS dictionary that are necessary as a result of this rebaseline, as well as minor title changes, at WBS Level 3 or above (DOE control level) are approved as a separate document. For completeness, the WBS dictionary that reflects these changes is contained in Appendix B. The PB, with revisions as described in this document, were also the basis for the FY97 Validation Process, presented to DOE and their reviewers on March 21-22, 1995. Appendix C lists information related to prior revisions to the PB. Several key changes relate to the integration of functions and sharing of facilities between the portion of the DWTF that will house the MWMF and those portions that are used by the Hazardous Waste Management (HWM) Division at LLNL. This collocation has been directed by DOE as a cost-saving measure and has been implemented in a manner that maintains separate operational elements from a safety and permitting viewpoint. Appendix D provides background information on the decision and implications of collocating the two facilities.

    10. Formation and Sustainment of ITPs in ITER with the Baseline Heating Mix

      SciTech Connect (OSTI)

      Francesca M. Poli and Charles Kessel

      2012-12-03

      Plasmas with internal transport barriers (ITBs) are a potential and attractive route to steady-state operation in ITER. These plasmas exhibit radially localized regions of improved con nement with steep pressure gradients in the plasma core, which drive large bootstrap current and generate hollow current pro les and negative shear. This work examines the formation and sustainment of ITBs in ITER with electron cyclotron heating and current drive. It is shown that, with a trade-o of the power delivered to the equatorial and to the upper launcher, the sustainment of steady-state ITBs can be demonstrated in ITER with the baseline heating con guration.

    11. A comparison of baseline aerodynamic performance of optimally-twisted versus non-twisted HAWT blades

      SciTech Connect (OSTI)

      Simms, D.A.; Robinson, M.C.; Hand, M.M.; Fingersh, L.J.

      1995-01-01

      NREL has completed the initial twisted blade field tests of the ``Unsteady Aerodynamics Experiment.`` This test series continues systematic measurements of unsteady aerodynamic phenomena prevalent in stall-controlled horizontal axis wind turbines (HAWTs). The blade twist distribution optimizes power production at a single angle of attack along the span. Abrupt transitions into and out of stall are created due to rapid changes in inflow. Data from earlier experiments have been analyzed extensively to characterize the steady and unsteady response of untwisted blades. In this report, a characterization and comparison of the baseline aerodynamic performance of the twisted versus non-twisted blade sets will be presented for steady flow conditions.

    12. DOE-EM-STD-5502-94; DOE Limited Standard Hazard Baseline Documentation

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      NOT MEASUREMENT SENSITIVE DOE-EM-STD-5502-94 August 1994 DOE LIMITED STANDARD HAZARD BASELINE DOCUMENTATION U.S. Department of Energy AREA SAFT Washington, D.C. 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. This document has been reproduced directly from the best available copy. Available to DOE and DOE contractors from the Office of Scientific and Technical Information, P.O. Box 62, Oak Ridge, TN 37831; (615) 576-8401. Available to the public from the

    13. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Ludtka, Gail Mackiewicz-; Chourey, Aashish

      2010-08-01

      As the original magnet designer and manufacturer of ORNL s 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL s Materials Processing Group s and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries; and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology WON a prestigious 2009 R&D 100 Awards. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    14. Applications of Parallel Computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computers Applications of Parallel Computers UCB CS267 Spring 2015 Tuesday & Thursday, 9:30-11:00 Pacific Time Applications of Parallel Computers, CS267, is a graduate-level course...

    15. Multiprocessor computing for images

      SciTech Connect (OSTI)

      Cantoni, V. ); Levialdi, S. )

      1988-08-01

      A review of image processing systems developed until now is given, highlighting the weak points of such systems and the trends that have dictated their evolution through the years producing different generations of machines. Each generation may be characterized by the hardware architecture, the programmability features and the relative application areas. The need for multiprocessing hierarchical systems is discussed focusing on pyramidal architectures. Their computational paradigms, their virtual and physical implementation, their programming and software requirements, and capabilities by means of suitable languages, are discussed.

    16. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Lutdka, G. M.; Chourey, A.

      2010-05-12

      As the original magnet designer and manufacturer of ORNLs 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNLs Materials Processing Groups and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries; and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology WON a prestigious 2009 R&D 100 Awards. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    17. Theory, Modeling and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Theory, Modeling and Computation Theory, Modeling and Computation The sophistication of modeling and simulation will be enhanced not only by the wealth of data available from MaRIE but by the increased computational capacity made possible by the advent of extreme computing. CONTACT Jack Shlachter (505) 665-1888 Email Extreme Computing to Power Accurate Atomistic Simulations Advances in high-performance computing and theory allow longer and larger atomistic simulations than currently possible.

    18. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing ? from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained

    19. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing ? from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained

    20. Computing for Finance

      SciTech Connect (OSTI)

      2010-03-24

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has

    1. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

      SciTech Connect (OSTI)

      Johanna H Oxstrand; Katya L Le Blanc

      2012-07-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned for these previous efforts we are now exploring a more unknown application for computer based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper based procedure use which will help to identify desirable features for computer based procedure prototypes. Affordances such as note taking, markups

    2. advanced simulation and computing

      National Nuclear Security Administration (NNSA)

      Each successive generation of computing system has provided greater computing power and energy efficiency.

      CTS-1 clusters will support NNSA's Life Extension Program and...

    3. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      2 Computational Physics and Methods Performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms Growth and emissivity of young galaxy ...

    4. Computer hardware fault administration

      DOE Patents [OSTI]

      Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

      2010-09-14

      Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

    5. Applied & Computational Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      & Computational Math - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us ... Twitter Google + Vimeo GovDelivery SlideShare Applied & Computational Math HomeEnergy ...

    6. Molecular Science Computing | EMSL

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computational and state-of-the-art experimental tools, providing a cross-disciplinary environment to further research. Additional Information Computing user policies Partners...

    7. Dynamic gating window for compensation of baseline shift in respiratory-gated radiation therapy

      SciTech Connect (OSTI)

      Pepin, Eric W.; Wu Huanmei; Shirato, Hiroki

      2011-04-15

      Purpose: To analyze and evaluate the necessity and use of dynamic gating techniques for compensation of baseline shift during respiratory-gated radiation therapy of lung tumors. Methods: Motion tracking data from 30 lung tumors over 592 treatment fractions were analyzed for baseline shift. The finite state model (FSM) was used to identify the end-of-exhale (EOE) breathing phase throughout each treatment fraction. Using duty cycle as an evaluation metric, several methods of end-of-exhale dynamic gating were compared: An a posteriori ideal gating window, a predictive trend-line-based gating window, and a predictive weighted point-based gating window. These methods were evaluated for each of several gating window types: Superior/inferior (SI) gating, anterior/posterior beam, lateral beam, and 3D gating. Results: In the absence of dynamic gating techniques, SI gating gave a 39.6% duty cycle. The ideal SI gating window yielded a 41.5% duty cycle. The weight-based method of dynamic SI gating yielded a duty cycle of 36.2%. The trend-line-based method yielded a duty cycle of 34.0%. Conclusions: Dynamic gating was not broadly beneficial due to a breakdown of the FSM's ability to identify the EOE phase. When the EOE phase was well defined, dynamic gating showed an improvement over static-window gating.

    8. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

      SciTech Connect (OSTI)

      LECHELT, J.A.

      2000-10-17

      The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix.

    9. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-02-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    10. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-10-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    11. Climate Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Mirin, A A

      2007-02-05

      The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

    12. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only address emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since in 1998, 1999, 2000, 2002, and 2004. A significant format change was performed in the 2004 update of the BNA in that it was 'zero based.' Starting with the requirement documents, the 2004 BNA evaluated the requirements, and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents - the needs assessment (1) and the compliance assessment (2). The needs assessment will be referred to as the BNA and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where the modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection provided, the

    13. Estimating Demand Response Load Impacts: Evaluation of BaselineLoad Models for Non-Residential Buildings in California

      SciTech Connect (OSTI)

      Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote,Sila

      2008-01-01

      Both Federal and California state policymakers areincreasingly interested in developing more standardized and consistentapproaches to estimate and verify the load impacts of demand responseprograms and dynamic pricing tariffs. This study describes a statisticalanalysis of the performance of different models used to calculate thebaseline electric load for commercial buildings participating in ademand-response (DR) program, with emphasis onthe importance of weathereffects. During a DR event, a variety of adjustments may be made tobuilding operation, with the goal of reducing the building peak electricload. In order to determine the actual peak load reduction, an estimateof what the load would have been on the day of the event without any DRactions is needed. This baseline load profile (BLP) is key to accuratelyassessing the load impacts from event-based DR programs and may alsoimpact payment settlements for certain types of DR programs. We testedseven baseline models on a sample of 33 buildings located in California.These models can be loosely categorized into two groups: (1) averagingmethods, which use some linear combination of hourly load values fromprevious days to predict the load on the event, and (2) explicit weathermodels, which use a formula based on local hourly temperature to predictthe load. The models were tested both with and without morningadjustments, which use data from the day of the event to adjust theestimated BLP up or down.Key findings from this study are: - The accuracyof the BLP model currently used by California utilities to estimate loadreductions in several DR programs (i.e., hourly usage in highest 3 out of10 previous days) could be improved substantially if a morning adjustmentfactor were applied for weather-sensitive commercial and institutionalbuildings. - Applying a morning adjustment factor significantly reducesthe bias and improves the accuracy of all BLP models examined in oursample of buildings. - For buildings with low load

    14. Cosmic Reionization On Computers | Argonne Leadership Computing...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      its Cosmic Reionization On Computers (CROC) project, using the Adaptive Refinement Tree (ART) code as its main simulation tool. An important objective of this research is to make...

    15. ORISE: Web Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Web Development As computer-based applications become increasingly popular for the delivery of health care training and information, the need for Web development in support of ...

    16. Advanced Materials Development through Computational Design

      Broader source: Energy.gov [DOE]

      Presentation given at the 2007 Diesel Engine-Efficiency & Emissions Research Conference (DEER 2007). 13-16 August, 2007, Detroit, Michigan. Sponsored by the U.S. Department of Energy's (DOE) Office of FreedomCAR and Vehicle Technologies (OFCVT).

    17. Advanced Materials Development through Computational Design ...

      Broader source: Energy.gov (indexed) [DOE]

      Presentation given at the 2007 Diesel Engine-Efficiency & Emissions Research Conference (DEER ... Office Merit Review 2015: High Temperature Materials for High Efficiency Engines ...

    18. High energy neutron Computed Tomography developed

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      objects. May 9, 2014 Neutron tomography horizontal "slice" of a tungsten and polyethylene test object containing tungsten carbide BBs. Neutron tomography horizontal "slice"...

    19. Accelerating Technology Development through Integrated Computation...

      Office of Scientific and Technical Information (OSTI)

      for COsub 2, Hsub 2, and Osub 2 production), (2) COsub 2 utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in ...

    20. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      scour-tracc-cfd TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computational Fluid Dynamics Overview of CFD: Video Clip with Audio Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical

    1. Dual baseline search for muon antineutrino disappearance at 0.1 eV²

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Cheng, G.; Huelsnitz, W.; Aguilar-Arevalo, A. A.; Alcaraz-Aunion, J. L.; Brice, S. J.; Brown, B. C.; Bugel, L.; Catala-Perez, J.; Church, E. D.; Conrad, J. M.; et al

      2012-09-25

      The MiniBooNE and SciBooNE collaborations report the results of a joint search for short baseline disappearance of ν¯μ at Fermilab’s Booster Neutrino Beamline. The MiniBooNE Cherenkov detector and the SciBooNE tracking detector observe antineutrinos from the same beam, therefore the combined analysis of their data sets serves to partially constrain some of the flux and cross section uncertainties. Uncertainties in the νμ background were constrained by neutrino flux and cross section measurements performed in both detectors. A likelihood ratio method was used to set a 90% confidence level upper limit on ν¯μ disappearance that dramatically improves upon prior limits inmore »the Δm²=0.1–100 eV² region.« less

    2. Etalon-induced baseline drift and correction in atom flux sensors based on atomic absorption spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element specific real-time flux sensing and control. The ultimate sensitivity and performance of these sensors are strongly affected by baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability, which has not been previously considered, and cannot be corrected using existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5% which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.

    3. Etalon-induced Baseline Drift And Correction In Atom Flux Sensors Based On Atomic Absorption Spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element specific, real-time flux sensing and control. The ultimate sensitivity and performance of the sensors are strongly affected by the long-term and short term baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability which has not been previously considered or corrected by existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5%, which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.

    4. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

      SciTech Connect (OSTI)

      Maurakis, Eugene G

      2010-10-01

      Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts related to climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten peoples awareness, knowledge and understanding of climate change and impacts on watersheds in a laboratory experience and interactive exhibits, through internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity accurately well during most of the six thermal seasons where sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately teasing local human influences (e.g. urbanization and watershed alteration) from global anthropogenic impacts (e.g. climate change) on watersheds. Effective and skillful translations (e.g. annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) of results of scientific investigations are valuable ways of communicating information to the general public to enhance their understanding of climate change and its effects in watersheds.

    5. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal

    6. Reference manual for toxicity and exposure assessment and risk characterization. CERCLA Baseline Risk Assessment

      SciTech Connect (OSTI)

      1995-03-01

      The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, 1980) (CERCLA or Superfund) was enacted to provide a program for identifying and responding to releases of hazardous substances into the environment. The Superfund Amendments and Reauthorization Act (SARA, 1986) was enacted to strengthen CERCLA by requiring that site clean-ups be permanent, and that they use treatments that significantly reduce the volume, toxicity, or mobility of hazardous pollutants. The National Oil and Hazardous Substances Pollution Contingency Plan (NCP) (USEPA, 1985; USEPA, 1990) implements the CERCLA statute, presenting a process for (1) identifying and prioritizing sites requiring remediation and (2) assessing the extent of remedial action required at each site. The process includes performing two studies: a Remedial Investigation (RI) to evaluate the nature, extent, and expected consequences of site contamination, and a Feasibility Study (FS) to select an appropriate remedial alternative adequate to reduce such risks to acceptable levels. An integral part of the RI is the evaluation of human health risks posed by hazardous substance releases. This risk evaluation serves a number of purposes within the overall context of the RI/FS process, the most essential of which is to provide an understanding of ``baseline`` risks posed by a given site. Baseline risks are those risks that would exist if no remediation or institutional controls are applied at a site. This document was written to (1) guide risk assessors through the process of interpreting EPA BRA policy and (2) help risk assessors to discuss EPA policy with regulators, decision makers, and stakeholders as it relates to conditions at a particular DOE site.

    7. Idaho National Engineering Laboratory (INEL) Environmental Restoration (ER) Program Baseline Safety Analysis File (BSAF)

      SciTech Connect (OSTI)

      1995-09-01

      The Baseline Safety Analysis File (BSAF) is a facility safety reference document for the Idaho National Engineering Laboratory (INEL) environmental restoration activities. The BSAF contains information and guidance for safety analysis documentation required by the U.S. Department of Energy (DOE) for environmental restoration (ER) activities, including: Characterization of potentially contaminated sites. Remedial investigations to identify and remedial actions to clean up existing and potential releases from inactive waste sites Decontamination and dismantlement of surplus facilities. The information is INEL-specific and is in the format required by DOE-EM-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports. An author of safety analysis documentation need only write information concerning that activity and refer to BSAF for further information or copy applicable chapters and sections. The information and guidance provided are suitable for: {sm_bullet} Nuclear facilities (DOE Order 5480-23, Nuclear Safety Analysis Reports) with hazards that meet the Category 3 threshold (DOE-STD-1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports) {sm_bullet} Radiological facilities (DOE-EM-STD-5502-94, Hazard Baseline Documentation) Nonnuclear facilities (DOE-EM-STD-5502-94) that are classified as {open_quotes}low{close_quotes} hazard facilities (DOE Order 5481.1B, Safety Analysis and Review System). Additionally, the BSAF could be used as an information source for Health and Safety Plans and for Safety Analysis Reports (SARs) for nuclear facilities with hazards equal to or greater than the Category 2 thresholds, or for nonnuclear facilities with {open_quotes}moderate{close_quotes} or {open_quotes}high{close_quotes} hazard classifications.

    8. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2014-01-02

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal

    9. Argonne's Laboratory computing center - 2007 annual report.

      SciTech Connect (OSTI)

      Bair, R.; Pieper, G. W.

      2008-05-28

      Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (1012 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and

    10. MELCOR computer code manuals

      SciTech Connect (OSTI)

      Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

      1995-03-01

      MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR`s phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

    11. Fracture Analysis of Vessels. Oak Ridge FAVOR, v06.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations

      SciTech Connect (OSTI)

      Williams, P. T.; Dickson, T. L.; Yin, S.

      2007-12-01

      The current regulations to ensure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.

    12. Radiological Worker Computer Based Training

      Energy Science and Technology Software Center (OSTI)

      2003-02-06

      Argonne National Laboratory has developed an interactive computer based training (CBT) version of the standardized DOE Radiological Worker training program. This CD-ROM based program utilizes graphics, animation, photographs, sound and video to train users in ten topical areas: radiological fundamentals, biological effects, dose limits, ALARA, personnel monitoring, controls and postings, emergency response, contamination controls, high radiation areas, and lessons learned.

    13. Polymorphous computing fabric

      DOE Patents [OSTI]

      Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

      2011-01-18

      Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

    14. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing - from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with PowerPoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained
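
      The Monte Carlo methods mentioned above are straightforward to sketch. The example below is illustrative only and is not taken from any of the talks; all market parameters are invented. It prices a European call option by averaging simulated payoffs under geometric Brownian motion, the kind of embarrassingly parallel workload that maps naturally onto cluster and Grid computing:

      import numpy as np

      def mc_european_call(s0, strike, rate, sigma, maturity, n_paths=100_000, seed=0):
          # Simulate terminal prices under risk-neutral geometric Brownian motion.
          rng = np.random.default_rng(seed)
          z = rng.standard_normal(n_paths)
          s_t = s0 * np.exp((rate - 0.5 * sigma ** 2) * maturity
                            + sigma * np.sqrt(maturity) * z)
          payoff = np.maximum(s_t - strike, 0.0)
          # Discount the average payoff back to today.
          return np.exp(-rate * maturity) * payoff.mean()

      # Hypothetical contract: spot 100, strike 105, 3% rate, 20% vol, 1 year.
      print(mc_european_call(100.0, 105.0, 0.03, 0.2, 1.0))

      Because every simulated path is independent, the work divides across cluster or Grid nodes with essentially no communication.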

    15. The computational physics program of the National MFE Computer Center

      SciTech Connect (OSTI)

      Mirin, A.A.

      1988-01-01

      The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generation of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. Another major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

    16. Computers in Commercial Buildings

      U.S. Energy Information Administration (EIA) Indexed Site

      Government-owned buildings of all types had, on average, more than one computer per person (1,104 computers per thousand employees). They also had a fairly high ratio of...

    17. Computers for Learning

      Broader source: Energy.gov [DOE]

      Through Executive Order 12999, the Computers for Learning Program was established to provide Federal agencies a quick and easy system for donating excess and surplus computer equipment to schools...

    18. Vehicle Technologies Office Merit Review 2014: Computational design and development of a new, lightweight cast alloy for advanced cylinder heads in high-efficiency, light-duty engines FOA 648-3a

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    19. Supporting collaborative computing and interaction

      SciTech Connect (OSTI)

      Agarwal, Deborah; McParland, Charles; Perry, Marcia

      2002-05-22

      To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design.

    20. Fermilab | Science at Fermilab | Computing | Grid Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      which would collect more data than any computing center in existence could process. ... consortium grid called Open Science Grid, so they initiated a project known as FermiGrid. ...

    1. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNA® code, is

    2. Computers-BSA.ppt

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Manufacturing Energy and Carbon Footprint for Computers, Electronics and Electrical Equipment Sector (NAICS 334, 335). Energy use data source: 2010 EIA MECS (with adjustments). Footprint last revised: February 2014.

    3. Multicore: Fallout from a Computing Evolution

      ScienceCinema (OSTI)

      Yelick, Kathy [Director, NERSC

      2009-09-01

      July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

    4. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematical and Computational Epidemiology (MCEpi), Los Alamos National Laboratory. Research areas: agent-based modeling; mixing patterns and social networks; mathematical epidemiology; social Internet research; uncertainty quantification. Quantifying model uncertainty in agent-based simulations for

    5. BNL ATLAS Grid Computing

      ScienceCinema (OSTI)

      Michael Ernst

      2010-01-08

      As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

    6. Computing environment logbook

      DOE Patents [OSTI]

      Osbourn, Gordon C; Bouchard, Ann M

      2012-09-18

      A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
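
      As a rough sketch of the pattern this abstract describes (not the patented implementation; every name below is invented for illustration), a logbook can be modeled as an append-only event history that supports search and selective undo:

      from dataclasses import dataclass, field
      from typing import Callable, List

      @dataclass
      class Event:
          description: str
          undo: Callable[[], None]   # callable that reverses this event

      @dataclass
      class Logbook:
          history: List[Event] = field(default_factory=list)

          def log(self, description, undo):
              # Record an event as it occurs in the computing environment.
              self.history.append(Event(description, undo))

          def search(self, term):
              # Search the history of past events for matching entries.
              return [e for e in self.history if term in e.description]

          def undo_events(self, events):
              # Reverse the selected past events, most recent first.
              for e in sorted(events, key=self.history.index, reverse=True):
                  e.undo()
                  self.history.remove(e)

      state = {"x": 0}
      book = Logbook()
      book.log("set x=1", undo=lambda: state.update(x=0))
      state["x"] = 1
      book.undo_events(book.search("x=1"))
      print(state)   # {'x': 0} after the undo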

    7. The 1993 baseline biological studies and proposed monitoring plan for the Device Assembly Facility at the Nevada Test Site

      SciTech Connect (OSTI)

      Woodward, B.D.; Hunter, R.B.; Greger, P.D.; Saethre, M.B.

      1995-02-01

      This report contains baseline data and recommendations for future monitoring of plants and animals near the new Device Assembly Facility (DAF) on the Nevada Test Site (NTS). The facility is a large structure designed for safely assembling nuclear weapons. Baseline data was collected in 1993, prior to the scheduled beginning of DAF operations in early 1995. Studies were not performed prior to construction and part of the task of monitoring operational effects will be to distinguish those effects from the extensive disturbance effects resulting from construction. Baseline information on species abundances and distributions was collected on ephemeral and perennial plants, mammals, reptiles, and birds in the desert ecosystems within three kilometers (km) of the DAF. Particular attention was paid to effects of selected disturbances, such as the paved road, sewage pond, and the flood-control dike, associated with the facility. Radiological monitoring of areas surrounding the DAF is not included in this report.

    8. LTC America's, Inc. PTC-6 vacuum system (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    9. Lawrence Livermore National Laboratory Emergency Response Capability 2009 Baseline Needs Assessment Performance Assessment

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection, and was reviewed by the Sandia/CA Fire Marshal, Martin Gresho. This document is the second of a two-part analysis of the emergency response capabilities of Lawrence Livermore National Laboratory. The first part, the 2009 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2004 BNA, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. On October 1, 2007, LLNL contracted with the Alameda County Fire Department to provide emergency response services. The level of service called for in that contract is the same level of service as was provided by the LLNL Fire Department prior to that date. This compliance assessment evaluates fire department services provided by the Alameda County Fire Department beginning October 1, 2008.

    10. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).
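
      One internal consistency check that redundant instrumentation makes possible is the three-component closure relation, GHI ≈ DNI·cos(Z) + DHI. The sketch below is a generic illustration of such a check; it is not NREL's SERI-QC code, and the 5% tolerance is an invented example value:

      import math

      def closure_check(ghi, dni, dhi, zenith_deg, tol=0.05):
          # Reconstruct global horizontal irradiance from the direct-normal
          # and diffuse components, then compare it with the measured value.
          expected = dni * math.cos(math.radians(zenith_deg)) + dhi
          if expected <= 0.0:
              return False   # sun below horizon or bad component readings
          return abs(ghi - expected) / expected <= tol

      # 1-minute sample (W/m^2): GHI 612, DNI 710, DHI 98, solar zenith 42 deg.
      print(closure_check(612.0, 710.0, 98.0, 42.0))   # True: components agree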

    11. Tank Waste Remediation System retrieval and disposal mission technical baseline summary description

      SciTech Connect (OSTI)

      McLaughlin, T.J.

      1998-01-06

      This document is prepared in order to support the US Department of Energy's evaluation of readiness-to-proceed for the Waste Retrieval and Disposal Mission at the Hanford Site. The Waste Retrieval and Disposal Mission is one of three primary missions under the Tank Waste Remediation System (TWRS) Project. The other two include programs to characterize tank waste and to provide for safe storage of the waste while it awaits treatment and disposal. The Waste Retrieval and Disposal Mission includes the programs necessary to support tank waste retrieval, waste feed delivery, storage and disposal of immobilized waste, and closure of tank farms. This mission will enable the tank farms to be closed and turned over for final remediation. The Technical Baseline is defined as the set of science and engineering, equipment, facilities, materials, qualified staff, and enabling documentation needed to start up and complete the mission objectives. The primary purposes of this document are (1) to identify the important technical information and factors that should be used by contributors to the mission and (2) to serve as a basis for configuration management of the technical information and factors.

    12. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      1981-07-15

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).

    13. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

      SciTech Connect (OSTI)

      Oden, L.L.; O'Connor, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

      1993-11-19

      This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

    14. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2005-11-01

      The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

    15. Scalable optical quantum computer

      SciTech Connect (OSTI)

      Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)

      2014-12-31

      A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications.

    16. Computers for artificial intelligence a technology assessment and forecast

      SciTech Connect (OSTI)

      Miller, R.K.

      1986-01-01

      This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

    17. Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite

      SciTech Connect (OSTI)

      Mark C. Carroll; David T. Rohrbaugh

      2013-08-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that captures the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller grain, petroleum coke, extruded graphite from GrafTech was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
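
      The grade-to-grade comparisons described above can be illustrated with standard two-sample statistics. The sketch below uses invented strength values rather than Baseline Graphite Characterization data, and the program's actual statistical treatment may differ; it tests both the difference in mean strength and the difference in variability between two grades:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Hypothetical tensile-strength samples (MPa) for two graphite grades.
      grade_a = rng.normal(loc=21.5, scale=1.8, size=60)
      grade_b = rng.normal(loc=23.0, scale=2.6, size=60)

      # Welch's t-test: do the mean strengths differ significantly?
      t_stat, t_p = stats.ttest_ind(grade_a, grade_b, equal_var=False)
      # Levene's test: does the specimen-to-specimen variability differ?
      w_stat, w_p = stats.levene(grade_a, grade_b)

      print(f"means:       t = {t_stat:.2f}, p = {t_p:.4f}")
      print(f"variability: W = {w_stat:.2f}, p = {w_p:.4f}")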

    18. Vandenberg Air Force Base integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Halverson, M.A.; Richman, E.E.; Dagle, J.E.; Hickman, B.J.; Daellenbach, K.K.; Sullivan, G.P.

      1993-06-01

      The US Air Force Space Command has tasked the Pacific Northwest Laboratory, as the lead laboratory supporting the US Department of Energy Federal Energy Management Program, to identify, evaluate, and assist in acquiring all cost-effective energy projects at Vandenberg Air Force Base (VAFB). This is a model program PNL is designing for federal customers served by the Pacific Gas and Electric Company (PG&E). The primary goal of the VAFB project is to identify all electric energy efficiency opportunities, and to negotiate with PG&E to acquire those resources through a customized demand-side management program for its federal clients. That customized program should have three major characteristics: (1) 100% up-front financing; (2) substantial utility cost-sharing; and (3) utility implementation through energy service companies under contract to the utility. A similar arrangement will be pursued with Southern California Gas for non-electric resource opportunities if that is deemed desirable by the site and if the gas utility seems open to such an approach. This report documents the assessment of baseline energy use at VAFB located near Lompoc, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Resource Assessment. This analysis examines the characteristics of electric, natural gas, fuel oil, and propane use for fiscal year 1991. It records energy-use intensities for the facilities at VAFB by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A more complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, and applicable losses.

    19. Baseline Utilization of Breast Radiotherapy Before Institution of the Medicare Practice Quality Reporting Initiative

      SciTech Connect (OSTI)

      Smith, Benjamin D. Smith, Grace L.; Roberts, Kenneth B.; Buchholz, Thomas A.

      2009-08-01

      Purpose: In 2007, Medicare implemented the Physician Quality Reporting Initiative (PQRI), which provides financial incentives to physicians who report their performance on certain quality measures. PQRI measure no. 74 recommends radiotherapy for patients treated with conservative surgery (CS) for invasive breast cancer. As a first step in evaluating the potential impact of this measure, we assessed baseline use of radiotherapy among women diagnosed with invasive breast cancer before implementation of PQRI. Methods and Materials: Using the SEER-Medicare data set, we identified women aged 66-70 diagnosed with invasive breast cancer and treated with CS between 2000 and 2002. Treatment with radiotherapy was determined using SEER and claims data. Multivariate logistic regression tested whether receipt of radiotherapy varied significantly across clinical, pathologic, and treatment covariates. Results: Of 3,674 patients, 94% (3,445) received radiotherapy. In adjusted analysis, the presence of comorbid illness (odds ratio [OR] 1.69; 95% confidence interval [CI], 1.19-2.42) and unmarried marital status were associated with omission of radiotherapy (OR 1.65; 95% CI, 1.22-2.20). In contrast, receipt of chemotherapy was protective against omission of radiotherapy (OR 0.25; 95% CI, 0.16-0.38). Race and geographic region did not correlate with radiotherapy utilization. Conclusions: Utilization of radiotherapy following CS was high for patients treated before institution of PQRI, suggesting that at most 6% of patients could benefit from measure no. 74. Further research is needed to determine whether institution of PQRI will affect radiotherapy utilization.
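
      The odds ratios and confidence intervals quoted above are what multivariate logistic regression produces: OR = exp(beta), and the 95% CI comes from exponentiating the coefficient's interval. A minimal sketch on simulated data (the covariates and effect sizes are hypothetical, not the SEER-Medicare data set):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 2000
      comorbid = rng.integers(0, 2, n)    # hypothetical binary covariate
      unmarried = rng.integers(0, 2, n)   # hypothetical binary covariate
      # Simulate omission of radiotherapy with known log-odds effects.
      logit = -2.8 + 0.5 * comorbid + 0.5 * unmarried
      omitted = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

      X = sm.add_constant(np.column_stack([comorbid, unmarried]).astype(float))
      fit = sm.Logit(omitted, X).fit(disp=False)

      print(np.exp(fit.params[1:]))       # odds ratios for the two covariates
      print(np.exp(fit.conf_int()[1:]))   # their 95% confidence intervals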

    20. Baseline biological risk assessment for aquatic populations occurring near Eielson Air Force Base, Alaska

      SciTech Connect (OSTI)

      Dauble, D.; Brandt, C.; Lewis, R.; Smith, R.

      1995-12-31

      Eielson Air Force Base (AFB), Alaska was listed as a Superfund site in November 1989 with 64 potential source areas of contamination. As part of a sitewide remedial investigation, baseline risk assessments were conducted in 1993 and 1994 to evaluate hazards posed to biological receptors and to human health. Fish tissue, aquatic invertebrates, aquatic vegetation, sediment, and surface water data were collected from several on-site and off-site surface water bodies. An initial screening risk assessment indicated that several surface water sites along two major tributary creeks flowing through the base had unacceptable risks to both aquatic receptors and to human health because of DDTs. Other contaminants of concern (i.e., PCBs and PAHs) were below screening risk levels for aquatic organisms, but contributed to an unacceptable risk to human health. Additional samples were taken in 1994 to characterize the site-wide distribution of PAHs, DDTs, and PCBs in aquatic biota and sediments. Concentrations of PAHs followed the order invertebrates > aquatic vegetation > fish, but concentrations were sufficiently low that they posed no significant risk to biological receptors. Pesticides were detected in all fish tissue samples. Polychlorinated biphenyls (PCBs) were also detected in most fish from Garrison Slough. The pattern of PCB concentrations in Arctic grayling (Thymallus arcticus) was related to their proximity to a sediment source in lower Garrison Slough. Ingestion of PCB-contaminated fish is the primary human-health risk driver for surface water bodies on Eielson AFB, resulting in carcinogenic risks > 1 × 10⁻⁴ for future recreational land-use at some sites. Principal considerations affecting uncertainty in the risk assessment process included spatial and temporal variability in media contaminant concentrations and inconsistencies between modeled and measured body burdens.

    1. Breckinridge Project, initial effort. Report VII, Volume II. Environmental baseline report

      SciTech Connect (OSTI)

      1982-01-01

      Ashland Synthetic Fuels, Inc. (ASFI) and Airco Energy Company, Inc. (AECI) have recently formed the Breckinridge Project and are currently conducting a process and economic feasibility study of a commercial scale facility to produce synthetic liquid fuels from coal. The coal conversion process to be used is the H-COAL process, which is in the pilot plant testing stage under the auspices of the US Department of Energy at the H-COAL Pilot Plant Project near Catlettsburg, Kentucky. The preliminary plans for the commercial plant are for a 18,140 metric ton/day (24,000 ton/day) nominal coal consumption capacity utilizing the abundant high sulfur Western Kentucky coals. The Western Kentucky area offers a source of the coal along with adequate water, power, labor, transportation and other factors critical to the successful siting of a plant. Various studies by federal and state governments, as well as private industry, have reached similar conclusions regarding the suitability of such plant sites in western Kentucky. Of the many individual sites evaluated, a site in Breckinridge County, Kentucky, approximately 4 kilometers (2.5 miles) west of the town of Stephensport, has been identified as the plant location. Actions have been taken to obtain options to ensure that this site will be available when needed. This report contains an overview of the regional setting and results of the baseline environmental studies. These studies include collection of data on ambient air and water quality, sound, aquatic and terrestrial biology and geology. This report contains the following chapters: introduction, review of significant findings, ambient air quality monitoring, sound, aquatic ecology, vegetation, wildlife, geology, soils, surface water, and ground water.

    2. Baseline risk assessment for exposure to contaminants at the St. Louis Site, St. Louis, Missouri

      SciTech Connect (OSTI)

      Not Available

      1993-11-01

      The St. Louis Site comprises three noncontiguous areas in and near St. Louis, Missouri: the St. Louis Downtown Site (SLDS), the St. Louis Airport Storage Site (SLAPS), and the Latty Avenue Properties. The main site of the Latty Avenue Properties includes the Hazelwood Interim Storage Site (HISS) and the Futura Coatings property, which are located at 9200 Latty Avenue. Contamination at the St. Louis Site is the result of uranium processing and disposal activities that took place from the 1940s through the 1970s. Uranium processing took place at the SLDS from 1942 through 1957. From the 1940s through the 1960s, SLAPS was used as a storage area for residues from the manufacturing operations at SLDS. The materials stored at SLAPS were bought by Continental Mining and Milling Company of Chicago, Illinois, in 1966, and moved to the HISS/Futura Coatings property at 9200 Latty Avenue. Vicinity properties became contaminated as a result of transport and movement of the contaminated material among SLDS, SLAPS, and the 9200 Latty Avenue property. This contamination led to the SLAPS, HISS, and Futura Coatings properties being placed on the National Priorities List (NPL) of the US Environmental Protection Agency (EPA). The US Department of Energy (DOE) is responsible for cleanup activities at the St. Louis Site under its Formerly Utilized Sites Remedial Action Program (FUSRAP). The primary goal of FUSRAP is the elimination of potential hazards to human health and the environment at former Manhattan Engineer District/Atomic Energy Commission (MED/AEC) sites so that, to the extent possible, these properties can be released for use without restrictions. To determine and establish cleanup goals for the St. Louis Site, DOE is currently preparing a remedial investigation/feasibility study-environmental impact statement (RI/FS-EIS). This baseline risk assessment (BRA) is a component of the process; it addresses potential risk to human health and the environment associated wi

    3. Level III baseline risk evaluation for Building 3505 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      Mostella, W.B. Jr.

      1994-12-01

      The Level III Baseline Risk Evaluation (BRE) for Building 3505, the ORNL Metal Recovery Facility, provides an analysis of the potential for adverse health effects, current or future, associated with the presence of hazardous substances in the building. The Metal Recovery Facility was used from 1952 through 1960 to process large quantities of radioactive material using the PUREX process for the recovery of uranium-238, plutonium-239, neptunium-237, and americium-241. The facility consists of seven process cells (A through G), a canal, a dissolver room, a dissolver pit, an office, locker room, storage area, control room, electrical gallery, shop, and makeup area. The cells were used to house the nuclear fuel reprocessing equipment, and the canal was constructed to be used as a water-shielded transfer canal. Currently, there are no known releases of radioactive contaminants from Building 3505. To perform the BRE, historical radiological survey data were used to estimate the concentration of alpha- and beta/gamma emitting radionuclides in the various cells, rooms, and other areas in Building 3505. Data from smear surveys were used to estimate the amount of transferable contamination (to which receptors can be exposed via inhalation and ingestion), and data from probe surveys were used to estimate the amount of both fixed and transferable contamination (from which receptors can receive external exposure). Two land use scenarios, current and future, and their subsequent exposure scenarios were explored in the BRE. Under the current land use scenario, two exposure scenarios were evaluated. The first was a worst-case industrial exposure scenario in which the receptor is a maintenance worker who works 8 hours/day, 350 days/year in the building for 25 years. In the second, more realistic exposure scenario, the receptor is a surveillance and maintenance (S&M) worker who spends two 8-hour days/year in the building for 25 years.

    4. M & V Shootout: Setting the Stage For Testing the Performance of New Energy Baseline

      SciTech Connect (OSTI)

      Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Granderson, Jessica; Jump, David; Taylor, Cody

      2015-07-01

      Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming data acquisition and often do not deliver results until years after the program period has ended. A spectrum of savings calculation approaches are used, with some relying more heavily on measured data and others relying more heavily on estimated or modeled data, or stipulated information. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. Energy management and information systems (EMIS) technologies, not only enable significant site energy savings, but are also beginning to offer M&V capabilities. This paper expands recent analyses of public-domain, whole-building M&V methods, focusing on more novel baseline modeling approaches that leverage interval meter data. We detail a testing procedure and metrics to assess the performance of these new approaches using a large test dataset. We also provide conclusions regarding the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods. Finally, we discuss the potential evolution of M&V to better support the energy efficiency industry through low-cost approaches, and the long-term agenda for validation of building energy analytics.
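
      Two goodness-of-fit metrics widely used to assess whole-building baseline models are the coefficient of variation of the root-mean-squared error, CV(RMSE), and the normalized mean bias error, NMBE. The sketch below implements common ASHRAE Guideline 14-style definitions as an illustration; the exact metrics and testing procedure in the paper may differ:

      import numpy as np

      def cv_rmse(measured, predicted, n_params=1):
          # CV(RMSE) in percent: RMSE normalized by the mean measured value.
          m, p = np.asarray(measured, float), np.asarray(predicted, float)
          rmse = np.sqrt(np.sum((m - p) ** 2) / (m.size - n_params))
          return 100.0 * rmse / m.mean()

      def nmbe(measured, predicted, n_params=1):
          # NMBE in percent: total bias normalized by the mean measured value.
          m, p = np.asarray(measured, float), np.asarray(predicted, float)
          return 100.0 * np.sum(m - p) / ((m.size - n_params) * m.mean())

      # Hypothetical interval kWh: metered values vs. baseline-model predictions.
      meas = [120.0, 135.0, 128.0, 150.0, 143.0]
      pred = [118.0, 139.0, 125.0, 149.0, 147.0]
      print(cv_rmse(meas, pred), nmbe(meas, pred))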

    5. Sandia Energy - High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Sandia Energy research area: Energy Research > Advanced Scientific Computing Research (ASCR) > High Performance Computing.

    6. Computing and Computational Sciences Directorate - Computer Science and

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Science and Mathematics Division - Awards Night 2012. R&D Leadership, Director Level - Winner: Brian Worley, Computational Sciences & Engineering Division. Citation: For exemplary program leadership of a successful and growing collaboration with the Department of Defense and for successfully initiating and providing oversight of a new data program with the Centers for Medicare and Medicaid Services. Technical Support - Winner: Michael Matheson, Organization:

    7. EA-2020: Energy Efficiency Standards for New Federal Low-Rise Residential Buildings’ Baseline Standards Update (RIN 1904-AD56)

      Broader source: Energy.gov [DOE]

      This EA will evaluate the potential environmental impacts of implementing the provisions in the Energy Conservation and Production Act (ECPA) that require DOE to update the baseline Federal energy efficiency performance standards for the construction of new Federal buildings, including low-rise residential buildings.

    8. Session on computation in biological pathways

      SciTech Connect (OSTI)

      Karp, P.D.; Riley, M.

      1996-12-31

      The papers in this session focus on the development of pathway databases and computational tools for pathway analysis. The discussion involves existing databases of sequenced genomes, as well as techniques for studying regulatory pathways.

    9. Scientific Cloud Computing Misconceptions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      July 1, 2011 - Part of the Magellan project was to understand both the possibilities and the limitations of cloud computing in the pursuit of science. At a recent conference, Magellan investigator Shane Canon outlined some persistent misconceptions about doing science in the cloud - and what Magellan has taught us about them.

    10. Edison Electrifies Scientific Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NERSC Flips Switch on New Flagship Supercomputer. January 31, 2014. Contact: Margie Wylie, mwylie@lbl.gov, +1 510 486 7421. The National Energy Research Scientific Computing (NERSC) Center recently accepted "Edison," a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of

    11. Energy Aware Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Dynamic Frequency Scaling: One means to lower the energy required to compute is to reduce the power usage on a node. One way to accomplish this is by lowering the frequency at which the CPU operates. However, reducing the clock speed increases the time to solution, creating a potential tradeoff. NERSC continues to examine how such methods impact its operations and its
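
      A back-of-the-envelope model makes the tradeoff concrete. If dynamic power scales roughly as f^3 (frequency times voltage squared, with voltage tracking frequency), static power stays constant, and time to solution scales as 1/f, then total energy is minimized at an intermediate frequency. The sketch below is a toy model with invented constants, not NERSC measurements:

      def energy_at_frequency(f_ghz, work_gcycles=1000.0,
                              dyn_coeff=10.0, static_watts=20.0):
          # Toy DVFS model: dynamic power ~ dyn_coeff * f**3, static power
          # constant, and time to solution inversely proportional to f.
          runtime_s = work_gcycles / f_ghz
          power_w = dyn_coeff * f_ghz ** 3 + static_watts
          return runtime_s * power_w   # energy in joules

      for f in (0.4, 0.8, 1.2, 1.6, 2.0, 2.4):
          print(f"{f:.1f} GHz -> {energy_at_frequency(f) / 1000.0:.1f} kJ")

      In this toy model the energy minimum falls near 1 GHz: below it, constant static power accumulates over the stretched runtime; above it, the cubic dynamic term dominates.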

    12. NERSC Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NERSC computer security efforts are aimed at protecting NERSC systems and its users' intellectual property from unauthorized access or modification. Among NERSC's security goals are: (1) to protect NERSC systems from unauthorized access; (2) to prevent the interruption of services to its users; and (3) to prevent misuse or abuse of NERSC resources. Security Incidents: If you think there has been a computer security incident you should contact NERSC Security as soon as

    13. Personal Computer Inventory System

      Energy Science and Technology Software Center (OSTI)

      1993-10-04

      PCIS is a database software system that is used to maintain a personal computer hardware and software inventory, track transfers of hardware and software, and provide reports.
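
      As an illustration of the kind of schema such a system implies (hypothetical; the record does not describe the actual PCIS design), a minimal version needs an asset table, a transfer log, and a report query:

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE asset (
              tag       TEXT PRIMARY KEY,   -- property tag number
              kind      TEXT,               -- 'hardware' or 'software'
              descr     TEXT,
              location  TEXT
          );
          CREATE TABLE transfer (
              tag       TEXT REFERENCES asset(tag),
              from_loc  TEXT,
              to_loc    TEXT,
              xfer_date TEXT                -- ISO date of the transfer
          );
      """)
      con.execute("INSERT INTO asset VALUES ('A-1001', 'hardware', 'desktop PC', 'Bldg 362')")
      con.execute("INSERT INTO transfer VALUES ('A-1001', 'Bldg 362', 'Bldg 221', '1993-10-04')")
      con.execute("UPDATE asset SET location = 'Bldg 221' WHERE tag = 'A-1001'")

      # Report: each asset's current location plus its transfer history.
      for row in con.execute("""
              SELECT a.tag, a.descr, a.location, t.from_loc, t.to_loc, t.xfer_date
              FROM asset a LEFT JOIN transfer t ON t.tag = a.tag"""):
          print(row)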

    14. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applied Computer Science: innovative co-design of applications, algorithms, and architectures in order to enable scientific simulations at extreme scale. Leadership: Group Leader ...

    15. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Creates next-generation leaders in HPC research and applications development. Contacts: Program Co-Leads Robert (Bob) Robey, Gabriel Rockefeller, and Hai Ah Nam; Professional Staff Assistant Nickole Aguilar Garcia, (505) 665-3048. 2016 Mentors: Bob Robey (XCP-2: Eulerian Codes) is a Research Scientist in the Eulerian Applications group at Los Alamos National Laboratory. He is the

    16. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Creates next-generation leaders in HPC research and applications development. Guide to Los Alamos: During your 10-week internship, we hope you have the opportunity to explore and enjoy Los Alamos and the surrounding area. Here are some

    17. Molecular Science Computing: 2010 Greenbook

      SciTech Connect (OSTI)

      De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

      2010-04-02

      This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

    18. Michael Levitt and Computational Biology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Michael Levitt, PhD, professor of structural biology at the Stanford University School of Medicine, has won the 2013 Nobel Prize in Chemistry. ... Levitt ... shares the ... prize with Martin Karplus ... and Arieh Warshel ... "for the development of multiscale models for complex chemical systems." Levitt's work focuses on

    19. 60 Years of Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      60 Years of Computing

    20. INITIAL COMPARISON OF BASELINE PHYSICAL AND MECHANICAL PROPERTIES FOR THE VHTR CANDIDATE GRAPHITE GRADES

      SciTech Connect (OSTI)

      Carroll, Mark C

      2014-09-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR) design, a graphite-moderated, helium-cooled configuration that is capable of producing thermal energy for power generation as well as process heat for industrial applications that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties in nuclear-grade graphites by providing comprehensive data that captures the level of variation in measured values. In addition to providing a thorough comparison between these values in different graphite grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons both in specific properties and in the associated variability between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between each of the grades of graphite that are considered candidate grades from four major international graphite producers. These particular grades (NBG-18, NBG-17, PCEA, IG-110, and 2114) are the major focus of the evaluations presently underway on irradiated graphite properties through the series of Advanced Graphite Creep (AGC) experiments. NBG-18, a medium-grain pitch coke graphite from SGL from which billets are formed via vibration molding, was the favored structural material in the pebble-bed configuration. NBG-17 graphite from SGL is essentially NBG-18 with the grain size reduced by a factor of two. PCEA, petroleum coke graphite from GrafTech with a similar grain size to NBG-17, is formed via an extrusion process and was initially considered the favored grade for the prismatic layout. IG-110 and 2114, from Toyo Tanso and Mersen (formerly Carbone Lorraine), respectively, are fine-grain grades