National Library of Energy BETA

Sample records for baseline computational model

  1. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

    SciTech Connect (OSTI)

    Deru, M.

    2011-02-01

    This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

2. Integrated Baseline System (IBS) Version 1.03: Models guide

    SciTech Connect (OSTI)

    Not Available

    1993-01-01

The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  3. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  4. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    SciTech Connect (OSTI)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over a period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way, the software vendor is not required to divulge or share proprietary information about how its software works, while stakeholders can still assess its performance.
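
    As a concrete illustration of scoring prediction quality without seeing the algorithm, the sketch below (our own; the report does not prescribe these function names or data) computes two goodness-of-fit statistics commonly used for baseline models, NMBE and CV(RMSE), from actual versus predicted energy use.

    ```python
    # Illustrative sketch: two accuracy metrics commonly used to score
    # baseline-model predictions against metered energy use.
    import numpy as np

    def nmbe(actual: np.ndarray, predicted: np.ndarray) -> float:
        """Normalized mean bias error, as a fraction of mean actual use."""
        return np.sum(actual - predicted) / (len(actual) * np.mean(actual))

    def cv_rmse(actual: np.ndarray, predicted: np.ndarray) -> float:
        """Coefficient of variation of the root-mean-square error."""
        rmse = np.sqrt(np.mean((actual - predicted) ** 2))
        return rmse / np.mean(actual)

    # Hypothetical monthly energy use (kWh) vs. a vendor model's predictions.
    actual = np.array([1200.0, 1150.0, 980.0, 1010.0, 1300.0, 1420.0])
    predicted = np.array([1185.0, 1170.0, 1002.0, 990.0, 1275.0, 1398.0])
    print(f"NMBE = {nmbe(actual, predicted):+.2%}")
    print(f"CV(RMSE) = {cv_rmse(actual, predicted):.2%}")
    ```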

  5. Theory, Modeling and Computation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    modeling and simulation will be enhanced not only by the wealth of data available from MaRIE but by the increased computational capacity made possible by the advent of extreme...

  6. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

    SciTech Connect (OSTI)

    University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

    2012-06-13

Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and to data filtering methods, as well as immediate potential for automating the selection of building occupancy modes.
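
    A minimal sketch of the shed calculation described above, under one common implementation choice (an hourly average over prior non-event days as the counterfactual); the data and function are hypothetical, and real implementations differ in exactly the ways the paper studies (weather data, occupancy detection, data alignment, filtering).

    ```python
    # Sketch: shed = counterfactual baseline minus observed event-day load.
    import numpy as np

    def average_baseline(prior_days_load: np.ndarray) -> np.ndarray:
        """prior_days_load: (n_days, 24) hourly kW for non-event days."""
        return prior_days_load.mean(axis=0)  # per-hour counterfactual, kW

    rng = np.random.default_rng(0)
    prior = 500 + 50 * rng.standard_normal((10, 24))  # 10 prior days, hourly kW
    event_day = 500 - 80 * (np.arange(24) >= 14)      # 80 kW curtailment after 2pm

    baseline = average_baseline(prior)
    shed = baseline - event_day                       # kW reduction per hour
    print("Mean shed during 2pm-6pm event: %.1f kW" % shed[14:18].mean())
    ```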

7. Performance Modeling for 3D Visualization in a Heterogeneous Computing Environment

    Office of Scientific and Technical Information (OSTI)

The visualization of large, remotely located data sets necessitates the development of a distributed computing pipeline in order to reduce the data, in stages, to a manageable size. The required baseline infrastructure for launching such...

  8. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2013-05-15

The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range, with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  9. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range, with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  10. Results from baseline tests of the SPRE I and comparison with code model predictions

    SciTech Connect (OSTI)

    Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

    1994-09-01

The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high-capacity space power. This paper presents results of baseline engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

  11. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  12. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  13. LANL computer model boosts engine efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

The KIVA model has been instrumental in helping researchers and manufacturers understand...

  14. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

    SciTech Connect (OSTI)

    Maurakis, Eugene G

    2010-10-01

Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts related to climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people's awareness, knowledge and understanding of climate change and impacts on watersheds in a laboratory experience and interactive exhibits, through internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity accurately during most of the six thermal seasons where sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately teasing apart local human influences (e.g., urbanization and watershed alteration) from global anthropogenic impacts (e.g., climate change) on watersheds. Effective and skillful translations (e.g., annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) of results of scientific investigations are valuable ways of communicating information to the general public to enhance their understanding of climate change and its effects in watersheds.
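
    For readers unfamiliar with the metrics being modeled, the sketch below computes species richness and one standard diversity index (Shannon's H'); the report does not specify which indices it used, and the counts here are invented.

    ```python
    # Sketch: species richness and Shannon diversity from community counts.
    import math

    counts = {"blacknose dace": 42, "creek chub": 17, "white sucker": 8,
              "bluegill": 5, "green sunfish": 3}

    richness = len(counts)  # number of species observed
    total = sum(counts.values())
    # Shannon index H' = -sum(p_i * ln p_i) over species proportions p_i
    shannon = -sum((n / total) * math.log(n / total) for n in counts.values())

    print(f"Richness S = {richness}, Shannon H' = {shannon:.3f}")
    ```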

  15. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Simple computer models unravel genetic stress reactions in cells. Integrated biological and...

  16. Modeling of Electric Water Heaters for Demand Response: A Baseline PDE Model

    SciTech Connect (OSTI)

    Xu, Zhijie; Diao, Ruisheng; Lu, Shuai; Lian, Jianming; Zhang, Yu

    2014-09-05

Demand response (DR) control can effectively relieve balancing and frequency regulation burdens on conventional generators, facilitate integrating more renewable energy, and reduce the generation and transmission investments needed to meet peak demands. Electric water heaters (EWHs) have great potential for implementing DR control strategies because: (a) EWH power consumption has a high correlation with daily load patterns; (b) they constitute a significant percentage of domestic electrical load; (c) the heating element is a resistor, without reactive power consumption; and (d) they can be used as energy storage devices when needed. Accurately modeling the dynamic behavior of EWHs is essential for designing DR controls. Various water heater models, simplified to different extents, have been published in the literature; however, few of them were validated against field measurements, which may result in inaccuracy when implementing DR controls. In this paper, a partial differential equation (PDE) physics-based model, developed to capture detailed temperature profiles at different tank locations, is validated against field test data for more than 10 days. The developed model shows very good performance in capturing water thermal dynamics for benchmark testing purposes.
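
    As orientation (a generic sketch, not necessarily the paper's exact formulation), a one-dimensional tank model of this kind typically solves an energy balance for the water temperature T(z, t) at tank height z:

    $$\rho c_p A \,\frac{\partial T}{\partial t} \;=\; k A \,\frac{\partial^2 T}{\partial z^2} \;-\; \dot{m} c_p \,\frac{\partial T}{\partial z} \;+\; q_{\text{heater}}(z,t) \;-\; U P \,\bigl(T - T_{\text{amb}}\bigr)$$

    where A and P are the tank cross-section and perimeter, k the thermal conductivity of water, $\dot{m}$ the draw-induced mass flow, $q_{\text{heater}}$ the element's heat input per unit height, and U the heat-loss coefficient to the ambient temperature $T_{\text{amb}}$.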

  17. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

    SciTech Connect (OSTI)

    Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

    2012-06-01

The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, when combined with lower-energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.
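
    To put the harmonized productivity in perspective, simple unit arithmetic (ours, not the report's) shows how the RA-based figure roughly halves the annual feedstock produced per unit of pond area:

    $$13\ \tfrac{\mathrm{g}}{\mathrm{m}^2\,\mathrm{d}} \times 365\ \mathrm{d} \;\approx\; 4.7\ \tfrac{\mathrm{kg}}{\mathrm{m}^2\,\mathrm{yr}} \;\approx\; 47\ \tfrac{\mathrm{t}}{\mathrm{ha}\,\mathrm{yr}}, \qquad 25\ \tfrac{\mathrm{g}}{\mathrm{m}^2\,\mathrm{d}} \times 365\ \mathrm{d} \;\approx\; 91\ \tfrac{\mathrm{t}}{\mathrm{ha}\,\mathrm{yr}}$$

    which is consistent with the markedly higher per-gallon costs and emissions of the integrated baseline.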

  18. Cupola Furnace Computer Process Model

    SciTech Connect (OSTI)

    Seymour Katz

    2004-12-31

The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook," Chapter 27, American Foundry Society, Des Plaines, IL (1999).

19. Description of Model Data for SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine Blade

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

D. Todd Griffith, Brian R. Resor, Sandia National Laboratories, Wind and Water Power Technologies Department. This document provides a brief description of model files that are available for the SNL100-00 blade [1]. For each file, codes used to create/read the model files are detailed (e.g., code version and date, description, etc.). A summary of the blade model data is also provided from the design report [1].

  20. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    SciTech Connect (OSTI)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
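
    Research question 3 (pre-screening buildings for model-predictability) can be pictured with the toy sketch below; the regression form and train/test split are our own assumptions, not the report's protocol.

    ```python
    # Sketch: score each building's predictability by fitting a simple
    # temperature regression on a training window and measuring
    # out-of-sample CV(RMSE); keep buildings whose error is lowest.
    import numpy as np

    def screen(temps, loads, split=0.75):
        """temps, loads: 1-D arrays for one building. Returns CV(RMSE)."""
        n = int(len(loads) * split)
        # Ordinary least squares: load ~ a + b * outdoor temperature
        A = np.column_stack([np.ones(n), temps[:n]])
        coef, *_ = np.linalg.lstsq(A, loads[:n], rcond=None)
        pred = coef[0] + coef[1] * temps[n:]
        rmse = np.sqrt(np.mean((loads[n:] - pred) ** 2))
        return rmse / np.mean(loads[n:])

    rng = np.random.default_rng(1)
    t = 15 + 10 * rng.random(365)                         # daily mean temps
    predictable = 200 + 8 * t + 5 * rng.standard_normal(365)
    erratic = 200 + 8 * t + 120 * rng.standard_normal(365)
    print("predictable building CV(RMSE): %.1f%%" % (100 * screen(t, predictable)))
    print("erratic building CV(RMSE):     %.1f%%" % (100 * screen(t, erratic)))
    ```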

  1. Bayesian approaches for combining computational model output...

    Office of Scientific and Technical Information (OSTI)

Bayesian approaches for combining computational model output and physical observations. Authors: Higdon, David M.; Lawrence, Earl; Heitmann, Katrin; Habib, Salman...

  2. Computable General Equilibrium Models for Sustainability Impact...

    Open Energy Info (EERE)

Publications, software, and modeling tools. User Interface: Other. Website: iatools.jrc.ec.europa.eu/docs/ecolecon2006.pdf. Computable General Equilibrium Models for Sustainability...

  3. Section 23: Models and Computer Codes

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Application-2014 for the Waste Isolation Pilot Plant, Models and Computer Codes (40 CFR 194.23), United States Department of Energy, Waste Isolation Pilot Plant, Carlsbad Field...

  4. Climate Modeling using High-Performance Computing

    SciTech Connect (OSTI)

    Mirin, A A

    2007-02-05

    The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

  5. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Simple computer models unravel genetic stress reactions in cells. Integrated biological and computational methods provide insight into why genes are activated. February 8, 2013. This molecular structure depicts a yeast transfer ribonucleic acid (tRNA), which carries a single amino acid to the ribosome during protein construction. A combined...

  6. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Simple computer models unravel genetic stress reactions in cells. Integrated biological and computational methods provide insight into why genes are activated. February 8, 2013. This molecular structure depicts a yeast transfer ribonucleic acid (tRNA), which carries a single amino acid to the ribosome during protein construction. A combined experimental and...

  7. Low Mach Number Models in Computational Astrophysics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Ann Almgren, Berkeley Lab. February 4, 2014. Downloads: Almgren-nug2014.pdf (Adobe Acrobat PDF file), Low Mach Number Models in Computational Astrophysics.

  8. Appendix A - GPRA06 benefits estimates: MARKAL and NEMS model baseline cases

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    NEMS is an integrated energy model of the U.S. energy system developed by the Energy Information Administration (EIA) for forecasting and policy analysis purposes.

  9. LANL computer model boosts engine efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

The KIVA model has been instrumental in helping researchers and manufacturers understand combustion processes, accelerate engine development and improve engine design and efficiency. September 25, 2012. KIVA simulation of an experimental engine with DOHC quasi-symmetric pent-roof combustion chamber and 4 valves.

  10. CUPOLA FURNACE COMPUTER PROCESS MODEL

    Office of Scientific and Technical Information (OSTI)

    ... p 809 (1995) 25. Clark D., Moore K., Stanek V., Katz S.: Neural network ... E. D., Clark D. E., Moore K. L.: AFS cupola model verification - initial investigations. ...

  11. Computer Model Buildings Contaminated with Radioactive Material

    Energy Science and Technology Software Center (OSTI)

    1998-05-19

    The RESRAD-BUILD computer code is a pathway analysis model designed to evaluate the potential radiological dose incurred by an individual who works or lives in a building contaminated with radioactive material.

  12. Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering

    SciTech Connect (OSTI)

    Smith, Kandler; Graf, Peter; Jun, Myungsoo; Yang, Chuanbo; Li, Genong; Li, Shaoping; Hochman, Amit; Tselepidakis, Dimitrios

    2015-06-09

    This presentation provides an update on improvements in computational efficiency in a nonlinear multiscale battery model for computer aided engineering.

  13. Low Mach Number Models in Computational Astrophysics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Ann Almgren, Center for Computational Sciences and Engineering, Lawrence Berkeley National Laboratory. NUG 2014: NERSC@40, February 4, 2014. Collaborators: John Bell, Chris Malone, Andy Nonaka, Stan Woosley, Michael Zingale. We often associate astrophysics with explosive phenomena: novae, supernovae, gamma-ray bursts, X-ray bursts, Type Ia supernovae...

  14. Hydropower Baseline Cost Modeling

    SciTech Connect (OSTI)

    O'Connor, Patrick W.; Zhang, Qin Fen; DeNeale, Scott T.; Chalise, Dol Raj; Centurion, Emma E.

    2015-01-01

Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in the powering of existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and the construction of new pumped storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost-estimating tools that can support the national-scale evaluation of hydropower resources.
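
    Illustratively (the functional form and symbols here are generic placeholders, not the report's published curves), cost-estimating tools of this kind often express initial capital cost as a power law in plant parameters:

    $$\mathrm{ICC} \;=\; a\,P^{\,b}\,H^{\,c}$$

    where P is installed capacity, H is hydraulic head, and a, b, c are coefficients fitted to historical project costs; with b < 1, unit cost ($/kW) declines as capacity grows, which is why such curves support national-scale screening of many small sites.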

  15. MaRIE theory, modeling and computation roadmap executive summary...

    Office of Scientific and Technical Information (OSTI)

Conference: MaRIE theory, modeling and computation roadmap executive summary...

  16. Computational Fluid Dynamics Modeling of Diesel Engine Combustion...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Computational Fluid Dynamics Modeling of Diesel Engine Combustion and Emissions. 2005 Diesel Engine...

  17. Computational flow modeling of a simplified integrated tractor...

    Office of Scientific and Technical Information (OSTI)

Computational flow modeling of a simplified integrated tractor-trailer geometry.

  18. Computer modeling of the global warming effect

    SciTech Connect (OSTI)

    Washington, W.M.

    1993-12-31

The state of knowledge of global warming will be presented and two aspects examined: observational evidence and a review of the state of computer modeling of climate change due to anthropogenic increases in greenhouse gases. Observational evidence, indeed, shows global warming, but it is difficult to prove that the changes are unequivocally due to the greenhouse-gas effect. Although observational measurements of global warming are subject to "correction," researchers are showing consistent patterns in their interpretation of the data. Since the 1960s, climate scientists have been making their computer models of the climate system more realistic. Models started as atmospheric models and, through the addition of oceans, surface hydrology, and sea-ice components, they then became climate-system models. Because of computer limitations and the limited understanding of the degree of interaction of the various components, present models require substantial simplification. Nevertheless, in their present state of development climate models can reproduce most of the observed large-scale features of the real system, such as wind, temperature, precipitation, ocean current, and sea-ice distribution. The use of supercomputers to advance the spatial resolution and realism of earth-system models will also be discussed.

  19. MHK Reference Model: Relevance to Computer Simulation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Diana Bull, Sandia National Laboratories, July 9, 2012. SAND Number: 2012-5508P. Reference Model Partners: Oregon State University/NNMREC, University of Washington, St. Anthony Falls Laboratory-UMinn, Florida Atlantic University/SNMREC, Cardinal Engineering. Topics: WEC design; operational waves profile; design of WEC for performance; structural design of WEC; PTO design; survival waves; structural design of WEC for survivability; brake design; anchor and mooring...

  20. Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering (Presentation)

    SciTech Connect (OSTI)

    Kim, G.; Pesaran, A.; Smith, K.; Graf, P.; Jun, M.; Yang, C.; Li, G.; Li, S.; Hochman, A.; Tselepidakis, D.; White, J.

    2014-06-01

    This presentation discusses the significant enhancement of computational efficiency in nonlinear multiscale battery model for computer aided engineering in current research at NREL.

  1. Wild Fire Computer Model Helps Firefighters

    ScienceCinema (OSTI)

    Canfield, Jesse

    2014-06-02

    A high-tech computer model called HIGRAD/FIRETEC, the cornerstone of a collaborative effort between U.S. Forest Service Rocky Mountain Research Station and Los Alamos National Laboratory, provides insights that are essential for front-line fire fighters. The science team is looking into levels of bark beetle-induced conditions that lead to drastic changes in fire behavior and how variable or erratic the behavior is likely to be.

  2. COMPUTATIONAL MODELING OF CIRCULATING FLUIDIZED BED REACTORS

    SciTech Connect (OSTI)

    Ibrahim, Essam A

    2013-01-09

Details of numerical simulations of two-phase gas-solid turbulent flow in the riser section of a Circulating Fluidized Bed Reactor (CFBR) using the Computational Fluid Dynamics (CFD) technique are reported. Two CFBR riser configurations are considered and modeled. Each of these two riser models consists of an inlet, an exit, connecting elbows, and a main pipe. Both riser configurations are cylindrical and have the same diameter but differ in their inlet lengths and main pipe height, to enable investigation of riser geometrical scaling effects. In addition, two types of solid particles are used in the solid phase of the two-phase gas-solid riser flow simulations to study the influence of solid loading ratio on flow patterns. The gaseous phase in the two-phase flow is represented by standard atmospheric air. The CFD-based FLUENT software is employed to obtain steady state and transient solutions for flow in the riser. The physical dimensions, types and numbers of computational meshes, and solution methodology utilized in the present work are stated. Flow parameters, such as static and dynamic pressure, species velocity, and volume fractions, are monitored and analyzed. The differences in the computational results between the two models, under steady and transient conditions, are compared, contrasted, and discussed.

  3. Review of the synergies between computational modeling and experimenta...

    Office of Scientific and Technical Information (OSTI)

Accepted Manuscript: Review of the synergies between computational modeling and ... November 16, 2016.

  4. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  5. Modeling-Computer Simulations At Northern Basin & Range Region...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Northern Basin & Range Region (Pritchett, 2004). Exploration Activity...

  6. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Pritchett, 2004). Exploration...

  7. Modeling-Computer Simulations At Geysers Area (Goff & Decker...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Geysers Area (Goff & Decker, 1983). Exploration Activity Details...

  8. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Wisian & Blackwell, 2004). Exploration...

  9. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1980). Exploration Activity Details...

  10. Modeling-Computer Simulations (Lewicki & Oldenburg, 2004) | Open...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations (Lewicki & Oldenburg, 2004). Exploration Activity Details Location...

  11. Modeling-Computer Simulations At Desert Peak Area (Wisian & Blackwell...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Desert Peak Area (Wisian & Blackwell, 2004). Exploration Activity...

  12. Modeling-Computer Simulations (Combs, Et Al., 1999) | Open Energy...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations (Combs, Et Al., 1999). Exploration Activity Details Location Unspecified...

  13. Modeling-Computer Simulations At Yellowstone Region (Laney, 2005...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Yellowstone Region (Laney, 2005). Exploration Activity Details Location...

  14. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1979). Exploration Activity Details...

  15. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1977). Exploration Activity Details...

  16. Modeling-Computer Simulations (Ozkocak, 1985) | Open Energy Informatio...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations (Ozkocak, 1985). Exploration Activity Details Location Unspecified...

  17. Modeling-Computer Simulations At White Mountains Area (Goff ...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At White Mountains Area (Goff & Decker, 1983). Exploration Activity...

  18. Modeling-Computer Simulations At Stillwater Area (Wisian & Blackwell...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Stillwater Area (Wisian & Blackwell, 2004). Exploration Activity...

  19. Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal Area (Wilt & Haar, 1986)...

  20. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Kennedy & Soest, 2006). Exploration...

  1. Modeling-Computer Simulations (Ranalli & Rybach, 2005) | Open...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations (Ranalli & Rybach, 2005). Exploration Activity Details Location...

  2. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1983). Exploration Activity Details...

  3. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in experiments, special experimental methods were devised to create similar boundary conditions in the iron films. Preliminary MFM studies conducted on single and polycrystalline iron films with small sub-areas created with focused ion beam have correlated quite well qualitatively with phase-field simulations. However, phase-field model dimensions are still small relative to experiments thus far. We are in the process of increasing the size of the models and decreasing specimen size so both have identical dimensions. Ongoing research is focused on validation of the phase-field model. Validation is being accomplished through comparison with experimentally obtained MFM images (in progress), and planned measurements of major hysteresis loops and first order reversal curves. Extrapolation of simulation sizes to represent a more stochastic bulk-like system will require sampling of various simulations (i.e., with single non-magnetic defect, single magnetic defect, single grain boundary, single dislocation, etc.) with distributions of input parameters. These outputs can then be compared to laboratory magnetic measurements and ultimately to simulate magnetic Barkhausen noise signals.
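
    For reference, the Landau-Lifshitz-Gilbert equation on which the report says these phase-field models are based has the standard form

    $$\frac{\partial \mathbf{M}}{\partial t} \;=\; -\gamma\,\mathbf{M}\times\mathbf{H}_{\mathrm{eff}} \;+\; \frac{\alpha}{M_s}\,\mathbf{M}\times\frac{\partial \mathbf{M}}{\partial t}$$

    where M is the magnetization, H_eff the effective field (typically exchange, anisotropy, magnetostatic, and applied-field contributions), γ the gyromagnetic ratio, α the Gilbert damping constant, and M_s the saturation magnetization.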

  4. Hazard Baseline Documentation

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1995-12-04

    This standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and non-radiological hazards for all EM facilities.

5. Caterpillar and Cummins Gain Edge Through Argonne's Rare Computer Modeling

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Caterpillar and Cummins Gain Edge Through Argonne's Rare Computer Modeling and Analysis Resources, Argonne National Laboratory. A private industry success story. PDF: cat_cummins_computing_success_story_dec_...

  6. Grocery 2009 TSD Miami Baseline | Open Energy Information

    Open Energy Info (EERE)

Model Name: Grocery 2009 TSD Miami Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1 2004; Model Year: 2009; IDF file...

  7. Grocery 2009 TSD Chicago Baseline | Open Energy Information

    Open Energy Info (EERE)

Model Name: Grocery 2009 TSD Chicago Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1 2004; Model Year: 2009; IDF file...

  8. MaRIE theory, modeling and computation roadmap executive summary

    Office of Scientific and Technical Information (OSTI)

The confluence of MaRIE (Matter-Radiation Interactions in Extreme) and extreme (exascale) computing timelines offers a unique opportunity in co-designing the elements of materials discovery, with theory and high performance computing, itself co-designed by constrained...

  9. Predictive Capability Maturity Model for computational modeling and simulation.

    SciTech Connect (OSTI)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
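
    A hedged sketch of how such an assessment could be recorded (our own encoding; the six element names are the report's, the scoring container is not):

    ```python
    # Sketch: record a PCMM assessment as maturity levels (0-3) for the
    # six contributing elements named in the report.
    ELEMENTS = [
        "representation and geometric fidelity",
        "physics and material model fidelity",
        "code verification",
        "solution verification",
        "model validation",
        "uncertainty quantification and sensitivity analysis",
    ]

    def assessment_table(scores: dict) -> str:
        """Render one line per element: name and assessed maturity level."""
        return "\n".join(f"{name:<55} level {scores.get(name, 0)}"
                         for name in ELEMENTS)

    scores = {name: 1 for name in ELEMENTS}  # e.g., most elements at level 1
    scores["code verification"] = 2          # one element assessed higher
    print(assessment_table(scores))
    ```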

  10. Review of computational thermal-hydraulic modeling

    SciTech Connect (OSTI)

    Keefer, R.H.; Keeton, L.W.

    1995-12-31

Corrosion of heat transfer tubing in nuclear steam generators has been a persistent problem in the power generation industry, assuming many different forms over the years depending on chemistry and operating conditions. Whatever the corrosion mechanism, a fundamental understanding of the process is essential to establish effective management strategies. To gain this fundamental understanding requires an integrated investigative approach that merges technology from many diverse scientific disciplines. An important aspect of an integrated approach is characterization of the corrosive environment at high temperature. This begins with a thorough understanding of local thermal-hydraulic conditions, since they affect deposit formation, chemical concentration, and ultimately corrosion. Computational Fluid Dynamics (CFD) can and should play an important role in characterizing the thermal-hydraulic environment and in predicting the consequences of that environment. The evolution of CFD technology now allows accurate calculation of steam generator thermal-hydraulic conditions and the resulting sludge deposit profiles. Similar calculations are also possible for model boilers, so that tests can be designed to be prototypic of the heat exchanger environment they are supposed to simulate. This paper illustrates the utility of CFD technology by way of examples in each of these two areas. This technology can be further extended to produce more detailed local calculations of the chemical environment in support plate crevices, beneath thick deposits on tubes, and deep in tubesheet sludge piles. Knowledge of this local chemical environment will provide the foundation for development of mechanistic corrosion models, which can be used to optimize inspection and cleaning schedules and focus the search for a viable fix.

  11. Modeling of Geothermal Reservoirs: Fundamental Processes, Computer...

    Open Energy Info (EERE)

Journal Article: Modeling of Geothermal Reservoirs: Fundamental Processes, Computer Simulation and Field Applications...

  12. Modeling-Computer Simulations At Fish Lake Valley Area (Deymonaz...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Fish Lake Valley Area (Deymonaz, Et Al., 2008)...

  13. Martin Karplus and Computer Modeling for Chemical Systems

    Office of Scientific and Technical Information (OSTI)

Additional information about Martin Karplus, computer modeling, and chemical systems is available in electronic documents and on the Web. Documents: Comparison of 3D...

  14. Modeling-Computer Simulations At Nevada Test And Training Range...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Nevada Test And Training Range Area (Sabin, Et Al., 2004)...

  15. New partnership uses advanced computer science modeling to address...

    National Nuclear Security Administration (NNSA)

New partnership uses advanced computer science modeling to address climate change | National Nuclear Security Administration...

  16. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Wannamaker, Et Al., 2006). Exploration...

  17. Modeling-Computer Simulations At Obsidian Cliff Area (Hulen,...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Obsidian Cliff Area (Hulen, Et Al., 2003). Exploration Activity Details...

  18. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Laney, 2005). Exploration...

  19. Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal Area (Roberts, Et Al., 1995)...

  20. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Pribnow, Et Al., 2003)...

  1. Modeling-Computer Simulations At Hawthorne Area (Lazaro, Et Al...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Hawthorne Area (Lazaro, Et Al., 2010). Exploration Activity Details...

  2. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Pritchett, 2004). Exploration...

  3. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area (Brown & DuTeaux, 1997). Exploration...

  4. Modeling-Computer Simulations At Coso Geothermal Area (1980)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (1980). Exploration Activity Details. Location: Coso...

  5. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Newman, Et Al., 2006). Exploration...

  6. Scientists use world's fastest computer to model materials under...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Materials scientists are for the first time attempting to...

  7. Modeling-Computer Simulations At The Needles Area (Bell & Ramelli...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At The Needles Area (Bell & Ramelli, 2009). Exploration Activity Details...

  8. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area (Goff & Decker, 1983). Exploration...

  9. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Farrar, Et Al., 2003). Exploration...

  10. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Biasi, Et Al., 2009). Exploration...

  11. Modeling-Computer Simulations At Valles Caldera - Sulphur Springs...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Sulphur Springs Geothermal Area (Roberts, Et Al.,...

  12. Modeling-Computer Simulations At Nw Basin & Range Region (Pritchett...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Nw Basin & Range Region (Pritchett, 2004). Exploration Activity Details...

  13. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Tempel, Et Al., 2011). Exploration...

  14. Modeling-Computer Simulations At Nw Basin & Range Region (Biasi...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Nw Basin & Range Region (Biasi, Et Al., 2009). Exploration Activity...

  15. Modeling-Computer Simulations At Coso Geothermal Area (2000)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (2000). Exploration Activity Details. Location: Coso...

  16. Modeling-Computer Simulations At Northern Basin & Range Region...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Northern Basin & Range Region (Biasi, Et Al., 2009). Exploration...

  17. Modeling-Computer Simulations At Valles Caldera - Sulphur Springs...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Sulphur Springs Geothermal Area (Wilt & Haar, 1986)...

  18. Computer Modeling of Chemical and Geochemical Processes in High...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computer modeling of chemical and geochemical processes in high ionic strength solutions is a unique capability within Sandia's Defense Waste Management Programs located in...

  19. Modeling-Computer Simulations At Akutan Fumaroles Area (Kolker...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Akutan Fumaroles Area (Kolker, Et Al., 2010). Exploration Activity...

  20. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Biasi, Et Al., 2009). Exploration...

  1. Modeling-Computer Simulations At Coso Geothermal Area (1999)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (1999). Exploration Activity Details. Location: Coso...

  2. Unsolicited Projects in 2012: Research in Computer Architecture, Modeling,

    Office of Science (SC) Website

    Research in Computer Architecture, Modeling, and Evolving MPI for Exascale | U.S. DOE Office of Science (SC), Advanced Scientific Computing Research (ASCR).

  3. Ambient temperature modelling with soft computing techniques

    SciTech Connect (OSTI)

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni; De Felice, Matteo

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
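
    A minimal sketch of the hybrid scheme described above: a few backpropagation steps seed part of a genetic algorithm's population before evolution proceeds. The network size, GA settings, and synthetic temperature data are all illustrative assumptions, not the paper's setup.

```python
# Hybrid BP+GA training sketch: backpropagation pre-trains a couple of
# individuals, then a simple elitist GA evolves the whole population.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 3))                      # scaled site/month features
y = 10 + 8 * np.sin(2 * np.pi * X[:, 2]) + 5 * X[:, 0] - 3 * X[:, 1]  # synthetic temps

H = 8                                                # hidden units
n_w = 3 * H + H + H + 1                              # W1, b1, W2, b2 flattened

def unpack(w):
    return (w[:3 * H].reshape(3, H), w[3 * H:4 * H], w[4 * H:5 * H], w[5 * H])

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    return float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

def bp_steps(w, steps=300, lr=0.05):
    """Plain gradient descent (backpropagation) on the MSE."""
    for _ in range(steps):
        W1, b1, W2, b2 = unpack(w)
        h = np.tanh(X @ W1 + b1)
        r = 2.0 * (h @ W2 + b2 - y) / len(y)         # dMSE/dprediction
        dh = np.outer(r, W2) * (1.0 - h ** 2)        # backprop through tanh
        g = np.concatenate([(X.T @ dh).ravel(), dh.sum(0), h.T @ r, [r.sum()]])
        w = w - lr * g
    return w

POP, GEN = 30, 40
pop = [rng.normal(0.0, 0.5, n_w) for _ in range(POP)]
pop[0], pop[1] = bp_steps(pop[0]), bp_steps(pop[1])  # BP initialises a few individuals

for _ in range(GEN):
    pop.sort(key=mse)                                # elitism: keep the best half
    parents, children = pop[:POP // 2], []
    while len(parents) + len(children) < POP:
        a, b = rng.choice(len(parents), 2, replace=False)
        mask = rng.random(n_w) < 0.5                 # uniform crossover
        children.append(np.where(mask, parents[a], parents[b])
                        + rng.normal(0.0, 0.05, n_w))  # Gaussian mutation
    pop = parents + children

print("best MSE:", mse(min(pop, key=mse)))
```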

  4. Develop baseline computational model for proactive welding stress management to suppress helium induced cracking during weld repair

    Broader source: Energy.gov [DOE]

    There are over 100 nuclear power plants operating in the U.S., which generate approximately 20% of the nation’s electricity. These plants range from 15 to 40 years old. Extending the service lives...

  5. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  6. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  7. Computational Fluid Dynamics Modeling of Diesel Engine Combustion and

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2005 Diesel Engine Emissions Reduction (DEER) Conference Presentations and Posters: 2005_deer_reitz.pdf. Related documents: Experiments and Modeling of Two-Stage Combustion in Low-Emissions Diesel Engines; Comparison of Conventional Diesel and Reactivity Controlled Compression...

  8. Modeling-Computer Simulations | Open Energy Information

    Open Energy Info (EERE)

    the risk of inaccurate predictions. Potential Pitfalls: Uncertainties in initial reservoir conditions and other model inputs can cause inaccuracies in simulations, which...

  9. Scientists model brain structure to help computers recognize...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Do you see what I see? Scientists model brain structure to help computers recognize ... Introspectively, we know that the human brain solves this problem very well. We only have ...

  10. Computational Modeling for the American Chemical Society | GE...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  11. Computational model of miniature pulsating heat pipes.

    SciTech Connect (OSTI)

    Martinez, Mario J.; Givler, Richard C.

    2013-01-01

    The modeling work described herein represents Sandia National Laboratories' (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground-plane (TGP): a planar device that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat-plate pulsating heat pipes (e.g., dynamic bubble nucleation, evaporation, and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, as demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio, and orientation.

  12. Hierarchical calibration of computer models (Conference) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

  13. Towards a Computational Model of a Methane Producing Archaeum (Journal

    Office of Scientific and Technical Information (OSTI)

    Journal Article (SciTech Connect): Towards a Computational Model of a Methane Producing Archaeum. Authors: Peterson, Joseph R.; Labhsetwar, Piyush (ORCID 0000-0001-5933-3609); Ellermeier, Jeremy...

  14. Bayesian approaches for combining computational model output and physical

    Office of Scientific and Technical Information (OSTI)

    Conference (SciTech Connect): Bayesian approaches for combining computational model output and physical observations. Authors: Higdon, David M. and Lawrence, Earl (Los Alamos National Laboratory); Heitmann, Katrin and Habib, Salman (ANL). Publication date: 2011-07-25. OSTI Identifier: 1084581. Report Number(s):

  15. Cielo Computational Environment Usage Model With Mappings to ACE

    Office of Scientific and Technical Information (OSTI)

    Technical Report (SciTech Connect): Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release, Version 1.1.

  17. HIV virus spread and evolution studied through computer modeling

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    This approach distinguishes between susceptible and infected individuals to capture the full infection history, including contact tracing data for infected individuals. November 19, 2013. [Image: scanning electron micrograph of HIV-1 budding (in green) from cultured lymphocytes, colored to highlight important features.]

  18. Computer modeling reveals how surprisingly potent hepatitis C drug works

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    A study reveals how daclatasvir targets one of the virus's proteins and causes the fastest viral decline ever seen with anti-HCV drugs - within 12 hours of treatment. February 19, 2013.

  19. Computationally Efficient Modeling of High-Efficiency Clean Combustion

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2012 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting: ace012_flowers_2012_o.pdf. Related documents: Computationally Efficient Modeling of High-Efficiency Clean Combustion Engines; Simulation of High Efficiency Clean Combustion Engines and Detailed Chemical Kinetic Mechanisms Development

  20. Use Computational Model to Design and Optimize Welding Conditions to

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Today, welding is widely used for repair, maintenance, and upgrade of nuclear reactor components. As a critical technology to extend the service life of nuclear power plants beyond 60 years, weld technology must be

  1. Review of the synergies between computational modeling and experimental

    Office of Scientific and Technical Information (OSTI)

    Accepted Manuscript (DOE PAGES): Review of the synergies between computational modeling and experimental characterization of materials across length scales. This content will become publicly available on November 16, 2016. With the increasing interplay between experimental and

  2. Towards a Computational Model of a Methane Producing Archaeum (Journal

    Office of Scientific and Technical Information (OSTI)

    Journal Article (DOE PAGES): Towards a Computational Model of a Methane Producing Archaeum. Authors: Peterson, Joseph R.; Labhsetwar, Piyush (ORCID 0000-0001-5933-3609); Ellermeier, Jeremy R.; Kohler, Petra R. A.; Jain, Ankur...

  3. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this Report provides a summary overview of DOE's projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS), which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  4. NASA technical baseline

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  5. New Computer Model Pinpoints Prime Materials for Carbon Capture

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    July 17, 2012. NERSC contact: Linda Vu, lvu@lbl.gov, +1 510 495 2402. UC Berkeley contact: Robert Sanders, rsanders@berkeley.edu. [Image: one of the 50 best zeolite structures for capturing carbon dioxide. Zeolite is a porous solid made of silicon dioxide, or quartz; in the model, the red balls are oxygen, the tan balls are silicon. The blue-green area is where...]

  6. Annual Technology Baseline

    Broader source: Energy.gov [DOE]

    The National Renewable Energy Laboratory is conducting a study sponsored by the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs) and a diverse set of potential futures (standard scenarios), initially for electric-sector analysis. The primary product of the Annual Technology Baseline (ATB) project includes detailed cost and performance data (both current and projected) for both renewable and conventional technologies. The data are presented in MS Excel.

  7. Integrated Multiscale Modeling of Molecular Computing Devices

    SciTech Connect (OSTI)

    Weinan E

    2012-03-29

    The main bottleneck in modeling transport in molecular devices is to develop the correct formulation of the problem and efficient algorithms for analyzing the electronic structure and dynamics using, for example, time-dependent density functional theory. We have divided this task into several steps. The first step is developing the right mathematical formulation and numerical algorithms for analyzing the electronic structure using density functional theory. The second step is to study time-dependent density functional theory, particularly the far-field boundary conditions. The third step is to study electronic transport in molecular devices. We are now at the end of the first step. Under DOE support, we have made substantial progress in developing linear scaling and sub-linear scaling algorithms for electronic structure analysis. Although there has been a huge amount of effort in the past on developing linear scaling algorithms, most of the algorithms developed suffer from a lack of robustness and controllable accuracy. We have made the following progress: (1) We have analyzed thoroughly the localization properties of the wavefunctions. We have developed a clear understanding of the physical as well as mathematical origin of the decay properties. One important conclusion is that even for metals, one can choose wavefunctions that decay faster than any algebraic power. (2) We have developed algorithms that make use of these localization properties. Our algorithms are based on non-orthogonal formulations of the density functional theory. Our key contribution is to add a localization step into the algorithm. The addition of this localization step makes the algorithm quite robust and much more accurate. Moreover, we can control the accuracy of these algorithms by changing the numerical parameters. (3) We have considerably improved the Fermi operator expansion (FOE) approach. Through pole expansion, we have developed the optimal scaling FOE algorithm.
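
    For orientation, the pole-expansion form of the Fermi operator expansion mentioned above can be written as follows; this is a standard formulation from the literature, with the notation assumed here rather than quoted from the report:

```latex
% Fermi-Dirac density matrix and a rational (pole) expansion of it;
% H is the Hamiltonian, \mu the chemical potential, \beta the inverse temperature.
\rho \;=\; f_\beta(H) \;=\; \bigl[\, I + e^{\beta (H - \mu I)} \bigr]^{-1}
\;\approx\; \sum_{l=1}^{P} \operatorname{Im}\Bigl[\, \omega_l \,\bigl( H - z_l I \bigr)^{-1} \Bigr]
```

    The appeal of the pole form is that the number of poles P grows only logarithmically with the product of inverse temperature and spectral width, which is what makes an improved FOE competitive for large systems.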

  8. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
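
    Schematically, the framework separates the error and uncertainty sources the abstract lists. A common way to write the decomposition (notation assumed here, in the spirit of Bayesian model calibration) is:

```latex
% observed response = discrete model output + discretization error
%                     + model-form discrepancy + measurement error
y^{\mathrm{obs}}(x) \;=\; y_h(x,\theta) \;+\; e_h(x) \;+\; \delta(x) \;+\; \varepsilon
```

    Total prediction uncertainty then aggregates the uncertainty in the parameters \theta, the discretization error e_h, the model-form term \delta, and the measurement noise \varepsilon.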

  9. Computer Modeling VRF Heat Pumps in Commercial Buildings using EnergyPlus

    SciTech Connect (OSTI)

    Raustad, Richard

    2013-06-01

    Variable Refrigerant Flow (VRF) heat pumps are increasingly used in commercial buildings in the United States. Monitored energy use of field installations has shown, in some cases, savings exceeding 30% compared to conventional heating, ventilating, and air-conditioning (HVAC) systems. A simulation study was conducted to identify the installation or operational characteristics that lead to energy savings for VRF systems. The study used the Department of Energy EnergyPlus building simulation software and four reference building models. Computer simulations were performed in eight U.S. climate zones. The baseline reference HVAC system incorporated packaged single-zone direct-expansion cooling with gas heating (PSZ-AC) or variable-air-volume systems (VAV with reheat). An alternate baseline HVAC system using a heat pump (PSZ-HP) was included for some buildings to directly compare gas and electric heating results. These baseline systems were compared to a VRF heat pump model to identify differences in energy use. VRF systems combine multiple indoor units with one or more outdoor unit(s). These systems move refrigerant between the outdoor and indoor units, which eliminates the need for duct work in most cases. Since many applications install duct work in unconditioned spaces, this leads to installation differences between VRF systems and conventional HVAC systems. To characterize installation differences, a duct heat gain model was included to identify the energy impacts of installing ducts in unconditioned spaces. The configuration of variable refrigerant flow heat pumps will ultimately eliminate or significantly reduce energy use due to duct heat transfer. Fan energy is also studied to identify savings associated with non-ducted VRF terminal units. VRF systems incorporate a variable-speed compressor, which may lead to operational differences compared to single-speed compression systems. To characterize operational differences, the computer model performance curves used to simulate cooling operation are also evaluated. The information in this paper is intended to provide a relative difference in system energy use and compare various installation practices that can impact performance. Comparative results of VRF versus conventional HVAC systems include energy use differences due to duct location, differences in fan energy when ducts are eliminated, and differences associated with electric versus fossil-fuel heating systems.
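
    The following back-of-the-envelope sketch illustrates why duct location matters in such comparisons: supply ducts in a hot unconditioned attic pick up heat that inflates the cooling load, a penalty a non-ducted VRF terminal avoids. All numbers are invented for illustration and are not from the report.

```python
# Illustrative duct heat gain comparison (all values assumed, not EnergyPlus output).
UA_DUCT = 60.0       # W/K, assumed duct conductance to the attic
T_ATTIC = 45.0       # deg C, unconditioned attic temperature
T_SUPPLY = 13.0      # deg C, supply air temperature
Q_COOL = 10_000.0    # W, sensible zone cooling load
COP = 3.5            # assumed cooling coefficient of performance

duct_gain = UA_DUCT * (T_ATTIC - T_SUPPLY)       # heat picked up by the duct, W
ducted_power = (Q_COOL + duct_gain) / COP        # compressor power with attic ducts
ductless_power = Q_COOL / COP                    # compressor power, non-ducted terminal

print(f"duct heat gain: {duct_gain:.0f} W")
print(f"extra compressor power due to ducts: {ducted_power - ductless_power:.0f} W")
```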

  10. District-heating strategy model: computer programmer's manual

    SciTech Connect (OSTI)

    Kuzanek, J.F.

    1982-05-01

    The US Department of Housing and Urban Development (HUD) and the US Department of Energy (DOE) cosponsor a program aimed at increasing the number of district heating and cooling (DHC) systems. Such systems can reduce the amount and costs of fuels used to heat and cool buildings in a district. Twenty-eight communities have agreed to aid HUD in a national feasibility assessment of DHC systems. The HUD/DOE program entails technical assistance by Argonne National Laboratory and Oak Ridge National Laboratory. The assistance includes a computer program, called the district heating strategy model (DHSM), that performs preliminary calculations to analyze potential DHC systems. This report describes the general capabilities of the DHSM, provides historical background on its development, and explains the computer installation and operation of the model, including the data file structures and the options. Sample problems illustrate the structure of the various input data files and the interactive computer-output listings. The report is written primarily for computer programmers responsible for installing the model on their computer systems, entering data, running the model, and implementing local modifications to the code.

  11. Multi-epoch very long baseline interferometric observations of the nuclear starburst region of NGC 253: Improved modeling of the supernova and star formation rates

    SciTech Connect (OSTI)

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Lenc, E.

    2014-01-01

    The results of multi-epoch observations of the southern starburst galaxy, NGC 253, with the Australian Long Baseline Array at 2.3 GHz are presented. As with previous radio interferometric observations of this galaxy, no new sources were discovered. By combining the results of this survey with Very Large Array observations at higher frequencies from the literature, spectra were derived and a free-free absorption model was fitted to 20 known sources in NGC 253. The results were found to be consistent with previous studies. The supernova remnant 5.48-43.3 was imaged with the highest sensitivity and resolution to date, revealing a two-lobed morphology. Comparisons with previous observations of similar resolution give an upper limit of 10^4 km s^-1 for the expansion speed of this remnant. We derive a supernova rate of <0.2 yr^-1 for the inner 300 pc using a model that improves on previous methods by incorporating an improved radio supernova peak luminosity distribution and by making use of multi-wavelength radio data spanning 21 yr. A star formation rate of SFR(M >= 5 M_sun) < 4.9 M_sun yr^-1 was also estimated using the standard relation between supernova and star formation rates. Our improved estimates of supernova and star formation rates are consistent with studies at other wavelengths. The results of our study point to the possible existence of a small population of undetected supernova remnants, suggesting a low rate of radio supernova production in NGC 253.
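
    The free-free absorption fit referred to above is commonly parameterized as an intrinsic power law attenuated by a foreground ionized screen; this single-screen form is shown only for orientation and is not necessarily the authors' exact model:

```latex
% synchrotron power law (spectral index \alpha) absorbed by thermal gas whose
% free-free optical depth scales approximately as \nu^{-2.1}
S_\nu \;=\; S_0 \left( \frac{\nu}{\nu_0} \right)^{\alpha}
\exp\!\left[ -\,\tau_0 \left( \frac{\nu}{\nu_0} \right)^{-2.1} \right]
```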

  12. Hazard baseline documentation

    SciTech Connect (OSTI)

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  13. Computationally Efficient Modeling of High-Efficiency Clean Combustion

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2010 DOE Vehicle Technologies and Hydrogen Programs Annual Merit Review and Peer Evaluation Meeting, June 7-11, 2010, Washington D.C.: ace012_aceves_2010_o.pdf. Related documents: Computationally Efficient Modeling of High-Efficiency Clean Combustion Engines

  14. LANL researchers use computer modeling to study HIV | National Nuclear

    National Nuclear Security Administration (NNSA)

  15. New partnership uses advanced computer science modeling to address climate

    National Nuclear Security Administration (NNSA)

  16. Scientists use world's fastest computer to model materials under extreme

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Materials scientists are for the first time attempting to create atomic-scale models that describe how voids are created, grow, and merge. October 30, 2009.

  17. Computer Modeling of Saltstone Landfills by Intera Environmental Consultants

    SciTech Connect (OSTI)

    Albenesius, E.L.

    2001-08-09

    This report summarizes the computer modeling studies and how the results of these studies were used to estimate contaminant releases to the groundwater. These modeling studies were used to improve saltstone landfill designs and are the basis for the current reference design. With the reference landfill design, EPA Drinking Water Standards can be met for all chemicals and radionuclides contained in Savannah River Plant waste salts.

  18. Computational Science Research in Support of Petascale Electromagnetic Modeling

    SciTech Connect (OSTI)

    Lee, L.-Q.; Akcelik, V; Ge, L; Chen, S; Schussman, G; Candel, A; Li, Z; Xiao, L; Kabel, A; Uplenchwar, R; Ng, C; Ko, K; /SLAC

    2008-06-20

    Computational science research components were vital parts of the SciDAC-1 accelerator project and are continuing to play a critical role in the newly funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in the area of computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented, which include shape determination of superconducting RF cavities, a mesh-based multilevel preconditioner for solving highly indefinite linear systems, a moving window using h- or p-refinement for time-domain short-range wakefield calculations, and improved scalable application I/O.

  19. Wind energy conversion system analysis model (WECSAM) computer program documentation

    SciTech Connect (OSTI)

    Downey, W T; Hendrick, P L

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)
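
    As a rough illustration of the two-module structure (a technical performance module feeding an economic evaluation module), the sketch below uses an invented power curve, Weibull wind resource, and cost figures; none of these values come from WECSAM's data base.

```python
# Toy WECS performance + economics sketch (all machine and cost data assumed).
import numpy as np

def power_kw(v):
    """Idealized power curve: cut-in 4 m/s, rated 100 kW at 12 m/s, cut-out 25 m/s."""
    v = np.asarray(v, dtype=float)
    p = np.clip((v - 4.0) / (12.0 - 4.0), 0.0, 1.0) ** 3 * 100.0
    p[(v < 4.0) | (v > 25.0)] = 0.0
    return p

# Technical module: expected output over an assumed county-level Weibull wind resource.
k, c = 2.0, 7.0                                   # Weibull shape and scale (m/s)
v = np.linspace(0.0, 30.0, 3001)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)
mean_kw = np.sum(power_kw(v) * pdf) * (v[1] - v[0])   # numerical expectation
annual_kwh = mean_kw * 8760.0

# Economic module: simple levelized cost from assumed capital and O&M figures.
capital, fcr, om = 250_000.0, 0.10, 5_000.0       # $, fixed charge rate, $/yr
lcoe = (capital * fcr + om) / annual_kwh
print(f"annual energy: {annual_kwh:,.0f} kWh, LCOE: ${lcoe:.3f}/kWh")
```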

  20. A New Perspective for the Calibration of Computational Predictor Models.

    SciTech Connect (OSTI)

    Crespo, Luis Guillermo

    2014-11-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it is a description of the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain (i.e., roll-up and extrapolation).
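
    A minimal sketch of the interval-predictor idea: choose lower and upper coefficient vectors that minimize the average interval spread while every observation stays inside the predicted interval. The polynomial basis, synthetic data, and linear-programming formulation below are illustrative assumptions, not the paper's code.

```python
# Interval Predictor Model via linear programming (scipy), on synthetic data.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 60)
y = np.sin(3 * x) + 0.15 * rng.normal(size=60)        # stand-in "experiments"

phi = np.vstack([np.ones_like(x), x, x ** 2]).T       # basis features, shape (60, 3)
d = phi.shape[1]

# Decision vector z = [theta_lo, theta_hi]; minimize mean spread phi_bar @ (hi - lo).
phi_bar = phi.mean(axis=0)
c = np.concatenate([-phi_bar, phi_bar])

# Containment at every data point: phi_i @ lo <= y_i and -(phi_i @ hi) <= -y_i.
A_ub = np.block([[phi, np.zeros_like(phi)],
                 [np.zeros_like(phi), -phi]])
b_ub = np.concatenate([y, -y])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * d))
lo, hi = res.x[:d], res.x[d:]
print("mean interval spread:", phi_bar @ (hi - lo))
```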

  1. Fast, narrow-band computer model for radiation calculations

    SciTech Connect (OSTI)

    Yan, Z.; Holmstedt, G.

    1997-01-01

    A fast, narrow-band computer model, FASTNB, which predicts the radiation intensity in a general nonisothermal and nonhomogeneous combustion environment, has been developed. The spectral absorption coefficients of the combustion products, including carbon dioxide, water vapor, and soot, are calculated based on the narrow-band model. FASTNB provides an accurate calculation at reasonably high speed. Compared with Grosshandler's narrow-band model, RADCAL, which has been verified quite extensively against experimental measurements, FASTNB is more than 20 times faster and gives almost exactly the same results.
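
    Narrow-band models of this kind work with band-averaged transmissivities rather than line-by-line spectra. One standard Goody-type statistical expression is shown below for orientation; FASTNB's exact formulation may differ.

```latex
% band-averaged transmissivity over \Delta\nu for absorber amount u, with mean
% line-intensity-to-spacing ratio \bar{S}/\delta and line-overlap parameter \bar{\beta}
\overline{\tau}_{\Delta\nu} \;=\;
\exp\!\left[ -\,\frac{(\bar{S}/\delta)\, u}{\sqrt{1 \;+\; (\bar{S}/\delta)\, u \,/\, (\pi \bar{\beta})}} \right]
```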

  2. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    SciTech Connect (OSTI)

    Zhou, Shujia; Duffy, Daniel; Clune, Thomas; Suarez, Max; Williams, Samuel; Halem, Milton

    2009-01-10

    The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive to fulfill this requirement. However, the Cell's characteristics, 256KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.
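
    A conceptual sketch of the data-parallel structure being exploited: every atmospheric column is independent, so the same instruction stream can be applied to a small block of columns at a time (the SPE's four-wide SIMD lanes in the paper; a NumPy axis in this toy stand-in). The "solar radiation" kernel here is invented for illustration and is not the GEOS-5 code.

```python
# Toy column-physics kernel vectorized across blocks of four columns.
import numpy as np

NLEV, NCOL, LANES = 72, 1024, 4
tau = np.random.default_rng(2).uniform(0.0, 0.1, (NLEV, NCOL))   # layer optical depths

def downward_flux(tau_block):
    """Beer-Lambert attenuation down every column of the block; columns are
    independent, so the lane axis sees identical control flow (SIMD-friendly)."""
    return np.exp(-np.cumsum(tau_block, axis=0))

flux = np.empty_like(tau)
for c0 in range(0, NCOL, LANES):                      # process 4 columns per pass
    flux[:, c0:c0 + LANES] = downward_flux(tau[:, c0:c0 + LANES])
```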

  3. The origins of computer weather prediction and climate modeling

    SciTech Connect (OSTI)

    Lynch, Peter [Meteorology and Climate Centre, School of Mathematical Sciences, University College Dublin, Belfield (Ireland)], E-mail: Peter.Lynch@ucd.ie

    2008-03-20

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  4. ARM - Baseline Change Request Guidelines

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Baseline Change Requests (BCRs) are used by the ARM Infrastructure as a process to provide configuration control and for formally requesting and documenting changes within the ARM Infrastructure. Configuration Control: BCRs are required for changes to instruments, data systems, data processes, datastreams, measurement methods, and facilities.

  5. Final Report: Center for Programming Models for Scalable Parallel Computing

    SciTech Connect (OSTI)

    Mellor-Crummey, John

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  6. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  7. ARM - AMF2 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF2 deployments: 2009-2010; Shouxian, China, 2008; Black Forest, Germany, 2007; Niamey, Niger, 2006; Point Reyes, California, 2005. AMF2 Baseline Instruments, Instrument Suites: view the list of...

  8. Martin Karplus and Computer Modeling for Chemical Systems

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Resources with Additional Information: Karplus Equation. Martin Karplus, the Theodore William Richards Professor of Chemistry Emeritus at Harvard, is one of three winners of the 2013 Nobel Prize in chemistry... The 83-year-old Vienna-born theoretical chemist, who is also affiliated with the Université de Strasbourg, Strasbourg, France, is a 1951 graduate of Harvard College and earned his

  9. ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION G. BERMAN; ET...

    Office of Scientific and Technical Information (OSTI)

    ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION; G. Berman, et al. Subject categories: 71 Classical and Quantum Mechanics, General Physics; 99 General and Miscellaneous/Mathematics, Computing, and...

  10. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  11. Computer Modeling of Violent Intent: A Content Analysis Approach

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.
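
    The shape of such a train-and-apply loop can be sketched with generic supervised text classification; the toy features, labels, and classifier below are stand-ins, not the authors' content-analysis models.

```python
# Generic text-classification sketch (scikit-learn); documents and labels are toys.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_docs = [
    "we must strike our enemies without mercy",
    "join us in peaceful protest next week",
    "prepare weapons for the coming battle",
    "community meeting to discuss the petition",
]
train_labels = [1, 0, 1, 0]                     # 1 = violent intent (toy labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_docs, train_labels)

unseen = ["organize the rally and hand out flyers"]
print(model.predict_proba(unseen))              # [P(non-violent), P(violent)]
```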

  12. HEV America Baseline Test Sequence

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    HEV America Baseline Test Sequence, Revision 1, September 1, 2006. Prepared by Electric Transportation Applications (prepared by Roberta Brayer; approved by Donald B. Karner). HEV Performance Test Procedure Sequence: the following test sequence shall be used for conduct of HEV America...

  13. Hanford Site technical baseline database

    SciTech Connect (OSTI)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  14. Computer-Aided Construction of Chemical Kinetic Models

    SciTech Connect (OSTI)

    Green, William H.

    2014-12-31

    The combustion chemistry of even simple fuels can be extremely complex, involving hundreds or thousands of kinetically significant species. The most reasonable way to deal with this complexity is to use a computer not only to numerically solve the kinetic model, but also to construct the kinetic model in the first place. Because these large models contain so many numerical parameters (e.g. rate coefficients, thermochemistry) one never has sufficient data to uniquely determine them all experimentally. Instead one must work in “predictive” mode, using theoretical rather than experimental values for many of the numbers in the model, and as appropriate refining the most sensitive numbers through experiments. Predictive chemical kinetics is exactly what is needed for computer-aided design of combustion systems based on proposed alternative fuels, particularly for early assessment of the value and viability of proposed new fuels before those fuels are commercially available. This project was aimed at making accurate predictive chemical kinetics practical; this is a challenging goal which requires a range of science advances. The project spanned a wide range from quantum chemical calculations on individual molecules and elementary-step reactions, through the development of improved rate/thermo calculation procedures, the creation of algorithms and software for constructing and solving kinetic simulations, the invention of methods for model-reduction while maintaining error control, and finally comparisons with experiment. Many of the parameters in the models were derived from quantum chemistry calculations, and the models were compared with experimental data measured in our lab or in collaboration with others.
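
    Once a mechanism has been constructed, the numerical core is a stiff ODE system for the species concentrations, with Arrhenius rate coefficients. The two-step mechanism and parameters below are invented purely to show that shape.

```python
# Tiny kinetic model: Fuel -> Intermediate -> Product, integrated with a stiff solver.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314                                   # J/(mol K)

def k_arrh(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R T))."""
    return A * np.exp(-Ea / (R * T))

T = 1200.0                                  # K, assumed isothermal
k1 = k_arrh(1e6, 120e3, T)                  # Fuel -> Intermediate
k2 = k_arrh(1e7, 150e3, T)                  # Intermediate -> Product

def rhs(t, conc):
    fuel, inter, prod = conc
    r1, r2 = k1 * fuel, k2 * inter
    return [-r1, r1 - r2, r2]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], method="BDF", rtol=1e-8)
print("final concentrations:", sol.y[:, -1])
```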

  15. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.

  16. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  17. Modeling the Fracture of Ice Sheets on Parallel Computers

    SciTech Connect (OSTI)

    Waisman, Haim; Tuminaro, Ray

    2013-10-10

    The objective of this project was to investigate the complex fracture of ice and understand its role within larger ice sheet simulations and global climate change. This objective was achieved by developing novel physics-based models for ice and novel numerical tools to enable modeling of that physics, and by collaborating with ice community experts. At present, ice fracture is not explicitly considered within ice sheet models, due in part to the large computational costs associated with accurately modeling this complex phenomenon. However, fracture not only plays an extremely important role in regional behavior but also influences ice dynamics over much larger zones in ways that are currently not well understood. To this end, our research findings through this project offer a significant advance to the field and close a large gap in understanding and modeling the fracture of ice sheets in the polar regions. Thus, we believe that our objective has been achieved and our research accomplishments are significant. This is corroborated through a set of published papers, posters, and presentations at technical conferences in the field. In particular, significant progress has been made in the mechanics of ice, the fracture of ice sheets and ice shelves in polar regions, and sophisticated numerical methods that enable the solution of the physics in an efficient way.

  18. ARM - AMF1 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF deployments: McMurdo Station, Antarctica, 2015-2016; Pearl Harbor, Hawaii, to San Francisco, California, 2015; Hyytiälä, Finland, 2014; Manacapuru, Brazil, 2014; Oliktok Point, Alaska, 2013; Los Angeles, California, to Honolulu, Hawaii, 2012; Cape Cod, Massachusetts, 2012; Gan Island, Maldives, 2011; Ganges Valley, India, 2011; Steamboat...

  19. ARM - AMF3 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF3 Baseline Instruments. AMF deployments: McMurdo Station, Antarctica, 2015-2016; Pearl Harbor, Hawaii, to San Francisco, California, 2015; Hyytiälä, Finland, 2014; Manacapuru, Brazil, 2014; Oliktok Point, Alaska, 2013; Los Angeles, California, to Honolulu, Hawaii, 2012; Cape Cod, Massachusetts, 2012; Gan Island, Maldives, 2011; Ganges Valley, India, 2011; Steamboat

  20. Cielo Computational Environment Usage Model With Mappings to...

    Office of Scientific and Technical Information (OSTI)

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale ...

  1. Computational fluid dynamic modeling of fluidized-bed polymerization reactors

    SciTech Connect (OSTI)

    Rokkam, Ram

    2012-11-02

    Polyethylene is one of the most widely used plastics, and over 60 million tons are produced worldwide every year. Polyethylene is obtained by the catalytic polymerization of ethylene in gas- and liquid-phase reactors. The gas-phase processes are more advantageous and use fluidized-bed reactors for production of polyethylene. Because they operate so close to the melting point of the polymer, agglomeration is an operational concern in all slurry and gas polymerization processes. Electrostatics and hot-spot formation are the main factors that contribute to agglomeration in gas-phase processes. Electrostatic charges in gas-phase polymerization fluidized-bed reactors are known to influence the bed hydrodynamics, particle elutriation, bubble size, and bubble shape. Accumulation of electrostatic charges in the fluidized bed can lead to operational issues. In this work a first-principles electrostatic model is developed and coupled with a multi-fluid computational fluid dynamic (CFD) model to understand the effect of electrostatics on the dynamics of a fluidized bed. The multi-fluid CFD model for gas-particle flow is based on closures from the kinetic theory of granular flows. The electrostatic model is based on a fixed, size-dependent charge for each type of particle phase (catalyst, polymer, polymer fines). The combined CFD model is first verified using simple test cases, then validated against experiments, and finally applied to a pilot-scale polymerization fluidized-bed reactor. The CFD model reproduced qualitative trends in particle segregation and entrainment due to electrostatic charges observed in experiments. For scale-up of the fluidized-bed reactor, filtered models were developed and implemented for the pilot-scale reactor.
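
    The coupling just described can be illustrated with a small sketch (not the authors' code; the names, the 1D discretization, and the grounded-wall boundary condition are assumptions): a fixed, size-dependent charge per particle phase gives a net charge density, a Poisson solve gives the electric field, and the resulting body force enters each phase's momentum equation.

        # Hedged sketch of a fixed-charge electrostatic force for a multi-fluid model
        import numpy as np

        def electrostatic_force(alpha, rho_q, dx, eps0=8.854e-12):
            """alpha: (n_phases, nx) volume fractions; rho_q: per-phase charge
            density (C per m^3 of solids); returns force density (N/m^3)."""
            charge = (alpha * rho_q[:, None]).sum(axis=0)   # net charge on the grid
            nx = charge.size
            # 1D Poisson: phi'' = -charge/eps0, with grounded (phi = 0) walls
            A = (np.diag(-2.0 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1)
                 + np.diag(np.ones(nx - 1), -1)) / dx**2
            phi = np.linalg.solve(A, -charge / eps0)
            E = -np.gradient(phi, dx)                       # electric field
            return alpha * rho_q[:, None] * E               # per-phase body force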

  2. Computational model for simulation small testing launcher, technical solution

    SciTech Connect (OSTI)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test spatial equipment and scientific measurements. The computational model consists of numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, use solid-fuel motors, and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project has two major objectives: a short-term objective, to obtain a suborbital launching system able to go into service in a predictable period of time, and a long-term objective, to develop and test unconventional subsystems that will later be integrated into a satellite launcher as part of the European space program. The technical content of the project must therefore be carried beyond the range of existing suborbital vehicle programs, toward the current technological necessities of the space field, especially the European one.
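
    As an illustration of the variable-mass flight simulation the paper describes, the sketch below integrates a simplified point-mass (3DOF) analogue of the 6DOF model; all vehicle parameters are invented for the example.

        # Point-mass, variable-mass ascent sketch (illustrative parameters only)
        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y, thrust=30e3, mdot=10.0, m_dry=500.0, Cd=0.3, A=0.2):
            x, z, vx, vz, m = y
            rho = 1.225 * np.exp(-z / 8500.0)      # simple exponential atmosphere
            v = np.hypot(vx, vz)
            drag = 0.5 * rho * Cd * A * v          # drag force divided by speed
            burning = m > m_dry
            T = thrust if burning else 0.0
            ux, uz = (vx / v, vz / v) if v > 1.0 else (0.0, 1.0)  # thrust along velocity
            ax = (T * ux - drag * vx) / m
            az = (T * uz - drag * vz) / m - 9.81
            return [vx, vz, ax, az, -mdot if burning else 0.0]

        sol = solve_ivp(rhs, (0.0, 120.0), [0.0, 0.0, 0.0, 1e-3, 1500.0], max_step=0.1)
        print(f"peak altitude over 120 s: {sol.y[1].max() / 1000:.1f} km")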

  3. computers

    National Nuclear Security Administration (NNSA)

    Retired computers used for cybersecurity research at Sandia National...

  4. Computer

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    I. INTRODUCTION This paper presents several computational tools required for processing images of a heavy ion beam and estimating the magnetic field within a plasma. The...

  5. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultrasupercritical (USC) applications to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al{sub 2}O{sub 3} or Cr{sub 2}O{sub 3}. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al{sub 2}O{sub 3} scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed, and among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects; the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al{sub 2}O{sub 3} scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al diffusion from the coating into the substrate. An effective diffusion barrier interlayer coating was developed to prevent inward Al diffusion. The fire-side corrosion test results showed that nanocrystalline coatings with a minimum number of defects have great potential for providing corrosion protection. The coating tested in the most aggressive environment showed no evidence of coating spallation and/or corrosion attack after 1050 hours of exposure. In contrast, evidence of coating spallation in isolated areas and corrosion attack of the base metal in the spalled areas was observed after 500 hours. These contrasting results after 500 and 1050 hours of exposure suggest that the premature coating spallation in isolated areas may be related to the variation of defects in the coating between samples. It is suspected that cauliflower-type defects in the coating were responsible for the coating spallation in isolated areas. Thus, a defect-free, good quality coating is the key to the long-term durability of nanocrystalline coatings in corrosive environments, and additional process optimization work is required to produce defect-free coatings prior to development of a coating application method for production parts.

  6. Optimization and Performance Modeling of Stencil Computations on Modern Microprocessors

    SciTech Connect (OSTI)

    Datta, Kaushik; Kamil, Shoaib; Williams, Samuel; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2007-06-01

    Stencil-based kernels constitute the core of many important scientific applications on block-structured grids. Unfortunately, these codes achieve a low fraction of peak performance, due primarily to the disparity between processor and main memory speeds. In this paper, we explore the impact of trends in memory subsystems on a variety of stencil optimization techniques and develop performance models to analytically guide our optimizations. Our work targets cache reuse methodologies across single and multiple stencil sweeps, examining cache-aware algorithms as well as cache-oblivious techniques on the Intel Itanium2, AMD Opteron, and IBM Power5. Additionally, we consider stencil computations on the heterogeneous multicore design of the Cell processor, a machine with an explicitly managed memory hierarchy. Overall, our work represents one of the most extensive analyses of stencil optimizations and performance modeling to date. Results demonstrate that recent trends in memory system organization have reduced the efficacy of traditional cache-blocking optimizations. We also show that a cache-aware implementation is significantly faster than a cache-oblivious approach, while the explicitly managed memory on Cell enables the highest overall efficiency: Cell attains 88% of algorithmic peak while the best competing cache-based processor achieves only 54% of algorithmic peak performance.
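
    The cache-blocking strategy examined above can be sketched in a few lines; this Python/NumPy version (illustrative only; the study itself used tuned compiled kernels) mirrors the tiled loop structure of a blocked Jacobi sweep.

        # One blocked Jacobi sweep of a 2D 5-point stencil
        import numpy as np

        def jacobi_blocked(src, dst, tb=64):
            """Tile the sweep in tb x tb blocks so each block's working set
            stays cache-resident in a compiled implementation."""
            n, m = src.shape
            for ii in range(1, n - 1, tb):
                for jj in range(1, m - 1, tb):
                    i1 = min(ii + tb, n - 1)
                    j1 = min(jj + tb, m - 1)
                    dst[ii:i1, jj:j1] = 0.25 * (src[ii-1:i1-1, jj:j1] +
                                                src[ii+1:i1+1, jj:j1] +
                                                src[ii:i1, jj-1:j1-1] +
                                                src[ii:i1, jj+1:j1+1])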

  7. Baseline Wind Energy Facility | Open Energy Information

    Open Energy Info (EERE)

    Name: Baseline Wind Energy Facility. Sector: Wind energy. Facility Type: Commercial Scale Wind...

  8. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants are most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burner systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce or eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project, jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096), focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that approximately 40 wt.% Ni is required in Fe-based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long-term thermal exposure test results showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such as TiN and/or AlN. The third task, 'Process Advanced MCrAl Nanocoating Systems', focused on processing of advanced nanocrystalline coating systems and development of diffusion barrier interlayer coatings. Among the diffusion interlayer coatings evaluated, the TiN interlayer coating was found to be the optimum. This report describes the research conducted under the Task 3 workscope.

  9. Renewable Diesel from Algal Lipis: An Integrated Baseline for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... Original RA modeling output: mean annual biofuel production (L/ha/year) under current ... of unit farms required to meet the 5 BG biofuel production target under: (a) the baseline ...

  10. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J. Intended for: Public Purpose: This poster was prepared for the June 2013 Individual Permit for Storm Water (IP) public meeting. The purpose of the meeting was to update the public on implementation of the permit as required under Part 1.I (7) of the IP (National Pollutant Discharge Elimination System Permit No.

  11. Modeling-Computer Simulations At U.S. West Region (Sabin, Et...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At U.S. West Region (Sabin, Et Al., 2004). Exploration Activity Details...

  12. Modeling-Computer Simulations At Cove Fort Area (Toksoz, Et Al...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Cove Fort Area (Toksoz, Et Al, 2010). Exploration Activity Details...

  13. Modeling-Computer Simulations At U.S. West Region (Williams ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At U.S. West Region (Williams & Deangelo, 2008). Exploration Activity...

  14. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, Configuration, and Problems on Bassi. Richard Gerber, Lawrence Berkeley National Laboratory, NERSC User Services. ScicomP 13, Garching bei München, Germany, July 17, 2007. Overview: About Bassi; Memory on Bassi; Large Page Memory (It's Great!); System Configuration; Large Page

  15. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy

  16. Baseline LAW Glass Formulation Testing

    SciTech Connect (OSTI)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  17. Compensator models for fluence field modulated computed tomography

    SciTech Connect (OSTI)

    Bartolac, Steven; Jaffray, David; Radiation Medicine Program, Princess Margaret Hospital Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9

    2013-12-15

    Purpose: Fluence field modulated computed tomography (FFMCT) presents a novel approach for acquiring CT images, whereby a patient model guides dynamically changing fluence patterns in an attempt to achieve task-based, user-prescribed, regional variations in image quality, while also controlling dose to the patient. This work aims to compare the relative effectiveness of FFMCT applied to different thoracic imaging tasks (routine diagnostic CT, lung cancer screening, and cardiac CT) when the modulator is subject to limiting constraints, such as might be present in realistic implementations. Methods: An image quality plan was defined for a simulated anthropomorphic chest slice, including regions of high and low image quality, for each of the thoracic imaging tasks. Modulated fluence patterns were generated using a simulated annealing optimization script, which attempts to achieve the image quality plan under a global dosimetric constraint. Optimization was repeated under different types of modulation constraints (e.g., fixed or gantry-angle-dependent patterns, continuous or comprised of discrete apertures), with the most limiting case being a fixed conventional bowtie filter. For each thoracic imaging task, an image quality map (IQM{sub sd}) representing the regionally varying standard deviation is predicted for each modulation method and compared to the prescribed image quality plan as well as against results from uniform fluence fields. Relative integral dose measures were also compared. Results: Each IQM{sub sd} resulting from FFMCT showed improved agreement with planned objectives compared to those from uniform fluence fields for all cases. Dynamically changing modulation patterns yielded better uniformity, improved image quality, and lower dose compared to fixed filter patterns with optimized tube current. For the latter fixed-filter cases, the optimal choice of tube current modulation was found to depend heavily on the task. Average integral dose reduction compared to a uniform fluence field ranged from 10% using a bowtie filter to 40% or greater using an idealized modulator. Conclusions: The results support that FFMCT may achieve regionally varying image quality distributions in good agreement with user-prescribed values, while limiting dose. The imposition of constraints inhibits dose reduction capacity and agreement with image quality plans, but still yields significant improvement over what is afforded by conventional dose minimization techniques. These results suggest that FFMCT can be implemented effectively even when the modulator has limited modulation capabilities.
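
    The simulated-annealing search mentioned in the Methods follows the standard Metropolis accept/reject pattern; a hedged sketch, with objective() and dose() as placeholder callables standing in for the image-quality and dose models, is:

        # Simulated annealing under a global dose constraint (sketch)
        import numpy as np
        rng = np.random.default_rng(0)

        def anneal(fluence, objective, dose, dose_limit, n_iter=20000, T0=1.0):
            cur = objective(fluence)
            best, best_f = cur, fluence.copy()
            for k in range(n_iter):
                T = T0 * 0.9995 ** k                       # geometric cooling
                trial = np.clip(fluence + rng.normal(0.0, 0.05, fluence.shape), 0.0, None)
                if dose(trial) > dose_limit:               # enforce dose constraint
                    continue
                cost = objective(trial)
                # Always accept improvements; sometimes accept worse states
                if cost < cur or rng.random() < np.exp(-(cost - cur) / T):
                    fluence, cur = trial, cost
                    if cur < best:
                        best, best_f = cur, fluence.copy()
            return best_f, best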

  18. FED baseline engineering studies report

    SciTech Connect (OSTI)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  19. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    SciTech Connect (OSTI)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the largest losses from natural disasters both worldwide and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g., dikes/levees, roads, walls). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments; because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by performing computations only on inundated cells. The drastic reduction in computation time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al. 2000).
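
    The domain-tracking idea is simple to sketch: keep a set of wet cells and update only those cells and their neighbors each time step. In the hedged sketch below, flux() is a placeholder for the shallow-water update (a real solver would visit each face once and derive the flux from the governing equations).

        # Domain tracking: advance only wet cells and their neighbors
        def neighbors(cell):
            i, j = cell
            return ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))

        def step_active(h, active, flux, dry_tol=1e-6):
            """h: depth per cell (e.g., defaultdict(float)); active: set of wet
            cells; flux(a, b): volume moved from a to b this step (placeholder)."""
            next_active = set()
            for c in list(active):
                for nb in neighbors(c):
                    dh = flux(c, nb)
                    h[c] -= dh
                    h[nb] += dh
                    if h[nb] > dry_tol:
                        next_active.add(nb)
                if h[c] > dry_tol:
                    next_active.add(c)
            return next_active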

  20. Baseline Test Specimen Machining Report

    SciTech Connect (OSTI)

    Mark Carroll

    2009-08-01

    The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as “nuclear grade,” an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et al., 2007). One of the key components in the evaluation of these graphite types will be mechanical testing on specimens drawn from carefully controlled sections of each billet. To this end, this report discusses the machining of the first set of test specimens that will be evaluated in this program through tensile, compressive, and flexural testing. Validation that the test specimens have been produced to the tolerances required by the applicable ASTM standards, and to the quality control levels required by this program, will demonstrate the viability of sending graphite to selected suppliers that will provide valuable and certifiable data to future data sets that are integral to the NGNP program and beyond.

    1. COMPUTATIONAL FLUID DYNAMICS MODELING OF SCALED HANFORD DOUBLE SHELL TANK MIXING - CFD MODELING SENSITIVITY STUDY RESULTS

      SciTech Connect (OSTI)

      Jackson, V.L.

      2011-08-31

      The primary purpose of the tank mixing and sampling demonstration program is to mitigate the technical risks associated with the ability of the Hanford tank farm delivery and certification systems to measure and deliver a uniformly mixed high-level waste (HLW) feed to the Waste Treatment and Immobilization Plant (WTP). Uniform feed to the WTP is a requirement of 24590-WTP-ICD-MG-01-019, ICD-19, the Interface Control Document for Waste Feed, although the exact definition of uniform is evolving in this context. Computational Fluid Dynamics (CFD) modeling has been used to assist in evaluating scale-up issues, study operational parameters, and predict mixing performance at full scale.

    2. Pinellas Plant Environmental Baseline Report

      SciTech Connect (OSTI)

      Not Available

      1997-06-01

      The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

    3. A system analysis computer model for the High Flux Isotope Reactor (HFIRSYS Version 1)

      SciTech Connect (OSTI)

      Sozer, M.C.

      1992-04-01

      A system transient analysis computer model (HFIRSYS) has been developed for analysis of small-break loss-of-coolant accidents (LOCA) and operational transients. The computer model is based on the Advanced Continuous Simulation Language (ACSL), which produces the FORTRAN code automatically and provides integration routines such as Gear's stiff algorithm, along with numerous practical tools for generating eigenvalues, producing debug output, and generating graphics. The HFIRSYS computer code is structured in the form of the Modular Modeling System (MMS) code. Component modules from MMS and in-house developed modules were both used to configure HFIRSYS. A description of the High Flux Isotope Reactor, theoretical bases for the modeled components of the system, and the verification and validation efforts are reported. The computer model performs satisfactorily, including cases in which the effect of structural elasticity on the system pressure is significant; however, its capabilities are limited to single-phase flow. Because of the modular structure, new component models from the Modular Modeling System can easily be added to HFIRSYS for analyzing their effects on the system's behavior. The computer model is a versatile tool for studying various system transients. The intent of this report is not to be a user's manual, but to provide theoretical bases and basic information about the computer model and the reactor.
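
      The Gear-type stiff integration that ACSL supplied corresponds, in modern terms, to a backward differentiation formula (BDF) solver; the sketch below applies SciPy's BDF method to the Robertson kinetics problem, a standard stiff benchmark (not an HFIR model).

          # Stiff ODE integration with a BDF (Gear-type) method
          from scipy.integrate import solve_ivp

          def stiff_rhs(t, y):
              y1, y2, y3 = y                    # Robertson chemical kinetics
              return [-0.04 * y1 + 1e4 * y2 * y3,
                      0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
                      3e7 * y2 ** 2]

          sol = solve_ivp(stiff_rhs, (0.0, 1e5), [1.0, 0.0, 0.0], method="BDF", rtol=1e-6)
          print(sol.y[:, -1])                   # species fractions at t = 1e5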

    4. Final Report for Integrated Multiscale Modeling of Molecular Computing Devices

      SciTech Connect (OSTI)

      Glotzer, Sharon C.

      2013-08-28

      In collaboration with researchers at Vanderbilt University, North Carolina State University, Princeton, and Oak Ridge National Laboratory, we developed multiscale modeling and simulation methods capable of modeling the synthesis, assembly, and operation of molecular electronics devices. Our role in this project included the development of coarse-grained molecular and mesoscale models and simulation methods capable of simulating the assembly of millions of organic conducting molecules and other molecular components into nanowires, crossbars, and other organized patterns.

    5. Mathematical modeling and computer simulation of processes in energy systems

      SciTech Connect (OSTI)

      Hanjalic, K.C. )

      1990-01-01

      This book is divided into the following chapters: 1. Modeling techniques and tools (fundamental concepts of modeling); 2. Fluid flow, heat and mass transfer, chemical reactions, and combustion; 3. Processes in energy equipment and plant components (boilers, steam and gas turbines, IC engines, heat exchangers, pumps and compressors, nuclear reactors, steam generators and separators, energy transport equipment, energy convertors, etc.); 4. New thermal energy conversion technologies (MHD, coal gasification and liquefaction, fluidized-bed combustion, pulse-combustors, multistage combustion, etc.); 5. Combined cycles and plants, cogeneration; 6. Dynamics of energy systems and their components; 7. Integrated approach to energy systems modeling; and 8. Application of modeling in energy expert systems.

    6. Modeling-Computer Simulations (Walker, Et Al., 2005) | Open Energy...

      Open Energy Info (EERE)

      occurrence model for geothermal systems based on fundamental geologic data. References J. D. Walker, A. E. Sabin, J. R. Unruh, J. Combs, F. C. Monastero (2005) Development Of...

    7. Modeling-Computer Simulations At Kilauea East Rift Geothermal...

      Open Energy Info (EERE)

      importance of water convection for distributing heat in the East Rift Zone. References Albert J. Rudman, David Epp (1983) Conduction Models Of The Temperature Distribution In The...

    8. Computer-Aided Construction of Combustion Chemistry Models

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Constructing Accurate Combustion Chemistry Models: Butanols William H. Green & Michael Harper MIT Dept. of Chem. Eng. CEFRC Annual Meeting, Sept. 2010 The people who did this work:...

    9. Baseline Graphite Characterization: First Billet

      SciTech Connect (OSTI)

      Mark C. Carroll; Joe Lords; David Rohrbaugh

      2010-09-01

      The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, only a limited amount of data can be generated, given the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also between billets within a single lot, between billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. A thorough understanding of this variability will provide added detail to the irradiated property data and a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the development of the Baseline Graphite Characterization program from a testing and data collection standpoint through the completion of characterization on the first billet of nuclear-grade graphite. This data set is the starting point for all future evaluations and comparisons of material properties.

    10. Modeling and Analysis of a Lunar Space Reactor with the Computer Code

      Office of Scientific and Technical Information (OSTI)

      RELAP5-3D/ATHENA. The transient analysis 3-dimensional (3-D) computer code RELAP5-3D/ATHENA has been employed to model and analyze a space reactor of 180 kW(thermal), 40 kW (net, electrical) with eight Stirling engines (SEs). Each SE

    11. Computer support to run models of the atmosphere. Final report

      SciTech Connect (OSTI)

      Fung, I.

      1996-08-30

      This research is focused on a better quantification of the variations in CO{sub 2} exchanges between the atmosphere and biosphere and the factors responsible for these exchanges. The principal approach is to infer the variations in the exchanges from variations in the atmospheric CO{sub 2} distribution. The principal tool involves using a global three-dimensional tracer transport model to advect and convect CO{sub 2} in the atmosphere. The tracer model the authors used was developed at the Goddard Institute for Space Studies (GISS) and is derived from the GISS atmospheric general circulation model. A special run of the GCM is made to save high-frequency winds and mixing statistics for the tracer model.

    12. Modeling-Computer Simulations (Gritto & Majer) | Open Energy...

      Open Energy Info (EERE)

      are shown in Figure 1. The parameters of the fault were modeled after Coates and Schoenberg (1995), where the orientation of the fault relative to the finite-difference grid...

    13. FINITE ELEMENT MODELS FOR COMPUTING SEISMIC INDUCED SOIL PRESSURES ON DEEPLY EMBEDDED NUCLEAR POWER PLANT STRUCTURES.

      SciTech Connect (OSTI)

      XU, J.; COSTANTINO, C.; HOFMAYER, C.

      2006-06-26

      This paper discusses computations of seismic-induced soil pressures using finite element models for deeply embedded and/or buried stiff structures, such as those appearing in the conceptual designs of structures for advanced reactors.

    14. Theoretical and computer models of detonation in solid explosives

      SciTech Connect (OSTI)

      Tarver, C.M.; Urtiew, P.A.

      1997-10-01

      Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Döring (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two-, and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.
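
      For reference, the Jones-Wilkins-Lee equation of state named above has the standard form below; the coefficients A, B, R1, R2, and omega are calibrated per explosive and are deliberately not reproduced here.

          # Standard JWL equation of state (coefficients are per-explosive fits)
          import math

          def jwl_pressure(V, E, A, B, R1, R2, omega):
              """V: relative volume v/v0; E: energy per unit initial volume."""
              return (A * (1.0 - omega / (R1 * V)) * math.exp(-R1 * V)
                      + B * (1.0 - omega / (R2 * V)) * math.exp(-R2 * V)
                      + omega * E / V)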

    15. Computer model for characterizing, screening, and optimizing electrolyte systems

      SciTech Connect (OSTI)

      2015-06-15

      Electrolyte systems in contemporary batteries are tasked with operating under increasing performance requirements. All battery operation is in some way tied to the electrolyte and how it interacts with various regions within the cell environment. Given that the electrolyte plays a crucial role in battery performance and longevity, it is imperative that accurate, physics-based models be developed that will characterize key electrolyte properties while keeping pace with the increasing complexity of these liquid systems. Advanced models are needed since laboratory measurements require significant resources to carry out for even a modest experimental matrix. The Advanced Electrolyte Model (AEM) developed at the INL is a proven capability designed to explore molecular-to-macroscale level aspects of electrolyte behavior, and can be used to drastically reduce the time required to characterize and optimize electrolytes. Although it is applied most frequently to lithium-ion battery systems, it is general in its theory and can be used toward numerous other targets and intended applications. This capability is unique, powerful, relevant to present and future electrolyte development, and without peer. It redefines electrolyte modeling for highly-complex contemporary systems, wherein significant steps have been taken to capture the reality of electrolyte behavior in the electrochemical cell environment. This capability can have a very positive impact on accelerating domestic battery development to support aggressive vehicle and energy goals in the 21st century.

    16. Computational modeling of drug-resistant bacteria. Final report

      SciTech Connect (OSTI)

      MacDougall, Preston

      2015-03-12

      Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.

    17. CASTING DEFECT MODELING IN AN INTEGRATED COMPUTATIONAL MATERIALS ENGINEERING APPROACH

      SciTech Connect (OSTI)

      Sabau, Adrian S [ORNL

      2015-01-01

      To accelerate the introduction of new cast alloys, the simultaneous modeling and simulation of multiphysical phenomena needs to be considered in the design and optimization of mechanical properties of cast components. The required models related to casting defects, such as microporosity and hot tears, are reviewed. Three aluminum alloys are considered: A356, 356, and 319. The data on calculated solidification shrinkage are presented and their effects on microporosity levels discussed. Examples are given for predicting microporosity defects and microstructure distribution for a plate casting. Models to predict fatigue life and yield stress are briefly highlighted here for the sake of completeness and to illustrate how the length scales of the microstructure features as well as porosity defects are taken into account for modeling the mechanical properties. Thus, data on casting defects, including microstructure features, are crucial for evaluating the final performance-related properties of the component. Acknowledgements: This work was performed under a Cooperative Research and Development Agreement (CRADA) with Nemak Inc. and Chrysler Co. for the project "High Performance Cast Aluminum Alloys for Next Generation Passenger Vehicle Engines." The author would also like to thank Amit Shyam for reviewing the paper and Andres Rodriguez of Nemak Inc. Research sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office, as part of the Propulsion Materials Program under contract DE-AC05-00OR22725 with UT-Battelle, LLC. Part of this research was conducted through the Oak Ridge National Laboratory's High Temperature Materials Laboratory User Program, which is sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program.

    18. Accelerated Climate Modeling for Energy | Argonne Leadership Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Facility. An example of a Category 5 hurricane simulated by the CESM at 13 km resolution. Precipitable water (gray scale) shows the detailed dynamical structure in the flow. Strong precipitation is overlaid in red. High resolution is necessary to simulate reasonable numbers of tropical cyclones, including Category 4 and 5 storms. Alan Scott and Mark Taylor, Sandia National Laboratories. Accelerated Climate Modeling

    19. Computational models for the berry phase in semiconductor quantum dots

      SciTech Connect (OSTI)

      Prabhakar, S.; Melnik, R. V. N.; Sebetci, A.

      2014-10-06

      By developing a new model and its finite element implementation, we analyze the Berry phase in low-dimensional semiconductor nanostructures, focusing on quantum dots (QDs). In particular, we solve the Schrödinger equation and investigate the evolution of the spin dynamics during the adiabatic transport of the QDs in the 2D plane along a circular trajectory. Based on this study, we reveal that the Berry phase is highly sensitive to the Rashba and Dresselhaus spin-orbit lengths.
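
      For reference, the quantity evaluated during such adiabatic transport is the standard Berry phase (the record does not give the discretization used in the finite element implementation):

          \gamma = i \oint_C \langle \psi(\mathbf{R}) \,|\, \nabla_{\mathbf{R}} \psi(\mathbf{R}) \rangle \cdot d\mathbf{R}

      where R parametrizes the position of the dot along the closed trajectory C.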

    20. Computer modeling of a CFB (circulating fluidized bed) gasifier

      SciTech Connect (OSTI)

      Gidaspow, D.; Ding, J.

      1990-06-01

      The overall objective of this investigation is to develop experimentally verified models for circulating fluidized bed (CFB) combustors. This report presents an extension of our cold flow modeling of a CFB given in our first quarterly report of this project and published in "Numerical Methods for Multiphase Flows", edited by I. Celik, D. Hughes, C. T. Crowe and D. Lankford, FED-Vol. 91, American Society of Mechanical Engineers, pp. 47-56 (1990). The title of the paper is "Multiphase Navier-Stokes Equation Solver" by D. Gidaspow, J. Ding and U.K. Jayaswal. To the two-dimensional code described in the above paper we added the energy equations and the conservation of species equations to describe production of synthesis gas from char. Under the simulation conditions the injected oxygen reacted near the inlet. The solid-gas mixing was sufficiently rapid that no undesirable hot spots were produced. This simulation illustrates the code's capability to model CFB reactors. 15 refs., 20 figs.

    1. Computer model for characterizing, screening, and optimizing electrolyte systems

      Energy Science and Technology Software Center (OSTI)

      2015-06-15

      Electrolyte systems in contemporary batteries are tasked with operating under increasing performance requirements. All battery operation is in some way tied to the electrolyte and how it interacts with various regions within the cell environment. Given that the electrolyte plays a crucial role in battery performance and longevity, it is imperative that accurate, physics-based models be developed that will characterize key electrolyte properties while keeping pace with the increasing complexity of these liquid systems. Advanced models are needed since laboratory measurements require significant resources to carry out for even a modest experimental matrix. The Advanced Electrolyte Model (AEM) developed at the INL is a proven capability designed to explore molecular-to-macroscale level aspects of electrolyte behavior, and can be used to drastically reduce the time required to characterize and optimize electrolytes. Although it is applied most frequently to lithium-ion battery systems, it is general in its theory and can be used toward numerous other targets and intended applications. This capability is unique, powerful, relevant to present and future electrolyte development, and without peer. It redefines electrolyte modeling for highly-complex contemporary systems, wherein significant steps have been taken to capture the reality of electrolyte behavior in the electrochemical cell environment. This capability can have a very positive impact on accelerating domestic battery development to support aggressive vehicle and energy goals in the 21st century.

    2. Baselines for Greenhouse Gas Reductions: Problems, Precedents...

      Open Energy Info (EERE)

      Baseline projection, GHG inventory, Pathways analysis. Resource Type: Publications, Lessons learned/best practices. Website: www.p2pays.orgref2221739.pdf References:...

    3. Tank waste remediation systems technical baseline database

      SciTech Connect (OSTI)

      Porter, P.E.

      1996-10-16

      This document includes a cassette tape that contains Hanford generated data for the Tank Waste Remediation Systems Technical Baseline Database as of October 09, 1996.

    4. Computational Human Performance Modeling For Alarm System Design

      SciTech Connect (OSTI)

      Jacques Hugo

      2012-07-01

      The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques, and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, analysts have relied on crude, paper-based methods for analyzing human tasks and workload that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model and to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
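
      A discrete event simulation of alarm handling of the kind described can be sketched in a few lines; the example below uses the simpy library, and the arrival rate and handling times are invented for illustration.

          # Operator serving an alarm queue (discrete event simulation sketch)
          import random
          import simpy

          def alarm_source(env, queue, rate=0.2):
              while True:
                  yield env.timeout(random.expovariate(rate))  # alarm inter-arrival, s
                  queue.put(env.now)                           # timestamp the alarm

          def operator(env, queue, handled):
              while True:
                  raised = yield queue.get()
                  yield env.timeout(random.uniform(5, 30))     # handling time, s
                  handled.append(env.now - raised)             # response latency

          env = simpy.Environment()
          queue, handled = simpy.Store(env), []
          env.process(alarm_source(env, queue))
          env.process(operator(env, queue, handled))
          env.run(until=3600)
          print(len(handled), sum(handled) / len(handled))     # count, mean latency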

    5. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices

      DOE Patents [OSTI]

      Gering, Kevin L

      2013-08-27

      A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell, and analyzes the mechanistic level model to estimate performance fade characteristics over the aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing the second exchange current density.
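
      One conventional way to use constant-current pulses that bracket an exchange current density, shown here only as a hedged illustration and not as the patented method, is to fit the Butler-Volmer relation to the measured pulse overpotentials:

          # Fit exchange current density i0 from pulse data (illustrative)
          import numpy as np
          from scipy.optimize import curve_fit

          F, R, T = 96485.0, 8.314, 298.15      # C/mol, J/(mol K), K

          def butler_volmer(eta, i0, alpha=0.5):
              return i0 * (np.exp(alpha * F * eta / (R * T))
                           - np.exp(-(1.0 - alpha) * F * eta / (R * T)))

          def fit_exchange_current(eta, i):
              """eta: overpotentials (V) at three or more pulse currents i (A/m^2)."""
              popt, _ = curve_fit(lambda e, i0: butler_volmer(e, i0), eta, i, p0=[1.0])
              return popt[0]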

    6. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

      SciTech Connect (OSTI)

      Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

      2008-09-01

      Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high consequence decision-making.

    7. A Variable Refrigerant Flow Heat Pump Computer Model in EnergyPlus

      SciTech Connect (OSTI)

      Raustad, Richard A.

      2013-01-01

      This paper provides an overview of the variable refrigerant flow heat pump computer model included with the Department of Energy's EnergyPlus™ whole-building energy simulation software. The mathematical model for a variable refrigerant flow heat pump operating in cooling or heating mode, and a detailed model for the variable refrigerant flow direct-expansion (DX) cooling coil, are described in detail.

    8. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research: Discovering, developing, and deploying computational and networking capabilities to analyze, model,...

    9. Verification of a VRF Heat Pump Computer Model in EnergyPlus

      SciTech Connect (OSTI)

      Nigusse, Bereket; Raustad, Richard

      2013-06-01

      This paper provides verification results of the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data, and found that the dual range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
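
      The bi-quadratic form referred to above is the standard EnergyPlus performance curve; a capacity or EIR modifier is evaluated as a function of indoor wet-bulb and outdoor dry-bulb temperature, with one coefficient set per operating range in the dual-range formulation.

          # EnergyPlus-style biquadratic performance curve (coefficients come from
          # fits to manufacturer data; the range split below is schematic)
          def biquadratic(x, y, a, b, c, d, e, f):
              return a + b * x + c * x * x + d * y + e * y * y + f * x * y

          def capacity_ratio(t_wb, t_odb, low_coeffs, high_coeffs, t_boundary):
              coeffs = low_coeffs if t_odb <= t_boundary else high_coeffs
              return biquadratic(t_wb, t_odb, *coeffs)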

    10. A Hybrid MPI/OpenMP Approach for Parallel Groundwater Model Calibration on Multicore Computers

      SciTech Connect (OSTI)

      Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan; Parker, Jack C.; Watson, David B; Jardine, Philip M

      2010-01-01

      Groundwater model calibration is becoming increasingly computationally time intensive. We describe a hybrid MPI/OpenMP approach that exploits two levels of parallelism in software and hardware to reduce calibration time on multicore computers with minimal parallelization effort. First, HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for a uranium transport model with over a hundred species involved in nearly a hundred reactions, and for a field-scale coupled flow and transport model. In the first application, a single parallelizable loop is identified that consumes over 97% of the total computational time. With a few lines of OpenMP compiler directives inserted into the code, the computational time is reduced by about a factor of ten on a compute node with 16 cores. The performance is further improved by selectively parallelizing a few more loops. For the field-scale application, parallelizable loops in 15 of the 174 subroutines in HGC5 are identified that take more than 99% of the execution time. By adding the preconditioned conjugate gradient solver and BICGSTAB, and using a coloring scheme to separate the elements, nodes, and boundary sides, the subroutines for finite element assembly, soil property update, and boundary condition application are parallelized, resulting in a speedup of about 10 on a 16-core compute node. The Levenberg-Marquardt (LM) algorithm is added to HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, a number of compute nodes equal to the number of adjustable parameters (when the forward difference is used for the Jacobian approximation), or twice that number (if the central difference is used), reduces the calibration time from days and weeks to a few hours for the two applications. This approach can be extended to global optimization schemes and Monte Carlo analysis, where thousands of compute nodes can be utilized efficiently.
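
      The node-count rule above (one process per adjustable parameter for a forward-difference Jacobian, twice that for central differences) can be illustrated with a small sketch. This uses Python's multiprocessing in place of MPI, and the toy forward model is hypothetical: it sketches the parallelization pattern, not HGC5 itself.

      ```python
      # Illustrative sketch: parallel forward-difference Jacobian, one worker
      # per adjustable parameter (central differences would double the count).
      import numpy as np
      from multiprocessing import Pool

      def forward_model(params):
          # Stand-in for an HGC5 forward run (hypothetical toy model).
          return np.array([params[0] ** 2 + params[1], np.sin(params[1])])

      def _column(args):
          # One finite-difference column of the Jacobian.
          params, base, j, h = args
          p = params.copy()
          p[j] += h
          return (forward_model(p) - base) / h

      def jacobian(params, h=1e-6):
          base = forward_model(params)  # shared unperturbed run
          with Pool(processes=len(params)) as pool:
              cols = pool.map(_column,
                              [(params, base, j, h) for j in range(len(params))])
          return np.column_stack(cols)

      if __name__ == "__main__":
          print(jacobian(np.array([1.0, 2.0])))
      ```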

    11. TWRS technical baseline database manager definition document

      SciTech Connect (OSTI)

      Acree, C.D.

      1997-08-13

      This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

    12. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

      SciTech Connect (OSTI)

      Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

      2009-10-12

      In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

    13. Computing Videos

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Videos

    14. Hanford Site technical baseline database. Revision 1

      SciTech Connect (OSTI)

      Porter, P.E.

      1995-01-27

      This report lists the Hanford-specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

    15. Systems, methods and computer-readable media to model kinetic performance of rechargeable electrochemical devices

      DOE Patents [OSTI]

      Gering, Kevin L.

      2013-01-01

      A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics. The computing system also analyzes the cell information of the electrochemical cell with a Butler-Volmer (BV) expression modified to determine exchange current density of the electrochemical cell by including kinetic performance information related to pulse-time dependence, electrode surface availability, or a combination thereof. A set of sigmoid-based expressions may be included with the modified-BV expression to determine kinetic performance as a function of pulse time. The determined exchange current density may be used with the modified-BV expression, with or without the sigmoid expressions, to analyze other characteristics of the electrochemical cell. Model parameters can be defined in terms of cell aging, making the overall kinetics model amenable to predictive estimates of cell kinetic performance along the aging timeline.
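
      As an illustration of the kind of expression involved, the sketch below evaluates the classical Butler-Volmer relation and scales it by a sigmoid pulse-time factor. The sigmoid form and every parameter value here are hypothetical stand-ins; the actual modified-BV expression and sigmoid set are defined in the patent itself.

      ```python
      # Sketch: classic Butler-Volmer relation with an illustrative sigmoid
      # pulse-time factor. All parameter values are placeholders.
      import math

      F = 96485.0  # C/mol, Faraday constant
      R = 8.314    # J/(mol K), gas constant

      def butler_volmer(i0, eta, T=298.15, alpha_a=0.5, alpha_c=0.5):
          """Current density from exchange current density i0 and overpotential eta."""
          return i0 * (math.exp(alpha_a * F * eta / (R * T))
                       - math.exp(-alpha_c * F * eta / (R * T)))

      def pulse_time_factor(t, t_mid=1.0, k=5.0):
          """Hypothetical sigmoid capturing pulse-time-dependent kinetics."""
          return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

      def modified_bv(i0, eta, t, T=298.15):
          # Illustrative combination only; not the patent's exact expression.
          return butler_volmer(i0, eta, T) * pulse_time_factor(t)

      print(modified_bv(i0=1e-3, eta=0.05, t=2.0))
      ```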

    16. Modeling of BWR core meltdown accidents - for application in the MELRPI. MOD2 computer code

      SciTech Connect (OSTI)

      Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

      1985-04-01

      This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously, concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

    17. DOE Issues Funding Opportunity for Advanced Computational and Modeling Research for the Electric Power System

      Broader source: Energy.gov [DOE]

      The objective of this Funding Opportunity Announcement (FOA) is to leverage scientific advancements in mathematics and computation for application to power system models and software tools, with the long-term goal of enabling real-time protection and control based on wide-area sensor measurements.

    18. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

      SciTech Connect (OSTI)

      Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

      2010-12-01

      The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model, which runs in the CFD solver software STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated, and the model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

    19. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

      DOE Patents [OSTI]

      Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

      2010-05-04

      A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.

    20. Computer modeling of electromagnetic edge containment in twin-roll casting

      SciTech Connect (OSTI)

      Chang, F.C.; Turner, L.R.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

      1998-07-01

      This paper presents modeling studies of magnetohydrodynamics (MHD) analysis in twin-roll casting. Argonne National Laboratory (ANL) and Inland Steel Company have worked together to develop a 3-D computer model that can predict eddy currents, fluid flows, and liquid metal containment for an electromagnetic (EM) edge containment device. This mathematical model can greatly shorten casting research on the use of EM fields for liquid metal containment and control. It can also optimize the existing casting processes and minimize expensive, time-consuming full-scale testing. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in EM edge dams designed at Inland Steel for twin-roll casting. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve Maxwell's equations, Ohm's law, the Navier-Stokes equations, and transport equations of turbulent flow in a casting process that uses EM fields. ELEKTRA is able to predict the eddy-current distribution and electromagnetic forces in complex geometry. CaPS-EM is capable of modeling fluid flows with free surfaces and dynamic rollers. The computed 3-D magnetic fields and induced eddy currents in ELEKTRA are used as input to flow-field computations in CaPS-EM. Results of the numerical simulation compared well with measurements obtained from both static and dynamic tests.

    1. Models the Electromagnetic Response of a 3D Distribution using MP COMPUTERS

      Energy Science and Technology Software Center (OSTI)

      1999-05-01

      EM3D models the electromagnetic response of a 3D distribution of conductivity, dielectric permittivity, and magnetic permeability within the earth for geophysical applications using massively parallel computers. The simulations are carried out in the frequency domain for either electric or magnetic sources, for either scattered- or total-field formulations of Maxwell's equations. The solution is based on the method of finite differences and includes absorbing boundary conditions so that responses can be modeled up into the radar range, where wave propagation is dominant. Recent upgrades to the software include the incorporation of finite-size sources, in addition to dipolar source fields, and a low-induction-number preconditioner that can significantly reduce computational run times. A graphical user interface (GUI) is bundled with the software so that complicated 3D models can be easily constructed and simulated. The GUI also allows for plotting of the output.

    2. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      SciTech Connect (OSTI)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-11-01

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

    3. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-11-01

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

    4. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

      SciTech Connect (OSTI)

      Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

      2011-06-01

      This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. national laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBAs), and beyond design basis accidents (BDBAs). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user base and the experimental validation base were decaying away quickly.

    5. DualTrust: A Trust Management Model for Swarm-Based Autonomic Computing Systems

      SciTech Connect (OSTI)

      Maiden, Wendy M.

      2010-05-01

      Trust management techniques must be adapted to the unique needs of the application architectures and problem domains to which they are applied. For autonomic computing systems that utilize mobile agents and ant colony algorithms for their sensor layer, certain characteristics of the mobile agent ant swarm -- their lightweight, ephemeral nature and indirect communication -- make this adaptation especially challenging. This thesis looks at the trust issues and opportunities in swarm-based autonomic computing systems and finds that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarm. After analyzing the applicability of trust management research as it has been applied to architectures with similar characteristics, this thesis specifies the required characteristics for trust management mechanisms used to monitor the trustworthiness of entities in a swarm-based autonomic computing system and describes a trust model that meets these requirements.

    6. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and...

    7. Superior model for fault tolerance computation in designing nano-sized circuit systems

      SciTech Connect (OSTI)

      Singh, N. S. S. Muthuvalu, M. S.; Asirvadam, V. S.

      2014-10-24

      As CMOS technology scales down to the nanometer regime, reliability becomes a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability instantly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first looks into the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Second, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than the reliability measure by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
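
      To illustrate the kind of computation a PGM performs, the sketch below propagates signal probabilities through a tiny noisy-NAND circuit in which each gate output is inverted with probability eps. The two-gate circuit and the error probability are illustrative only; the paper's tool generalizes this to arbitrary circuits and also implements BDEC.

      ```python
      # Minimal probabilistic gate model (PGM) sketch: each gate output is
      # inverted with probability EPS. Reliability is the probability that
      # the noisy output matches the fault-free output, averaged over
      # uniformly distributed inputs. Circuit and EPS are hypothetical.
      from itertools import product

      EPS = 0.05  # per-gate error probability (hypothetical)

      def nand_prob(pa, pb, eps):
          """P(output=1) for a noisy NAND with independent inputs, P(a=1)=pa, P(b=1)=pb."""
          ideal_one = 1.0 - pa * pb
          return (1.0 - eps) * ideal_one + eps * (1.0 - ideal_one)

      def circuit_reliability(eps=EPS):
          """Reliability of out = NAND(NAND(a, b), c) under the PGM."""
          rel = 0.0
          for a, b, c in product([0, 1], repeat=3):
              p1 = nand_prob(a, b, eps)              # P(first gate output = 1)
              p_out1 = nand_prob(p1, c, eps)         # P(final output = 1)
              fault_free = 1 - ((1 - (a & b)) & c)   # ideal circuit output
              rel += p_out1 if fault_free == 1 else (1.0 - p_out1)
          return rel / 8.0

      print(circuit_reliability())
      ```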

    8. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

      SciTech Connect (OSTI)

      J. Francfort; D. Karner

      2006-04-01

      The U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEV). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed-track testing to document the HEVs’ fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events and fuel use are recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.

    9. Compare Energy Use in Variable Refrigerant Flow Heat Pumps Field Demonstration and Computer Model

      SciTech Connect (OSTI)

      Sharma, Chandan; Raustad, Richard

      2013-06-01

      Variable Refrigerant Flow (VRF) heat pumps are often regarded as energy-efficient air-conditioning systems that offer electricity savings as well as a reduction in peak electric demand while providing improved individual zone setpoint control. One of the key advantages of VRF systems is minimal duct losses, which provide a significant reduction in energy use and duct space. However, limited data are available to show their actual performance in the field. Since VRF systems are increasingly gaining market share in the US, more actual field performance data for these systems are highly desirable. An effort was made in this direction to monitor VRF system performance over an extended period of time in a US national laboratory test facility. Due to increasing demand from the energy modeling community, an empirical model to simulate VRF systems was implemented in the building simulation program EnergyPlus. This paper presents the comparison of energy consumption as measured in the national laboratory and as predicted by the program. For increased accuracy in the comparison, a customized weather file was created using measured outdoor temperature and relative humidity at the test facility. Other inputs to the model included building construction, a VRF system model based on lab-measured performance, occupancy of the building, lighting and plug loads, and thermostat set-points. Infiltration model inputs were adjusted at the beginning to tune the computer model, and subsequent field measurements were then compared to the simulation results. Differences between the computer model results and the actual field measurements are discussed. The computer-generated VRF performance closely resembled the field measurements.

    10. GASFLOW: A Computational Fluid Dynamics Code for Gases, Aerosols, and Combustion, Volume 1: Theory and Computational Model

      SciTech Connect (OSTI)

      Nichols, B.D.; Mueller, C.; Necker, G.A.; Travis, J.R.; Spore, J.W.; Lam, K.L.; Royl, P.; Redlinger, R.; Wilson, T.L.

      1998-10-01

      Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containments and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior (1) in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion dominated flows; and (2) during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included. Volume III contains some of the assessments performed by LANL and FzK. GASFLOW is under continual development, assessment, and application by LANL and FzK. This manual is considered a living document and will be updated as warranted.

    11. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-07-28

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

    12. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop. W. Musial, M. Lawson, and S. Rooney, National Renewable Energy Laboratory, Technical Report NREL/TP-5000-57605, February 2013.

    13. NREL Computer Models Integrate Wind Turbines with Floating Platforms (Fact Sheet)

      SciTech Connect (OSTI)

      Not Available

      2011-07-01

      Far off the shores of energy-hungry coastal cities, powerful winds blow over the open ocean, where the water is too deep for today's seabed-mounted offshore wind turbines. For the United States to tap into these vast offshore wind energy resources, wind turbines must be mounted on floating platforms to be cost effective. Researchers at the National Renewable Energy Laboratory (NREL) are supporting that development with computer models that allow detailed analyses of such floating wind turbines.

    14. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop. W. Musial, M. Lawson, and S. Rooney, National Renewable Energy Laboratory, Technical Report NREL/TP-5000-57605, February 2013.

    15. Solid Waste Program technical baseline description

      SciTech Connect (OSTI)

      Carlson, A.B.

      1994-07-01

      A systems engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

    16. Waste management project technical baseline description

      SciTech Connect (OSTI)

      Sederburg, J.P.

      1997-08-13

      A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

    17. Review of the synergies between computational modeling and experimental characterization of materials across length scales

      SciTech Connect (OSTI)

      Dingreville, RĂ©mi; Karnesky, Richard A.; Puel, Guillaume; Schmitt, Jean -Hubert

      2015-11-16

      With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization, and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain greater insight into structure-property relationships and to study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion of the perspectives from various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than to “simply” support experimental work. This is illustrated by examples from several application areas on structural materials. The manuscript concludes with a discussion of some problems and open scientific questions that are being explored in order to advance this relatively new field of research.

    18. South Africa-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: South Africa-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government. Partner: Danish Ministry for...

    19. Brazil-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Brazil-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government. Partner: Danish...

    20. Mexico-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Mexico-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government. Partner: Danish...

    1. Indonesia-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: Indonesia-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government...

    2. India-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: India-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government. Partner: Danish...

    3. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

      Open Energy Info (EERE)

      Tool Summary. Name: UNFCCC-Consolidated baseline and monitoring methodology for landfill gas project activities...

    4. China-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: China-Danish Government Baseline Workstream. Agency/Company Organization: Danish Government. Partner: Danish...

    5. Efficient Computation of Info-Gap Robustness for Finite Element Models

      SciTech Connect (OSTI)

      Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.

      2012-07-05

      A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge from the standpoint of the required computational resources, because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatment of these info-gap problems using the adjoint methodology is outlined in detail, and the latter problem is solved for four separate finite element models. Compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
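
      For intuition, the sketch below computes an info-gap robustness function for a small Ax = b model by brute-force sampling, i.e., the statistical-sampling baseline the report compares against, not the adjoint method itself. The uncertainty model (entrywise interval bounds on A), the matrices, and the tolerance are illustrative assumptions.

      ```python
      # Illustrative info-gap robustness sketch for an Ax = b model: find the
      # largest uncertainty horizon alpha for which the worst-case deviation
      # of the solution stays within a critical tolerance. Toy values only.
      import numpy as np

      rng = np.random.default_rng(0)
      A0 = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      x0 = np.linalg.solve(A0, b)   # nominal solution
      TOL = 0.1                     # critical performance tolerance (hypothetical)

      def worst_case_error(alpha, n_samples=500):
          """Sampled worst-case ||x - x0|| over perturbations |dA_ij| <= alpha."""
          worst = 0.0
          for _ in range(n_samples):
              dA = alpha * rng.uniform(-1.0, 1.0, A0.shape)
              x = np.linalg.solve(A0 + dA, b)
              worst = max(worst, np.linalg.norm(x - x0))
          return worst

      def robustness(lo=0.0, hi=1.0, iters=20):
          """Bisection for the largest alpha with worst-case error <= TOL."""
          for _ in range(iters):
              mid = 0.5 * (lo + hi)
              lo, hi = (mid, hi) if worst_case_error(mid) <= TOL else (lo, mid)
          return lo

      print(robustness())
      ```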

    6. Computer modeling of electromagnetic fields and fluid flows for edge containment in continuous casting

      SciTech Connect (OSTI)

      Chang, F.C.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

      1996-02-01

      A computer model was developed to predict eddy currents and fluid flows in molten steel. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in electromagnetic (EM) edge dams (EMDs) designed at Inland Steel for twin-roll casting. The model can optimize the EMD design so it is suitable for application, and minimize expensive, time-consuming full-scale testing. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve heat transfer, fluid flow, and turbulence transport in a casting process that involves EM fields. ELEKTRA is able to predict the eddy-current distribution and the electromagnetic forces in complex geometries. CaPS-EM is capable of modeling fluid flows with free surfaces. Results of the numerical simulation compared well with measurements obtained from a static test.

    7. Computational fluid dynamics modeling of coal gasification in a pressurized spout-fluid bed

      SciTech Connect (OSTI)

      Zhongyi Deng; Rui Xiao; Baosheng Jin; He Huang; Laihong Shen; Qilei Song; Qianjun Li

      2008-05-15

      Computational fluid dynamics (CFD) modeling, which has recently proven to be an effective means of analysis and optimization of energy-conversion processes, has been extended to coal gasification in this paper. A 3D mathematical model has been developed to simulate the coal gasification process in a pressurized spout-fluid bed. This CFD model is composed of gas-solid hydrodynamics, coal pyrolysis, char gasification, and gas-phase reaction submodels. The rates of the heterogeneous reactions are determined by combining the Arrhenius rate and the diffusion rate. The homogeneous gas-phase reactions can be treated as secondary reactions. A comparison of the calculated and experimental data shows that most gasification performance parameters can be predicted accurately. This good agreement indicates that CFD modeling can be used for complex fluidized-bed coal gasification processes. 37 refs., 7 figs., 5 tabs.
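
      The combination of kinetic and diffusion rates mentioned above has a standard series-resistance form, sketched below. The rate expressions and parameter values are placeholders for illustration, not those of the paper's submodels.

      ```python
      # Sketch of a combined heterogeneous reaction rate: chemical kinetics
      # (Arrhenius) and film diffusion treated as resistances in series.
      # All parameter values are hypothetical.
      import math

      R_GAS = 8.314  # J/(mol K)

      def arrhenius_rate(T, A=1.0e4, Ea=1.3e5):
          """Chemical rate coefficient k_chem = A * exp(-Ea / (R*T))."""
          return A * math.exp(-Ea / (R_GAS * T))

      def diffusion_rate(T, C_diff=5.0e-12):
          """Illustrative film-diffusion rate coefficient with a weak T dependence."""
          return C_diff * T ** 0.75

      def effective_rate(T):
          """Series combination: 1/k_eff = 1/k_chem + 1/k_diff."""
          k_c, k_d = arrhenius_rate(T), diffusion_rate(T)
          return 1.0 / (1.0 / k_c + 1.0 / k_d)

      print(effective_rate(1200.0))  # slowest resistance dominates
      ```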

    8. An integrated computer modeling environment for regional land use, air quality, and transportation planning

      SciTech Connect (OSTI)

      Hanley, C.J.; Marshall, N.L.

      1997-04-01

      The Land Use, Air Quality, and Transportation Integrated Modeling Environment (LATIME) represents an integrated approach to computer modeling and simulation of land use allocation, travel demand, and mobile source emissions for the Albuquerque, New Mexico, area. This environment provides predictive capability combined with a graphical and geographical interface. The graphical interface shows the causal relationships between data and policy scenarios and supports alternative model formulations. Scenarios are launched from within a Geographic Information System (GIS), and data produced by each model component at each time step within a simulation is stored in the GIS. A menu-driven query system is utilized to review link-based results and regional and area-wide results. These results can also be compared across time or between alternative land use scenarios. Using this environment, policies can be developed and implemented based on comparative analysis, rather than on single-step future projections. 16 refs., 3 figs., 2 tabs.

    9. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

      SciTech Connect (OSTI)

      Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

      2015-12-21

      Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING) and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular in the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate “sub-ecosystem-scale” parameterizations.

    10. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; et al

      2015-12-21

      Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING) and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular in the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate “sub-ecosystem-scale” parameterizations.

    11. Baseline Microstructural Characterization of Outer 3013 Containers

      SciTech Connect (OSTI)

      Zapp, Phillip E.; Dunn, Kerry A

      2005-07-31

      Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

    12. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

      SciTech Connect (OSTI)

      Ermer, J. J.; Mosher, J. C.; Baillet, S.; Leahy, R. M.

      2001-01-01

      Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace-based inverse methods like MUSIC [6], the total number of forward model evaluations can often approach the order of 10^3 or 10^4. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form F = GQ + N, where the observed forward field F (M sensors x N time samples) is expressed in terms of the forward model G, a set of dipole moment(s) Q (3xP dipoles x N time samples), and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models [7] (or the fast approximations described in [1], [7]) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately of the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional spheres to those regions not covered by the primary sphere, at the cost of added complication in the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp, outer skull, and inner skull surfaces). Since accurate in vivo determination of internal conductivities is not currently possible, the head is typically assumed to consist of a set of contiguous isotropic regions, each with constant conductivity.
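
      The linear model F = GQ + N lends itself to a compact sketch: generate a forward field from a stand-in gain matrix and recover the dipole moments by least squares, the kind of fit that the inverse methods above repeat many times. The dimensions and the random gain matrix are illustrative; a real G would come from a spherical or realistic head model.

      ```python
      # Sketch of the linear spatio-temporal forward model F = G Q + N and a
      # least-squares estimate of dipole moments from noisy data. The random
      # gain matrix is a stand-in for a lead-field from a head model.
      import numpy as np

      rng = np.random.default_rng(1)
      M, P, N = 32, 2, 100  # sensors, dipoles, time samples (illustrative)

      G = rng.standard_normal((M, 3 * P))            # lead-field (stand-in)
      Q = rng.standard_normal((3 * P, N))            # dipole moment time series
      F = G @ Q + 0.1 * rng.standard_normal((M, N))  # forward data plus noise

      # Least-squares inverse for known dipole locations (fixed G):
      Q_hat, *_ = np.linalg.lstsq(G, F, rcond=None)
      print(np.linalg.norm(Q_hat - Q) / np.linalg.norm(Q))  # relative error
      ```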

    13. Energy Intensity Baselining and Tracking Guidance

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Energy Intensity Baselining and Tracking Guidance, Preface: The U.S. Department of Energy's (DOE's) Better Buildings, Better Plants Program (Better Plants) is a voluntary energy efficiency leadership initiative for U.S. manufacturers. The program encourages companies to commit to reduce the energy intensity of their U.S. manufacturing operations, usually by 25% over a 10-year period. Companies joining Better Plants are recognized by DOE for their

    14. Computational Fluid Dynamics (CFD) Modeling for High Rate Pulverized Coal Injection (PCI) into the Blast Furnace

      SciTech Connect (OSTI)

      Dr. Chenn Zhou

      2008-10-15

      Pulverized coal injection (PCI) into the blast furnace (BF) has been recognized as an effective way to decrease the coke and total energy consumption along with minimization of environmental impacts. However, increasing the amount of coal injected into the BF is currently limited by the lack of knowledge of some issues related to the process. It is therefore important to understand the complex physical and chemical phenomena in the PCI process. Due to the difficulty of obtaining true BF measurements, computational fluid dynamics (CFD) modeling has been identified as a useful technology to provide such knowledge. CFD simulation is powerful for providing detailed information on flow properties and performing parametric studies for process design and optimization. In this project, comprehensive 3-D CFD models have been developed to simulate the PCI process under actual furnace conditions. These models provide raceway size and flow property distributions. The results have provided guidance for optimizing the PCI process.

    15. Module 7 - Integrated Baseline Review and Change Control | Department of

      Energy Savers [EERE]

      Module 7 - Integrated Baseline Review and Change Control. This module focuses on integrated baseline reviews (IBR) and change control. This module outlines the objective and responsibility of an integrated baseline review. Additionally, this module will discuss the change control process required for implementing earned value

    16. Document Number Q0029500 Baseline Risk Assessment Update 4.0 Baseline Risk Assessment Update

      Office of Legacy Management (LM)

      4.0 Baseline Risk Assessment Update. This section updates the human health and the ecological risk assessments that were originally presented in the 1998 RI (DOE 1998a). The impacts on the 1998 risk assessments are summarized in Section 2.9. 4.1 Human Health Risk Assessment. Several activities completed since 1998 have contributed to changes in surface water and ground water concentrations. Activities that have impacted, or likely impacted, surface water and ground

    17. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

      SciTech Connect (OSTI)

      Oprisan, Sorinel Adrian; Oprisan, Ana

      2005-03-31

      Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well-defined, smooth benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrosis. The second sub-system consists of cytotoxic active (effector) cells (EC), with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutic, and radiotherapeutic treatments.

    18. Enabling a Highly-Scalable Global Address Space Model for Petascale Computing

      SciTech Connect (OSTI)

      Apra, Edoardo; Vetter, Jeffrey S; Yu, Weikuan

      2010-01-01

      Over the past decade, the trajectory to the petascale has been built on increased complexity and scale of the underlying parallel architectures. Meanwhile, software developers have struggled to provide tools that maintain the productivity of computational science teams using these new systems. In this regard, Global Address Space (GAS) programming models provide a straightforward and easy to use addressing model, which can lead to improved productivity. However, the scalability of GAS depends directly on the design and implementation of the runtime system on the target petascale distributed-memory architecture. In this paper, we describe the design, implementation, and optimization of the Aggregate Remote Memory Copy Interface (ARMCI) runtime library on the Cray XT5 2.3 PetaFLOPs computer at Oak Ridge National Laboratory. We optimized our implementation with the flow intimation technique that we have introduced in this paper. Our optimized ARMCI implementation improves scalability of both the Global Arrays (GA) programming model and a real-world chemistry application, NWChem, from small jobs up through 180,000 cores.

    19. Scope Management Baseline Development (FPM 208), Idaho | Department of

      Energy Savers [EERE]

      Scope Management Baseline Development (FPM 208), Idaho. March 29, 2016 8:00AM EDT to March 31, 2016 5:00PM EDT. Level 2 Required Course, 3 days / 24 CLPs. This course is designed to enhance a Program or Project Manager's ability to clearly define requirements and scope, develop a defensible baseline, and manage conformance to the baseline throughout the project life-cycle. The course emphasizes

    20. High-Performance Computer Modeling of the Cosmos-Iridium Collision

      SciTech Connect (OSTI)

      Olivier, S; Cook, K; Fasenfest, B; Jefferson, D; Jiang, M; Leek, J; Levatin, J; Nikolaev, S; Pertica, A; Phillion, D; Springer, K; De Vries, W

      2009-08-28

      This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

    1. Swelling in light water reactor internal components: Insights from computational modeling

      SciTech Connect (OSTI)

      Stoller, Roger E.; Barashev, Alexander V.; Golubov, Stanislav I.

      2015-08-01

      A modern cluster dynamics model has been used to investigate the materials and irradiation parameters that control microstructural evolution under the relatively low-temperature exposure conditions that are representative of the operating environment for in-core light water reactor components. The focus is on components fabricated from austenitic stainless steel. The model accounts for the synergistic interaction between radiation-produced vacancies and the helium that is produced by nuclear transmutation reactions. Cavity nucleation rates are shown to be relatively high in this temperature regime (275 to 325°C), but are sensitive to assumptions about the fine scale microstructure produced under low-temperature irradiation. The cavity nucleation rates observed run counter to the expectation that void swelling would not occur under these conditions. This expectation was based on previous research on void swelling in austenitic steels in fast reactors. This misleading impression arose primarily from an absence of relevant data. The results of the computational modeling are generally consistent with recent data obtained by examining ex-service components. However, it has been shown that the sensitivity of the model's predictions of low-temperature swelling behavior to assumptions about the primary damage source term and specification of the mean-field sink strengths is somewhat greater than that observed at higher temperatures. Further assessment of the mathematical model is underway to meet the long-term objective of this research, which is to provide a predictive model of void swelling at relevant lifetime exposures to support extended reactor operations.
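
      At its core, a cluster dynamics model of this kind evolves coupled rate equations for defect concentrations, with production, mutual recombination, and absorption at mean-field sinks. The sketch below shows only the simplest vacancy/interstitial balance; all parameter values are illustrative placeholders, not those of the model described above.

```python
# Minimal mean-field rate-theory sketch of the vacancy/interstitial balance
# that cluster dynamics models of swelling build upon. All parameter values
# are illustrative placeholders, not those of the model discussed above.
from scipy.integrate import solve_ivp

G = 1.0e-6     # defect production rate (per atom per second)
R = 1.0e12     # vacancy-interstitial recombination coefficient
k2v = 1.0e2    # lumped vacancy sink strength x diffusivity (1/s)
k2i = 1.2e2    # lumped interstitial sink strength x diffusivity (1/s)

def rates(t, y):
    cv, ci = y
    recomb = R * cv * ci
    return [G - recomb - k2v * cv,   # vacancies
            G - recomb - k2i * ci]   # interstitials

sol = solve_ivp(rates, (0.0, 1.0), [0.0, 0.0], method="LSODA")
cv, ci = sol.y[:, -1]
print(f"quasi-steady concentrations: cv={cv:.3e}, ci={ci:.3e}")
```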

    2. New Set of Computational Tools and Models Expected to Help Enable Rapid Development and Deployment of Carbon Capture Technologies

      Broader source: Energy.gov [DOE]

      An eagerly anticipated suite of 21 computational tools and models to help enable rapid development and deployment of new carbon capture technologies is now available from the Carbon Capture Simulation Initiative.

    3. Validation of the thermospheric vector spherical harmonic (VSH) computer model. Master's thesis

      SciTech Connect (OSTI)

      Davis, J.L.

      1991-01-01

      A semi-empirical computer model of the lower thermosphere has been developed that provides a description of the composition and dynamics of the thermosphere (Killeen et al., 1992). Input variables needed to run the VSH model include time, space, and geophysical conditions. One of the output variables the model provides, neutral density, is of particular interest to the U.S. Air Force. Neutral densities vary both as a result of changes in solar flux (e.g., the solar cycle) and as a result of changes in the magnetosphere (e.g., large changes occur in neutral density during geomagnetic storms). Satellites in Earth orbit experience aerodynamic drag due to the atmospheric density of the thermosphere. The variability in neutral density described above affects the drag a satellite experiences and can therefore change the satellite's orbital characteristics, making the satellite's position difficult to track. It is therefore particularly important to ensure that the accuracy of the model's neutral density is optimized for all input parameters. To accomplish this, a validation program was developed to evaluate the strengths and weaknesses of the model's density output by comparing it to SETA-2 (satellite electrostatic accelerometer) total mass density measurements.
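
      The tracking sensitivity described above follows from the standard drag relation a_D = (1/2) rho C_D (A/m) v^2, so any error in the modeled neutral density rho maps linearly into the drag acceleration. A quick order-of-magnitude check, with a placeholder density rather than actual VSH output:

```python
# Why neutral density matters for tracking: a quick drag estimate using
# the standard relation a_D = 0.5 * rho * Cd * (A/m) * v^2. The density
# below is a placeholder value, not VSH model output.
rho = 1.0e-12          # neutral density near ~400 km, kg/m^3 (placeholder)
cd = 2.2               # drag coefficient
area_to_mass = 0.01    # satellite area-to-mass ratio, m^2/kg
v = 7.7e3              # orbital speed, m/s

a_drag = 0.5 * rho * cd * area_to_mass * v**2
print(f"drag acceleration ~ {a_drag:.2e} m/s^2")
# Doubling rho during a geomagnetic storm doubles a_drag, which is why
# density errors translate directly into along-track position errors.
```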

    4. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

      SciTech Connect (OSTI)

      G. R. Odette; G. E. Lucas

      2005-11-15

      This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement ΔTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

    5. Why applicants should use computer simulation models to comply with the FERC's new merger policy

      SciTech Connect (OSTI)

      Frankena, M.W.; Morris, J.R.

      1997-02-01

      Computer models for electric utility use in complying with the US Federal Energy Regulatory Commission policy on mergers are described. Four types of simulation models that are widely used in the electric power industry are considered as tools for analyzing market power issues: dispatch/transportation models, dispatch/unit-commitment models, load-flow models, and load-flow/dispatch models. Basic model capabilities and limitations are described. Uses of the models for other purposes are also noted, including regulatory filings, antitrust litigation, and evaluation of pricing strategies.
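
      The dispatch-model families named above all rest on the same core computation: stack generating units in merit order of marginal cost until load is met, with the marginal unit setting the price. A minimal sketch, with invented units and costs:

```python
# Toy merit-order economic dispatch, the core of the "dispatch" model
# family mentioned above. Units, capacities, and costs are invented.

def dispatch(load_mw, units):
    """units: list of (name, capacity_mw, marginal_cost_usd_per_mwh)."""
    schedule, remaining = [], load_mw
    for name, cap, cost in sorted(units, key=lambda u: u[2]):
        take = min(cap, remaining)
        if take > 0:
            schedule.append((name, take, cost))
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("insufficient capacity to serve load")
    # The last dispatched (marginal) unit sets the clearing price.
    return schedule, schedule[-1][2]

units = [("coal", 500, 20.0), ("gas_cc", 400, 35.0), ("gas_ct", 200, 80.0)]
print(dispatch(750, units))  # coal 500 MW + gas_cc 250 MW, price 35 $/MWh
```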

    6. Subsurface Multiphase Flow and Multicomponent Reactive Transport Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

      2007-07-16

      Numerical modeling has become a critical tool to the U.S. Department of Energy for evaluating the environmental impact of alternative energy sources and remediation strategies for legacy waste sites. Unfortunately, the physical and chemical complexity of many sites overwhelms the capabilities of even most “state of the art” groundwater models. Of particular concern are the representation of highly-heterogeneous stratified rock/soil layers in the subsurface and the biological and geochemical interactions of chemical species within multiple fluid phases. Clearly, there is a need for higher-resolution modeling (i.e. more spatial, temporal, and chemical degrees of freedom) and increasingly mechanistic descriptions of subsurface physicochemical processes. We present SciDAC-funded research being performed in the development of PFLOTRAN, a parallel multiphase flow and multicomponent reactive transport model. Written in Fortran90, PFLOTRAN is founded upon PETSc data structures and solvers. We are employing PFLOTRAN in the simulation of uranium transport at the Hanford 300 Area, a contaminated site of major concern to the Department of Energy, the State of Washington, and other government agencies. By leveraging the billions of degrees of freedom available through high-performance computation using tens of thousands of processors, we can better characterize the release of uranium into groundwater and its subsequent transport to the Columbia River, and thereby better understand and evaluate the effectiveness of various proposed remediation strategies.
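
      PFLOTRAN itself is a parallel Fortran90 code built on PETSc; as a conceptual stand-in at toy scale, the sketch below solves the simplest one-dimensional form of the transport equations such codes discretize, dc/dt = -u dc/dx + D d2c/dx2 - lambda*c, with upwind advection and explicit time stepping. All coefficients are placeholders.

```python
# 1D advection-dispersion-decay solver as a conceptual stand-in for the
# transport equations parallel codes like PFLOTRAN solve at vastly larger
# scale. Grid and coefficients are illustrative placeholders.
import numpy as np

nx, dx, dt = 200, 1.0, 0.1            # cells, m, s
u, D, lam = 0.5, 0.05, 1.0e-4         # velocity m/s, dispersion m^2/s, 1/s
c = np.zeros(nx)
c[0] = 1.0                            # fixed-concentration inlet

for _ in range(2000):                 # 200 s of simulated time
    adv = -u * (c[1:-1] - c[:-2]) / dx                 # upwind advection
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # central dispersion
    c[1:-1] += dt * (adv + dif - lam * c[1:-1])
    c[0], c[-1] = 1.0, c[-2]          # inlet and outflow boundaries
print(c[::20])                        # concentration profile snapshot
```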

    7. COMPUTATIONAL THERMODYNAMIC MODELING OF HOT CORROSION OF ALLOYS HAYNES 242 AND HASTELLOY™ N FOR MOLTEN SALT SERVICE

      SciTech Connect (OSTI)

      Michael V. Glazoff; Piyush Sabharwall; Akira Tokuhiro

      2014-09-01

      An evaluation of the thermodynamic aspects of hot corrosion of the superalloys Haynes 242 and Hastelloy™ N in eutectic mixtures of KF and ZrF4 was carried out for development of the Advanced High Temperature Reactor (AHTR). This work models the behavior of several superalloys, potential candidates for the AHTR, using the computational thermodynamics tool ThermoCalc, leading to a thermodynamic description of the molten salt eutectic mixtures and, on that basis, a mechanistic prediction of hot corrosion. The results from these studies indicated that the principal mechanism of hot corrosion was chromium leaching for all of the superalloys described above. However, Hastelloy™ N displayed the best hot corrosion performance, which was not surprising given that it was developed originally to withstand the harsh conditions of the molten salt environment. The results obtained in this study provide confidence in the employed methods of computational thermodynamics and could be used in future alloy design efforts. Finally, several potential solutions to mitigate hot corrosion were proposed for further exploration, including coating development and controlled scaling of intermediate compounds in the KF-ZrF4 system.

    8. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

      SciTech Connect (OSTI)

      David P. Colton

      2007-02-28

      The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.
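
      The acquisition half of such a system reduces to logging time-tagged records that pair a gamma count rate with a GPS position. The sketch below is a hypothetical record format and logger; the field names and CSV layout are assumptions for illustration, not the actual ARCS-II data format.

```python
# Hypothetical sketch of an airborne radiological survey record: gamma
# count rate paired with GPS position and time. Field names and the CSV
# layout are assumptions, not the actual ARCS-II format.
import csv
import time
from dataclasses import dataclass, asdict

@dataclass
class SurveyRecord:
    timestamp: float          # UNIX seconds
    latitude: float           # degrees
    longitude: float          # degrees
    gross_counts_cps: float   # gamma gross count rate, counts/s

def log_records(records, path="survey_log.csv"):
    """Append-style writer for the analysis system to read back later."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[
            "timestamp", "latitude", "longitude", "gross_counts_cps"])
        writer.writeheader()
        for rec in records:
            writer.writerow(asdict(rec))

log_records([SurveyRecord(time.time(), 43.51, -112.05, 1250.0)])
```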

    9. Global Nuclear Energy Partnership Waste Treatment Baseline

      SciTech Connect (OSTI)

      Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

      2008-05-01

      The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

    10. Extraction of actinides by multi-dentate diamides and their evaluation with computational molecular modeling

      SciTech Connect (OSTI)

      Sasaki, Y.; Kitatsuji, Y.; Hirata, M.; Kimura, T.; Yoshizuka, K.

      2008-07-01

      Multi-dentate diamides have been synthesized and examined for actinide (An) extractions. Bi- and tridentate extractants are the focus in this work. The extraction of actinides was performed from 0.1-6 M HNO{sub 3} to organic solvents. It was obvious that N,N,N',N'-tetra-alkyl-diglycolamide (DGA) derivatives, 2,2'-(methylimino)bis(N,N-dioctyl-acetamide) (MIDOA), and N,N'-dimethyl-N,N'-dioctyl-2-(3-oxa-pentadecane)-malonamide (DMDOOPDMA) have relatively high D values (D(Pu) > 70). The following notable results using DGA extractants were obtained: (1) DGAs with short alkyl chains give higher D values than those with long alkyl chain, (2) DGAs with long alkyl chain have high solubility in n-dodecane. Computational molecular modeling was also used to elucidate the effects of structural and electronic properties of the reagents on their different extractabilities. (authors)
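
      The D values quoted are distribution ratios in the standard solvent-extraction sense; for equal phase volumes, the fraction extracted in a single contact follows directly:

```latex
% Distribution ratio of metal M between organic and aqueous phases:
D = \frac{[\mathrm{M}]_{\mathrm{org}}}{[\mathrm{M}]_{\mathrm{aq}}}
% Fraction extracted in one contact with equal phase volumes:
\%E = \frac{100\,D}{1 + D}
```

      So D(Pu) > 70 corresponds to more than 98% of the plutonium moving to the organic phase in a single equal-volume contact.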

    11. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

      SciTech Connect (OSTI)

      Kaper, Tasso J. Kramer, Mark A.; Rotstein, Horacio G.

      2013-12-15

      Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena, such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association and analysis of rhythms to diseases are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives about rhythms and the dynamic transitions between them that are associated with various diseases.

    12. Subsurface Multiphase Flow and Multicomponent Reactive Transport Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

      2007-08-01

      Numerical modeling has become a critical tool to the Department of Energy for evaluating the environmental impact of alternative energy sources and remediation strategies for legacy waste sites. Unfortunately, the physical and chemical complexity of many sites overwhelms the capabilities of even most “state of the art” groundwater models. Of particular concern are the representation of highly-heterogeneous stratified rock/soil layers in the subsurface and the biological and geochemical interactions of chemical species within multiple fluid phases. Clearly, there is a need for higher-resolution modeling (i.e. more spatial, temporal, and chemical degrees of freedom) and increasingly mechanistic descriptions of subsurface physicochemical processes. We present research being performed in the development of PFLOTRAN, a parallel multiphase flow and multicomponent reactive transport model. Written in Fortran90, PFLOTRAN is founded upon PETSc data structures and solvers and has exhibited impressive strong scalability on up to 4000 processors on the ORNL Cray XT3. We are employing PFLOTRAN in the simulation of uranium transport at the Hanford 300 Area, a contaminated site of major concern to the Department of Energy, the State of Washington, and other government agencies, where overly-simplistic historical modeling erroneously predicted decade-scale removal times for uranium by ambient groundwater flow. By leveraging the billions of degrees of freedom available through high-performance computation using tens of thousands of processors, we can better characterize the release of uranium into groundwater and its subsequent transport to the Columbia River, and thereby better understand and evaluate the effectiveness of various proposed remediation strategies.

    13. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The TRACC Computational Clusters: with the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster, now named Phoenix. Zephyr was acquired from Atipa Technologies, and it is a 92-node system with each node having two AMD ...

    14. Energy Intensity Baselining and Tracking Guidance | Department of Energy

      Office of Environmental Management (EM)

      The Energy Intensity Baselining and Tracking Guidance for the Better Buildings, Better Plants Program helps companies meet the program's reporting requirements by describing the steps necessary to develop an energy consumption and energy intensity baseline and to calculate consumption and intensity changes over time. Most of the calculation steps described ...

    15. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

      SciTech Connect (OSTI)

      Maranas, Costas D

      2012-05-21

      An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high-energy-density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism in not just skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
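
      Flux elucidation and strain design both build on flux balance analysis: choose the flux vector v that maximizes a target reaction subject to steady-state mass balance Sv = 0 and bounds on each flux. A minimal sketch with an invented three-reaction network:

```python
# Minimal flux-balance-analysis (FBA) sketch: maximize a target flux
# subject to steady-state mass balance S v = 0 and flux bounds. The
# three-reaction toy network below is invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Metabolite-by-reaction stoichiometry for A_ext -> A -> B (target drain).
S = np.array([
    [1, -1,  0],   # metabolite A: made by uptake r1, used by r2
    [0,  1, -1],   # metabolite B: made by r2, drained by target r3
])
c = np.array([0.0, 0.0, -1.0])       # maximize v3 == minimize -v3
bounds = [(0, 10), (0, 10), (0, 10)]  # flux bounds for r1, r2, r3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal target flux:", -res.fun, "flux vector:", res.x)
```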

    16. Wind Turbine Modeling for Computational Fluid Dynamics: December 2010 - December 2012

      SciTech Connect (OSTI)

      Tossas, L. A. M.; Leonardi, S.

      2013-07-01

      With fossil fuel supplies tightening and environmental awareness increasing, wind energy is becoming more and more important. As the market for wind energy grows, wind turbines and wind farms are becoming larger. Current utility-scale turbines extend a significant distance into the atmospheric boundary layer. Therefore, the interaction between the atmospheric boundary layer and the turbines and their wakes needs to be better understood. The turbulent wakes of upstream turbines affect the flow field of the turbines behind them, decreasing power production and increasing mechanical loading. With a better understanding of this type of flow, wind farm developers could plan better-performing, less maintenance-intensive wind farms. Simulating this flow using computational fluid dynamics is one important way to gain a better understanding of wind farm flows. In this study, we compare the performance of actuator disc and actuator line models in producing wind turbine wakes and the wake-turbine interaction between multiple turbines. We also examine parameters that affect the performance of these models, such as grid resolution, the use of a tip-loss correction, and the way in which the turbine force is projected onto the flow field.
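
      The actuator disc model compared above comes from one-dimensional momentum theory, in which a single axial induction factor a fixes the thrust and power coefficients:

```python
# One-dimensional momentum theory behind the actuator disc model: the
# axial induction factor a sets thrust and power coefficients, with the
# Betz optimum at a = 1/3.
def actuator_disc(a):
    ct = 4 * a * (1 - a)         # thrust coefficient
    cp = 4 * a * (1 - a) ** 2    # power coefficient
    return ct, cp

for a in (0.2, 1 / 3, 0.4):
    ct, cp = actuator_disc(a)
    print(f"a={a:.3f}  Ct={ct:.3f}  Cp={cp:.3f}")
# a = 1/3 gives Cp = 16/27 ~ 0.593, the Betz limit.
```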

    17. Computational Nanophotonics: Model Optical Interactions and Transport in Tailored Nanosystem Architectures

      SciTech Connect (OSTI)

      Stockman, Mark; Gray, Steven

      2014-02-21

      The program is directed toward development of new computational approaches to photoprocesses in nanostructures whose geometry and composition are tailored to obtain desirable optical responses. The emphasis of this specific program is on the development of computational methods and prediction and computational theory of new phenomena of optical energy transfer and transformation on the extreme nanoscale (down to a few nanometers).

    18. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

      SciTech Connect (OSTI)

      Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

      2006-10-01

      Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
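
      A minimal sketch of such a sampling strategy, under invented focal elements and an invented model: each interval-valued focal element is pushed through the model by sampling, and its mass contributes to belief only when the entire sampled image satisfies the output event, but to plausibility whenever any part of it can.

```python
# Sampling-based propagation of an evidence-theory (Dempster-Shafer)
# input: focal elements are intervals with masses; belief/plausibility of
# an output event accumulate from their sampled images. The model and
# the focal elements are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    return x ** 2 + 1.0

focal_elements = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
threshold = 3.0  # event of interest: model output exceeds threshold

bel = pl = 0.0
for (lo, hi), mass in focal_elements:
    ys = model(rng.uniform(lo, hi, 10_000))  # sampled image of the element
    if ys.min() > threshold:   # whole image satisfies event -> belief
        bel += mass
    if ys.max() > threshold:   # some of the image may satisfy it -> plaus.
        pl += mass
print(f"Bel={bel:.2f}  Pl={pl:.2f}")  # belief is a lower bound, Bel <= Pl
```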

    19. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, X.

      1996-12-17

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.
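
      A numerical analogue of the patent's sample-and-hold idea, with invented signal parameters: hold the baseline measured just before the gate opens, then integrate the difference over the gate so the DC offset does not accumulate in the integral.

```python
# Numerical analogue of sample-and-hold baseline subtraction: hold the
# baseline measured just before the gate opens, then integrate
# (signal - baseline) over the gate. Signal parameters are invented.
import numpy as np

fs = 1.0e6                                   # sample rate, Hz
t = np.arange(0.0, 2.0e-3, 1.0 / fs)         # 2 ms of data
gate = (t > 1.0e-3) & (t < 1.2e-3)           # 0.2 ms integrating window
pulse = np.where(gate, 1.0, 0.0)             # 1 V pulse inside the gate
offset = 0.05                                # DC baseline offset, V
rng = np.random.default_rng(1)
v = offset + pulse + 1.0e-3 * rng.standard_normal(t.size)

baseline = v[t < 1.0e-3][-100:].mean()       # sample-and-hold before gate
naive = v[gate].sum() / fs                   # integral with offset error
corrected = (v[gate] - baseline).sum() / fs  # offset-free integral
print(f"naive = {naive:.3e} V*s, baseline-subtracted = {corrected:.3e} V*s")
```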

    20. LTC vacuum blasting machine (concrete): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the testing demonstration took place outdoors, which may make the results unrepresentative: dust and noise levels are likely to be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    1. LTC vacuum blasting machine (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    2. Pentek metal coating removal system: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER{reg_sign}, and VAC-PAC{reg_sign}. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER{reg_sign} uses solid needles for descaling activities. These hand tools are used with the VAC-PAC{reg_sign} vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    3. Baseline air quality study at Fermilab

      SciTech Connect (OSTI)

      Dave, M.J.; Charboneau, R.

      1980-10-01

      Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously-monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies are also presented.

    4. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, Xucheng (Lisle, IL)

      1996-01-01

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

    5. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

      SciTech Connect (OSTI)

      Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

      2015-01-15

      Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm{sup 3}) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm{sup 3}, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm{sup 3}, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0.28 ± 0.03 mm, and 1.06 ± 0.40 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the premolar were 37.95 ± 10.13 mm{sup 3}, 92.45 ± 2.29%, 0.29 ± 0.06 mm, 0.33 ± 0.10 mm, and 1.28 ± 0.72 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the molar were 52.38 ± 17.27 mm{sup 3}, 94.12 ± 1.38%, 0.30 ± 0.08 mm, 0.35 ± 0.17 mm, and 1.52 ± 0.75 mm, respectively. The computation time of the proposed method for segmenting CBCT images of one subject was 7.25 ± 0.73 min. Compared with two other methods, the proposed method achieves significant improvement in terms of accuracy. Conclusions: The presented tooth segmentation method can be used to segment tooth contours from CT images accurately and efficiently.
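
      Of the metrics above, the Dice similarity coefficient is the most commonly reimplemented; for binary masks A and B it is DSC = 2|A ∩ B| / (|A| + |B|). A small self-contained check with synthetic masks:

```python
# Dice similarity coefficient (DSC) for two binary segmentation masks,
# DSC = 2|A ∩ B| / (|A| + |B|). The masks below are synthetic squares,
# not CT data, and serve only as a self-contained check.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.zeros((64, 64), bool)
auto[16:48, 16:48] = True        # "automatic" segmentation
manual = np.zeros((64, 64), bool)
manual[20:52, 16:48] = True      # shifted "manual" reference
print(f"DSC = {100 * dice(auto, manual):.2f}%")  # 87.50% for this overlap
```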

    6. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      SciTech Connect (OSTI)

      Musial, W.; Lawson, M.; Rooney, S.

      2013-02-01

      The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9–10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community, and to collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways from the workshop and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts, supply discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest what the most pressing MHK technology needs are and how the U.S. Department of Energy (DOE) and national laboratory resources can be utilized to assist the marine energy industry in the most effective manner.

    7. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      SciTech Connect (OSTI)

      Musial, W.; Lawson, M.; Rooney, S.

      2013-02-01

      The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9-10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community and collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts and discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest how the U.S. Department of Energy and national laboratory resources can be utilized to most effectively assist the marine energy industry.

    8. Transfer matrix computation of critical polynomials for two-dimensional Potts models

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Jacobsen, Jesper Lykke; Scullard, Christian R.

      2013-02-04

      In our previous work, we showed that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial P_B(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K − 1 of P_B(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, P_B(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of P_B(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to, respectively, 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742 489 (4), v_c(kagome) = 1.876 459 7 (2), and v_c(3, 12^2) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
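
      In this formulation the temperature variable and the coupling are linked by v = e^K − 1, so each real root v_c of the critical polynomial yields a critical coupling directly:

```latex
% Temperature variable and coupling; a root v_c of the critical
% polynomial gives the critical coupling K_c directly:
v = e^{K} - 1, \qquad P_B(q, v_c) = 0
  \;\Longrightarrow\; K_c = \ln\left(1 + v_c\right)
```

      For example, the q = 3 kagome value above gives K_c = ln(1 + 1.8764597) ≈ 1.057.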

    9. Computer modeling of electrical and thermal performance during bipolar pulsed radiofrequency for pain relief

      SciTech Connect (OSTI)

      Pérez, Juan J.; Pérez-Cajaraville, Juan J.; Muñoz, Víctor; Berjano, Enrique

      2014-07-15

      Purpose: Pulsed RF (PRF) is a nonablative technique for treating neuropathic pain. Bipolar PRF application is currently aimed at creating a “strip lesion” to connect the electrode tips; however, the electrical and thermal performance during bipolar PRF is currently unknown. The objective of this paper was to study the temperature and electric field distributions during bipolar PRF. Methods: The authors developed computer models to study temperature and electric field distributions during bipolar PRF and to assess the possible ablative thermal effect caused by the accumulated temperature spikes, along with any possible electroporation effects caused by the electrical field. The authors also modeled the bipolar ablative mode, known as bipolar Continuous Radiofrequency (CRF), in order to compare both techniques. Results: There were important differences between CRF and PRF in terms of electrical and thermal performance. In bipolar CRF: (1) the initial temperature of the tissue affects the temperature evolution and hence the thermal lesion dimension; and (2) at 37 °C, 6 min of bipolar CRF creates a strip thermal lesion between the electrodes when these are separated by a distance of up to 20 mm. In bipolar PRF: (1) an interelectrode distance shorter than 5 mm produces thermal damage (i.e., ablative effect) in the intervening tissue after 6 min of bipolar RF; and (2) the possible electroporation effect (electric fields higher than 150 kV m{sup −1}) would be exclusively circumscribed to a very small zone of tissue around the electrode tip. Conclusions: The results suggest that (1) the clinical parameters considered to be suitable for bipolar CRF should not necessarily be considered valid for bipolar PRF, and vice versa; and (2) the ablative effect of the CRF mode is mainly due to its much greater level of delivered energy than is the case in PRF, and therefore at the same applied energy level, CRF and PRF are expected to result in the same outcomes in terms of thermal damage zone dimension.
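
      Such simulations typically couple a quasi-static electrical problem to a heat equation with Joule heating as the source term; the abstract does not give the authors' exact formulation, so the equations below are the standard form (with the perfusion and metabolic terms of the full Pennes bioheat equation omitted):

```latex
% Quasi-static electrical problem for the applied voltage V:
\nabla \cdot \left( \sigma \nabla V \right) = 0
% Heat equation with Joule heating as the source term:
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right) + \sigma \,\lvert \nabla V \rvert^{2}
```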

    10. NREL: Climate Neutral Research Campuses - Determine Baseline Energy Consumption

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      To create a climate action plan for your research campus, begin by determining current energy consumption and the resulting greenhouse gas emissions. You can then break down emissions by sector. It is important to understand the following at the beginning. The importance of a baseline: "The baseline inventory also provides a common data set for establishing benchmarks and priorities during the strategic planning stage and a means for ...

    11. Mid-Atlantic Baseline Studies Project | Department of Energy

      Energy Savers [EERE]

      Mid-Atlantic Baseline Studies Project Mid-Atlantic Baseline Studies Project Funded by the Department of Energy, along with a number of partners, the collaborative Mid-Atlantic Baseline Studies Project, led by the Biodiversity Research Institute (BRI), helps improve understanding of species composition and use of the Mid-Atlantic marine environment in order to promote more sustainable offshore wind development. This first-of-its-kind study along the Eastern Seaboard of the United States delivers

    12. ENERGY STAR Portfolio Manager Baseline Year Instructions

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Baseline Year" Time frame Select "Multiple Properties" Using filters, choose properties to include in report Check box to Select all filtered properties Select these reporting items for the template Generate a new report using the template you created Once the report has been generated, download it as an Excel file Open downloaded "Baseline Year" report, select all and copy In report spreadsheet, choose the "Baseline

    13. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

      SciTech Connect (OSTI)

      Jablonowski, Christiane

      2015-07-14

      The research investigates and advances strategies how to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway how to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids have been developed that serve as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
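
      The common thread of the AMR themes above is a refinement criterion that flags cells where the solution varies rapidly (for example, near a tropical cyclone) and refines only those. A toy, framework-free illustration of gradient-based flagging, with an invented field and threshold:

```python
# Toy gradient-based refinement flagging of the kind AMR frameworks use:
# cells whose local gradient magnitude exceeds a threshold are marked for
# refinement. The test field and threshold are invented for illustration.
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 128)
field = np.outer(np.ones(64), np.sin(3.0 * x))  # smooth 2D test field
gy, gx = np.gradient(field)                     # per-cell finite differences
flagged = np.hypot(gx, gy) > 0.10               # refinement criterion
print(f"{flagged.mean():.1%} of cells flagged for refinement")
```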

    14. Chile-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Kenya, Mexico, South Africa, Thailand and Vietnam), to share practices on setting national greenhouse gas emissions baseline scenarios. The aim of the workstream is to...

    15. Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios...

      Open Energy Info (EERE)

      National Greenhouse Gas Emissions Baseline Scenarios: Learning from Experiences in Developing Countries. Name: Ethiopia-National Greenhouse Gas Emissions...

    16. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

      Broader source: Energy.gov (indexed) [DOE]

      May 27, 2015. EA-1943: Draft Environmental Assessment, Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois and the...

    17. Updates to the International Linear Collider Damping Rings Baseline...

      Office of Scientific and Technical Information (OSTI)

      Updates to the International Linear Collider Damping Rings Baseline Design...

    18. EA-1943: Construction and Operation of the Long Baseline Neutrino...

      Office of Environmental Management (EM)

      EA-1943: Construction and Operation of the Long Baseline Neutrino Experiment at Fermilab, Batavia, Illinois, and Sanford Underground Research Facility, Lead, South Dakota...

    19. Cost and Performance Comparison Baseline for Fossil Energy Power...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective To establish baseline performance and cost estimates for today's...

    20. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2008-09-12

      The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

    1. South Africa - Greenhouse Gas Emission Baselines and Reduction...

      Open Energy Info (EERE)

      ...from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    2. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

      Open Energy Info (EERE)

      ...from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    3. NETL - Bituminous Baseline Performance and Cost Interactive Tool...

      Open Energy Info (EERE)

      from the Cost and Performance Baseline for Fossil Energy Plants - Bituminous Coal and Natural Gas to Electricity report. The tool provides an interactive summary of the full...

    4. Sandia Energy - Scaled Wind Farm Technology Facility Baselining...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scaled Wind Farm Technology Facility Baselining Project Accelerates Work...

    5. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

      SciTech Connect (OSTI)

      M.F. Simpson; K.-R. Kim

      2010-12-01

      In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL). And Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in trouble-shooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally-light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.

    6. Baseline Assessment of TREAT for Modeling and Analysis Needs

      SciTech Connect (OSTI)

      Bess, John Darrell; DeHart, Mark David

      2015-10-01

      The Transient Reactor Test Facility (TREAT) is an air-cooled, thermal, heterogeneous test facility designed to evaluate reactor fuels and structural materials under conditions simulating various types of nuclear excursions and transient undercooling situations that could occur in a nuclear reactor. Fuel meltdowns, metal-water reactions, thermal interaction between overheated fuel and coolant, and the transient behavior of ceramic fuel for high temperature systems can be investigated [Freund 1958]. The contribution by TREAT to reactor safety includes the following: 1) provision of basic data to predict the safety margin of fuel designs and the severity of potential accidents, 2) service as a proving ground for fuel concepts designed to reduce or prevent consequent hazards, and 3) provision of nondestructive test data via neutron radiography of fuel samples irradiated in other test reactors [Sachs 1974]. The major unique features of TREAT include large flux integral absorption due to its high heat capacity; inherent, essentially instantaneous, temperature-dependent shutdown mechanism; rapid control rod movement; and visual access to the core center [MacFarlane 1958]. TREAT’s primary purpose was to simulate accident conditions leading to fuel damage, including melting or vaporization of the test specimens, while leaving the reactor’s own fuel undamaged. During steady-state operation, TREAT could be utilized as a large neutron-radiography facility and could examine assemblies up to 15 ft. in length. Unique shielded viewing slots allowed for both optical and gamma camera systems to record reactive mechanisms occurring during the experiment on film for detailed study. A fast-neutron hodoscope was a key diagnostic instrument that, by collimating and detecting fission neutrons emitted by experiment fuel samples, could provide time and spatial resolution of fuel motion during transients and in-place measurement of fuel distribution before and after an experiment. The reactor was operated from February 1959 until April 1994, generating over 720 MWh of energy. The reactor underwent a major upgrade in 1988, which included installation of new instrumentation and control systems, with refurbishment of the rod drive systems. The only major difference, in-core, between the preupgrade core and the upgraded core was the change in the number and locations of the control rods; all other neutronics behavior is essentially unchanged.

    7. Model Baseline Fire Department/Fire Protection Engineering Assessment

      Broader source: Energy.gov [DOE]

      The purpose of the document is to comprehensively delineate and rationalize the roles and responsibilities of the Fire Department and Fire Protection (Engineering).

    8. Sandia National Laboratories: Advanced Simulation and Computing...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Crack Modeling: The Computational Systems & Software Environment program builds integrated,...

    9. Electrochemistry Diagnostics of Baseline and New Materials | Department of

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting. PDF: es033_kostecki_2012_o.pdf

    10. Electrochemistry Diagnostics of Baseline and New Materials | Department of

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE Hydrogen and Fuel Cells Program, and Vehicle Technologies Program Annual Merit Review and Peer Evaluation. PDF: es033_kostecki_2011_p.pdf

    11. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

      2015-08-05

      Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
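
      The persistence baseline referenced above is trivial to state but essential to beat: the forecast for time t + h is simply the observation at time t. A sketch with synthetic data (any candidate forecasting method should at least outperform this baseline):

```python
# Persistence baseline forecast: y_hat(t + h) = y(t). The power series
# below is synthetic and stands in for measured plant output.
import numpy as np

def persistence_forecast(power: np.ndarray, horizon: int) -> np.ndarray:
    return power[:-horizon]   # shift observations forward by the horizon

rng = np.random.default_rng(0)
power = np.clip(np.sin(np.linspace(0, 20, 500))
                + 0.1 * rng.standard_normal(500), 0, None)
h = 4
forecast, actual = persistence_forecast(power, h), power[h:]
rmse = np.sqrt(np.mean((forecast - actual) ** 2))
print(f"persistence RMSE at horizon {h}: {rmse:.3f} (normalized units)")
```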

    12. Overview of Computer-Aided Engineering of Batteries and Introduction to Multi-Scale, Multi-Dimensional Modeling of Li-Ion Batteries (Presentation)

      SciTech Connect (OSTI)

      Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.; Lee, K. J.

      2012-05-01

      This 2012 Annual Merit Review presentation gives an overview of the Computer-Aided Engineering of Batteries (CAEBAT) project and introduces the Multi-Scale, Multi-Dimensional model for modeling lithium-ion batteries for electric vehicles.

    13. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

      1994-01-01

      This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

    14. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

      SciTech Connect (OSTI)

      Boring, Ronald L.; Joe, Jeffrey C.

      2015-02-01

      For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

    15. Computation of Domain-Averaged Irradiance with a Simple Two-Stream Radiative Transfer Model Including Vertical Cloud Property Correlations

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computation of Domain-Averaged Irradiance with a Simple Two-Stream Radiative Transfer Model Including Vertical Cloud Property Correlations S. Kato Center for Atmospheric Sciences Hampton University Hampton, Virginia Introduction Recent development of remote sensing instruments by the Atmospheric Radiation Measurement (ARM) Program provides information on the spatial and temporal variability of cloud structures. However, it is not clear what cloud properties are required to express complicated cloud

    16. Coupling of Mechanical Behavior of Cell Components to Electrochemical-Thermal Models for Computer- Aided Engineering of Batteries under Abuse

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Coupling of Mechanical Behavior of Cell Components to Electrochemical-Thermal Models for Computer-Aided Engineering of Batteries under Abuse. P.I.: Ahmad Pesaran. Team: Tomasz Wierzbicki and Elham Sahraei (MIT); Genong Li and Lewis Collins (ANSYS); M. Sprague, G. H. Kim, and S. Santhanagopalan (NREL). June 17, 2014. Project ID: ES199; NREL/PR-5400-61885. Project start: October 2013. This presentation does not contain any proprietary, confidential, or otherwise restricted information.

    17. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      McClanahan, Richard; De Leon, Phillip L.

      2014-08-20

      The majority of state-of-the-art speaker recognition systems (SR) utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of the identified systems, the posterior probabilities and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM-UBM which uses Runnalls' Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we also reduce this computation by a factor of 15x while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.
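
      A minimal sketch of the tree-structured shortlisting idea (not the authors' implementation; the toy UBM, index-based clustering, and all names are illustrative): score a frame against cluster centroids first, then evaluate only the components in the best clusters.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy diagonal-covariance UBM: 64 components in a 10-dim feature space.
      n_comp, dim, n_clusters = 64, 10, 8
      means = rng.normal(size=(n_comp, dim))
      log_w = np.full(n_comp, -np.log(n_comp))

      def log_gauss(x, mu):
          # Unit-variance diagonal Gaussian log-density (illustrative).
          return -0.5 * np.sum((x - mu) ** 2, axis=-1) - 0.5 * dim * np.log(2 * np.pi)

      # Layer 1: group components into clusters of 8 by index (a real system
      # would cluster the means, e.g. via Runnalls' mixture reduction).
      clusters = means.reshape(n_clusters, -1, dim)
      centroids = clusters.mean(axis=1)

      def posteriors_shortlist(x, top=2):
          # Score cluster centroids, descend only into the `top` best clusters.
          best = np.argsort(log_gauss(x, centroids))[-top:]
          idx = np.concatenate([np.arange(c * 8, c * 8 + 8) for c in best])
          ll = log_w[idx] + log_gauss(x, means[idx])
          post = np.zeros(n_comp)
          post[idx] = np.exp(ll - ll.max()) / np.exp(ll - ll.max()).sum()
          return post  # sufficient-statistics accumulation would use these

      frame = rng.normal(size=dim)
      print(posteriors_shortlist(frame).nonzero()[0])  # only 16 of 64 evaluated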

    18. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

      SciTech Connect (OSTI)

      Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

      1993-10-01

      The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

    19. A non-CFD modeling system for computing 3D wind and concentration fields in urban environments

      SciTech Connect (OSTI)

      Nelson, Matthew A; Brown, Michael J; Williams, Michael D; Gowardhan, Akshay; Pardyjak, Eric R

      2010-01-01

      The Quick Urban & Industrial Complex (QUIC) Dispersion Modeling System has been developed to rapidly compute the transport and dispersion of toxic agent releases in the vicinity of buildings. It is composed of an empirical-diagnostic wind solver, an 'urbanized' Lagrangian random-walk model, and a graphical user interface. The code has been used for homeland security and environmental air pollution applications. In this paper, we discuss the wind solver methodology and improvements made to the original Roeckle schemes in order to better capture flow fields in dense built-up areas. The model-computed wind and concentration fields are then compared to measurements from several field experiments. Improvements to the QUIC Dispersion Modeling System have been made to account for the inhomogeneous and complex building layouts found in large cities. The logic that has been introduced into the code is described and comparisons of model output to full-scale outdoor urban measurements in Oklahoma City and New York City are given. Although far from perfect, the model agreed fairly well with measurements and in many cases performed equally to CFD codes.

    20. Modeling

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      New Project Is the ACME of Computer Science to Address Climate Change. Sandia high-performance computing (HPC) researchers are working with DOE and 14 other national laboratories and institutions to develop and apply the most complete climate and Earth system model, to address the most challenging and

    1. Flow field computation of the NREL S809 airfoil using various turbulence models

      SciTech Connect (OSTI)

      Chang, Y.L.; Yang, S.L.; Arici, O. [Michigan Technological Univ., Houghton, MI (United States). Mechanical Engineering-Engineering Mechanics Dept.

      1996-10-01

      Performance comparison of three popular turbulence models, namely the Baldwin-Lomax algebraic model, Chien's low-Reynolds-number κ-ε model, and Wilcox's low-Reynolds-number κ-ω model, is given. These models were applied to calculate the flow field around the National Renewable Energy Laboratory S809 airfoil using a Total Variation Diminishing (TVD) scheme. Numerical results of C_P, C_L, and C_D are presented along with the Delft experimental data. It is shown that all three models perform well for attached flow, i.e., no flow separation at low angles of attack. However, at high angles of attack with flow separation, convergence characteristics show Wilcox's model outperforms the other models. Results of this study will be used to guide the authors in their dynamic stall research.

    2. Application of high performance computing to automotive design and manufacturing: Composite materials modeling task technical manual for constitutive models for glass fiber-polymer matrix composites

      SciTech Connect (OSTI)

      Simunovic, S; Zacharia, T

      1997-11-01

      This report provides a theoretical background for three constitutive models for a continuous strand mat (CSM) glass fiber-thermoset polymer matrix composite. The models were developed during fiscal years 1994 through 1997 as a part of the Cooperative Research and Development Agreement, "Application of High-Performance Computing to Automotive Design and Manufacturing." The constitutive relations are fully derived in the framework of the continuum program DYNA3D and have been used for the simulation and impact analysis of CSM composite tubes. The analysis of simulation and experimental results shows that the model based on the strain tensor split yields the most accurate results of the three implemented models. The parameters used in the models and their derivation from the physical tests are documented.

    3. Biodiversity Research Institute Mid-Atlantic Baseline Study Webinar

      Broader source: Energy.gov [DOE]

      Carried out by the Biodiversity Research Institute (BRI) and funded by the U.S. Department of Energy Wind and Water Power Technology Office and other partners, the goal of the Mid-Atlantic Baseline...

    4. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2011-09-23

      This guide identifies key Performance Baseline (PB) elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development.

    5. Cost and Performance Baseline for Fossil Energy Plants Volume...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2), available at www.netl.doe.gov.

    6. Making a Computer Model of the Most Complex System Ever Built - Continuum

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Continuum Magazine | NREL. David Mooney, director for NREL's Strategic Energy Analysis Center (left), and Robin Newmark, NREL's associate director for Energy Analysis and Decision Support (right), examine an advanced systems analysis of the impacts of high penetrations of renewable energy on the U.S. electrical grid. Photo by Dennis

    7. CFD [computational fluid dynamics] And Safety Factors. Computer modeling of complex processes needs old-fashioned experiments to stay in touch with reality.

      SciTech Connect (OSTI)

      Leishear, Robert A.; Lee, Si Y.; Poirier, Michael R.; Steeper, Timothy J.; Ervin, Robert C.; Giddings, Billy J.; Stefanko, David B.; Harp, Keith D.; Fowley, Mark D.; Van Pelt, William B.

      2012-10-07

      Computational fluid dynamics (CFD) is recognized as a powerful engineering tool. That is, CFD has advanced over the years to the point where it can now give us deep insight into the analysis of very complex processes. There is a danger, though, that an engineer can place too much confidence in a simulation. If a user is not careful, it is easy to believe that if you plug in the numbers, the answer comes out, and you are done. This assumption can lead to significant errors. As we discovered in the course of a study on behalf of the Department of Energy's Savannah River Site in South Carolina, CFD models fail to capture some of the large variations inherent in complex processes. These variations, or scatter, in experimental data emerge from physical tests and are inadequately captured or expressed by calculated mean values for a process. This anomaly between experiment and theory can lead to serious errors in engineering analysis and design unless a correction factor, or safety factor, is experimentally validated. For this study, blending times for the mixing of salt solutions in large storage tanks were the process of concern under investigation. This study focused on the blending processes needed to mix salt solutions to ensure homogeneity within waste tanks, where homogeneity is required to control radioactivity levels during subsequent processing. Two of the requirements for this task were to determine the minimum number of submerged, centrifugal pumps required to blend the salt mixtures in a full-scale tank in half a day or less, and to recommend reasonable blending times to achieve nearly homogeneous salt mixtures. A full-scale, low-flow pump with a total discharge flow rate of 500 to 800 gpm was recommended with two opposing 2.27-inch diameter nozzles. To make this recommendation, both experimental and CFD modeling were performed. Lab researchers found that, although CFD provided good estimates of an average blending time, experimental blending times varied significantly from the average.
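
      The safety-factor logic described above can be made concrete with a small sketch (hypothetical numbers, not the study's data): compare the CFD mean blending time with the spread of experimental measurements and derive a multiplicative factor.

      import numpy as np

      # Hypothetical blending-time data (hours); not the study's measurements.
      cfd_mean_blend_time = 4.2
      experimental_times = np.array([3.8, 4.5, 5.1, 4.9, 6.0, 4.4, 5.6, 5.3])

      # Bound the observed scatter with an upper percentile of the experiments.
      upper_95 = np.percentile(experimental_times, 95)

      # Safety factor scales the CFD estimate to cover experimental scatter.
      safety_factor = upper_95 / cfd_mean_blend_time
      design_blend_time = cfd_mean_blend_time * safety_factor

      print(f"Safety factor: {safety_factor:.2f}")
      print(f"Design blending time: {design_blend_time:.1f} h (vs CFD {cfd_mean_blend_time} h)")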

    8. COMPARATIVE COMPUTATIONAL MODELING OF AIRFLOWS AND VAPOR DOSIMETRY IN THE RESPIRATORY TRACTS OF RAT, MONKEY, AND HUMAN

      SciTech Connect (OSTI)

      Corley, Richard A.; Kabilan, Senthil; Kuprat, Andrew P.; Carson, James P.; Minard, Kevin R.; Jacob, Rick E.; Timchalk, Charles; Glenny, Robb W.; Pipavath, Sudhaker; Cox, Timothy C.; Wallis, Chris; Larson, Richard; Fanucchi, M.; Postlewait, Ed; Einstein, Daniel R.

      2012-07-01

      Coupling computational fluid dynamics (CFD) with physiologically based pharmacokinetic (PBPK) models is useful for predicting site-specific dosimetry of airborne materials in the respiratory tract and elucidating the importance of species differences in anatomy, physiology, and breathing patterns. Historically, these models were limited to discrete regions of the respiratory system. CFD/PBPK models have now been developed for the rat, monkey, and human that encompass airways from the nose or mouth to the lung. A PBPK model previously developed to describe acrolein uptake in nasal tissues was adapted to the extended airway models as an example application. Model parameters for each anatomic region were obtained from the literature, measured directly, or estimated from published data. Airflow and site-specific acrolein uptake patterns were determined under steady-state inhalation conditions to provide direct comparisons with prior data and nasal-only simulations. Results confirmed that regional uptake was dependent upon airflow rates and acrolein concentrations with nasal extraction efficiencies predicted to be greatest in the rat, followed by the monkey, then the human. For human oral-breathing simulations, acrolein uptake rates in oropharyngeal and laryngeal tissues were comparable to nasal tissues following nasal breathing under the same exposure conditions. For both breathing modes, higher uptake rates were predicted for lower tracheo-bronchial tissues of humans than either the rat or monkey. These extended airway models provide a unique foundation for comparing dosimetry across a significantly more extensive range of conducting airways in the rat, monkey, and human than prior CFD models.

    9. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

      SciTech Connect (OSTI)

      Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

      2015-01-01

      The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

    10. Annual Technology Baseline (Including Supporting Data); NREL (National

      Office of Scientific and Technical Information (OSTI)

      Conference: Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory). Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain

    11. Annual Technology Baseline (Including Supporting Data); NREL (National

      Office of Scientific and Technical Information (OSTI)

      Conference: Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory).

    12. Computational modeling of electrostatic charge and fields produced by hypervelocity impact

      SciTech Connect (OSTI)

      Crawford, David A.

      2015-05-19

      Following prior experimental evidence of electrostatic charge separation, electric and magnetic fields produced by hypervelocity impact, we have developed a model of electrostatic charge separation based on plasma sheath theory and implemented it into the CTH shock physics code. Preliminary assessment of the model shows good qualitative and quantitative agreement between the model and prior experiments at least in the hypervelocity regime for the porous carbonate material tested. The model agrees with the scaling analysis of experimental data performed in the prior work, suggesting that electric charge separation and the resulting electric and magnetic fields can be a substantial effect at larger scales, higher impact velocities, or both.

    13. Computational model, method, and system for kinetically-tailoring multi-drug chemotherapy for individuals

      DOE Patents [OSTI]

      Gardner, Shea Nicole (San Leandro, CA)

      2007-10-23

      A method and system for tailoring treatment regimens to individual patients with diseased cells exhibiting evolution of resistance to such treatments. A mathematical model is provided which models rates of population change of proliferating and quiescent diseased cells using cell kinetics and evolution of resistance of the diseased cells, and pharmacokinetic and pharmacodynamic models. Cell kinetic parameters are obtained from an individual patient and applied to the mathematical model to solve for a plurality of treatment regimens, each having a quantitative efficacy value associated therewith. A treatment regimen may then be selected from the plurality of treatment options based on the efficacy value.
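
      As a hedged illustration of the kind of cell-kinetics model described (a minimal two-compartment sketch with made-up parameters, not the patented model): proliferating cells P grow and exchange with quiescent cells Q, and drug exposure kills proliferating cells preferentially.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative rate constants (1/day); not from the patent.
      growth, to_q, to_p, kill_p, kill_q = 0.4, 0.1, 0.05, 0.8, 0.05

      def cell_kinetics(t, y):
          P, Q = y
          drug = 1.0 if (t % 7.0) < 1.0 else 0.0   # one dose-day per week
          dP = growth * P - to_q * P + to_p * Q - kill_p * drug * P
          dQ = to_q * P - to_p * Q - kill_q * drug * Q
          return [dP, dQ]

      sol = solve_ivp(cell_kinetics, (0, 28), [1e9, 1e8], max_step=0.1)
      print(f"Tumor burden after 28 days: {sol.y.sum(axis=0)[-1]:.3e} cells")

      # Comparing final burden across candidate schedules gives each regimen's
      # efficacy value, from which the best regimen would be selected.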

    14. Protein superfamily members as targets for computer modeling: The carbohydrate recognition domain of a macrophage lectin

      SciTech Connect (OSTI)

      Stenkamp, R.E.; Aruffo, A.; Bajorath, J.

      1996-12-31

      Members of protein superfamilies display similar folds, but share only limited sequence identity, often 25% or less. Thus, it is not straightforward to apply standard homology modeling methods to construct reliable three-dimensional models of such proteins. A three-dimensional model of the carbohydrate recognition domain of the rat macrophage lectin, a member of the calcium-dependent (C-type) lectin superfamily, has been generated to illustrate how information provided by comparison of X-ray structures and sequence-structure alignments can aid in comparative modeling when primary sequence similarities are low. 20 refs., 4 figs.

    15. Multigroup computation of the temperature-dependent Resonance Scattering Model (RSM) and its implementation

      SciTech Connect (OSTI)

      Ghrayeb, S. Z.; Ouisloumen, M.; Ougouag, A. M.; Ivanov, K. N.

      2012-07-01

      A multi-group formulation for the exact neutron elastic scattering kernel is developed. This formulation is intended for implementation into a lattice physics code. The correct accounting for the crystal lattice effects influences the estimated values for the probability of neutron absorption and scattering, which in turn affect the estimation of core reactivity and burnup characteristics. A computer program has been written to test the formulation for various nuclides. Results of the multi-group code have been verified against the correct analytic scattering kernel. In both cases neutrons were started at various energies and temperatures and the corresponding scattering kernels were tallied. (authors)

    16. Predicting oropharyngeal tumor volume throughout the course of radiation therapy from pretreatment computed tomography data using general linear models

      SciTech Connect (OSTI)

      Yock, Adam D.; Kudchadker, Rajat J.; Rao, Arvind; Dong, Lei; Beadle, Beth M.; Garden, Adam S.; Court, Laurence E.

      2014-05-15

      Purpose: The purpose of this work was to develop and evaluate the accuracy of several predictive models of variation in tumor volume throughout the course of radiation therapy. Methods: Nineteen patients with oropharyngeal cancers were imaged daily with CT-on-rails for image-guided alignment per an institutional protocol. The daily volumes of 35 tumors in these 19 patients were determined and used to generate (1) a linear model in which tumor volume changed at a constant rate, (2) a general linear model that utilized the power fit relationship between the daily and initial tumor volumes, and (3) a functional general linear model that identified and exploited the primary modes of variation between time series describing the changing tumor volumes. Primary and nodal tumor volumes were examined separately. The accuracy of these models in predicting daily tumor volumes were compared with those of static and linear reference models using leave-one-out cross-validation. Results: In predicting the daily volume of primary tumors, the general linear model and the functional general linear model were more accurate than the static reference model by 9.9% (range: −11.6%–23.8%) and 14.6% (range: −7.3%–27.5%), respectively, and were more accurate than the linear reference model by 14.2% (range: −6.8%–40.3%) and 13.1% (range: −1.5%–52.5%), respectively. In predicting the daily volume of nodal tumors, only the 14.4% (range: −11.1%–20.5%) improvement in accuracy of the functional general linear model compared to the static reference model was statistically significant. Conclusions: A general linear model and a functional general linear model trained on data from a small population of patients can predict the primary tumor volume throughout the course of radiation therapy with greater accuracy than standard reference models. These more accurate models may increase the prognostic value of information about the tumor garnered from pretreatment computed tomography images and facilitate improved treatment management.
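
      A minimal sketch of the power-fit idea behind the general linear model (synthetic volumes, illustrative only; not the paper's fitted coefficients): regress log daily volume on log initial volume across patients, one fit per treatment day.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic initial tumor volumes (cm^3) for 19 patients.
      v0 = rng.uniform(5.0, 40.0, size=19)

      # Synthetic "observed" day-20 volumes: power-law shrinkage plus noise.
      v20 = 0.8 * v0**0.95 * rng.lognormal(0.0, 0.05, size=19)

      # Fit log(V_t) = log(a) + b * log(V_0) by least squares.
      A = np.column_stack([np.ones_like(v0), np.log(v0)])
      (log_a, b), *_ = np.linalg.lstsq(A, np.log(v20), rcond=None)

      def predict(v_init):
          return np.exp(log_a) * v_init**b

      print(f"a = {np.exp(log_a):.3f}, b = {b:.3f}")
      print(f"Predicted day-20 volume for V0 = 25 cm^3: {predict(25.0):.1f} cm^3")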

    17. Climate Models: Rob Jacob | Argonne National Laboratory

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    18. Use of model calibration to achieve high accuracy in analysis of computer networks

      DOE Patents [OSTI]

      Frogner, Bjorn; Guarro, Sergio; Scharf, Guy

      2004-05-11

      A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
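
      A toy sketch of the calibration idea (illustrative names, distributions, and delay model, not the patented method): characterize measured background load, build an empirical quantile representation, and use it to widen a nominal delay prediction.

      import numpy as np

      rng = np.random.default_rng(2)

      # Measured background load samples (Mbps) collected over time (synthetic).
      background_load = rng.lognormal(mean=3.0, sigma=0.4, size=2000)

      # Probabilistic representation: empirical quantiles of the load.
      q50, q95 = np.percentile(background_load, [50, 95])

      # Nominal model: delay grows with load relative to capacity (illustrative).
      capacity_mbps = 100.0
      def predicted_delay_ms(load):
          utilization = min(load / capacity_mbps, 0.99)
          return 5.0 / (1.0 - utilization)   # simple M/M/1-style scaling

      # Calibrated prediction reports both typical and high-load delays.
      print(f"Typical delay:  {predicted_delay_ms(q50):.1f} ms")
      print(f"95th-pct delay: {predicted_delay_ms(q95):.1f} ms")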

    19. Computational modeling of electrostatic charge and fields produced by hypervelocity impact

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Crawford, David A.

      2015-05-19

      Following prior experimental evidence of electrostatic charge separation, electric and magnetic fields produced by hypervelocity impact, we have developed a model of electrostatic charge separation based on plasma sheath theory and implemented it into the CTH shock physics code. Preliminary assessment of the model shows good qualitative and quantitative agreement between the model and prior experiments at least in the hypervelocity regime for the porous carbonate material tested. The model agrees with the scaling analysis of experimental data performed in the prior work, suggesting that electric charge separation and the resulting electric and magnetic fields can be a substantial effectmore » at larger scales, higher impact velocities, or both.« less

    20. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

      SciTech Connect (OSTI)

      Not Available

      1992-09-01

      The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbons (C3/C4) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing, and in coping with the low H2/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

    1. Multiproject baselines for evaluation of electric power projects

      SciTech Connect (OSTI)

      Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

      2003-03-12

      Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
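
      The arithmetic of such a baseline is simple; the sketch below (hypothetical plant data, not the paper's case studies) computes a multiproject baseline emissions rate as the generation-weighted average of recent plants and applies it to a candidate project.

      # Hypothetical recent-plant data for one grid; not from the paper.
      plants = [
          # (annual generation GWh, emissions rate kg C per kWh)
          (1200.0, 0.26),   # coal
          (800.0, 0.11),    # combined-cycle gas
          (300.0, 0.0),     # hydro
      ]

      gen_total = sum(g for g, _ in plants)
      baseline_rate = sum(g * r for g, r in plants) / gen_total  # kg C/kWh

      # A 100 GWh/yr wind project displaces baseline-rate generation.
      project_gwh, project_rate = 100.0, 0.0
      reduction_tC = (baseline_rate - project_rate) * project_gwh * 1e6 / 1e3

      print(f"Baseline rate: {baseline_rate:.3f} kg C/kWh")
      print(f"Emissions reduction: {reduction_tC:,.0f} t C/yr")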

    2. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC RESEARCH: Computational Fluid Dynamics. Overview of CFD: Video Clip with Audio. Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical

    3. Fort Drum integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

      1992-12-01

      The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

    4. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

      1993-02-01

      The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

    5. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z

      SciTech Connect (OSTI)

      Jennings, C. A.; Ampleford, D. J.; Lamppa, D. C.; Hansen, S. B.; Jones, B.; Harvey-Thompson, A. J.; Jobe, M.; Strizic, T.; Reneker, J.; Rochau, G. A.; Cuneo, M. E.

      2015-05-15

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    6. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z.

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Jennings, Christopher A.; Ampleford, David J.; Lamppa, Derek C.; Hansen, Stephanie B.; Jones, Brent Manley; Harvey-Thompson, Adam James; Jobe, Marc Ronald Lee; Reneker, Joseph; Rochau, Gregory A.; Cuneo, Michael Edward; et al

      2015-05-18

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Furthermore, guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    7. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z.

      SciTech Connect (OSTI)

      Jennings, Christopher A.; Ampleford, David J.; Lamppa, Derek C.; Hansen, Stephanie B.; Jones, Brent Manley; Harvey-Thompson, Adam James; Jobe, Marc Ronald Lee; Reneker, Joseph; Rochau, Gregory A.; Cuneo, Michael Edward; Strizic, T.

      2015-05-18

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Furthermore, guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    8. Computational modeling of structure of metal matrix composite in centrifugal casting process

      SciTech Connect (OSTI)

      Zagorski, Roman [Department of Electrotechnology, Faculty of Materials Science and Metallurgy, Silesian University of Technology, ul. Krasinskiego 8, 40-019, Katowice (Poland)

      2007-04-07

      The structure of an alumina matrix composite reinforced with crystalline particles obtained during the centrifugal casting process is studied. Several parameters of the casting process, such as pouring temperature, rotating speed, and size of the casting mould, which influence the structure of the composite, are examined. Segregation of the crystalline particles depends on other factors, such as the gradient of density of the liquid matrix and reinforcement, thermal processes connected with solidifying of the cast, and processes leading to changes in the physical and structural properties of the liquid composite; these are also investigated. All simulations are carried out with the CFD program Fluent. Numerical simulations are performed using the FLUENT two-phase free surface (air and matrix) unsteady flow model (volume of fluid model - VOF) and discrete phase model (DPM)

    9. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    10. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      SciTech Connect (OSTI)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.
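
      For intuition about why baseline and energy matter in such an experiment, the sketch below uses the textbook two-flavor vacuum oscillation probability, P = sin^2(2θ) · sin^2(1.27 Δm² L / E), with Δm² in eV², L in km, and E in GeV (representative parameter values; not an LBNE/LBNF calculation).

      import numpy as np

      def p_oscillation(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
          # Two-flavor appearance probability (textbook vacuum approximation).
          return sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

      # The probability peaks when 1.27 * dm2 * L / E = pi/2.
      for L in (295.0, 810.0, 1300.0):   # T2K-, NOvA-, LBNF-like baselines (km)
          E_peak = 1.27 * 2.5e-3 * L / (np.pi / 2)
          print(f"L = {L:6.0f} km: first oscillation maximum near E = {E_peak:.2f} GeV,"
                f" P = {p_oscillation(L, E_peak):.3f}")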

    11. DFT modeling of adsorption onto uranium metal using large-scale parallel computing

      SciTech Connect (OSTI)

      Davis, N.; Rizwan, U.

      2013-07-01

      There is a dearth of atomistic simulations involving the surface chemistry of γ-uranium which is of interest as the key fuel component of a breeder-burner stage in future fuel cycles. Recent availability of high-performance computing hardware and software has rendered extended quantum chemical surface simulations involving actinides feasible. With that motivation, data for bulk and surface γ-phase uranium metal are calculated in the plane-wave pseudopotential density functional theory method. Chemisorption of atomic hydrogen and oxygen on several un-relaxed low-index faces of γ-uranium is considered. The optimal adsorption sites (calculated cohesive energies) on the (100), (110), and (111) faces are found to be the one-coordinated top site (8.8 eV), four-coordinated center site (9.9 eV), and one-coordinated top1 site (7.9 eV) respectively, for oxygen; and the four-coordinated center site (2.7 eV), four-coordinated center site (3.1 eV), and three-coordinated top2 site (3.2 eV) for hydrogen. (authors)

    12. Computational Nanophotonics: modeling optical interactions and transport in tailored nanosystem architectures

      SciTech Connect (OSTI)

      Schatz, George; Ratner, Mark

      2014-02-27

      This report describes research by George Schatz and Mark Ratner that was done over the period 10/03-5/09 at Northwestern University. This research project was part of a larger research project with the same title led by Stephen Gray at Argonne. A significant amount of our work involved collaborations with Gray, and there were many joint publications as summarized later. In addition, a lot of this work involved collaborations with experimental groups at Northwestern, Argonne, and elsewhere. The research was primarily concerned with developing theory and computational methods that can be used to describe the interaction of light with noble metal nanoparticles (especially silver) that are capable of plasmon excitation. Classical electrodynamics provides a powerful approach for performing these studies, so much of this research project involved the development of methods for solving Maxwell’s equations, including both linear and nonlinear effects, and examining a wide range of nanostructures, including particles, particle arrays, metal films, films with holes, and combinations of metal nanostructures with polymers and other dielectrics. In addition, our work broke new ground in the development of quantum mechanical methods to describe plasmonic effects based on the use of time dependent density functional theory, and we developed new theory concerned with the coupling of plasmons to electrical transport in molecular wire structures. Applications of our technology were aimed at the development of plasmonic devices as components of optoelectronic circuits, plasmons for spectroscopy applications, and plasmons for energy-related applications.

    13. Unveiling Stability Criteria of DNA-Carbon Nanotubes Constructs by Scanning Tunneling Microscopy and Computational Modeling

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Kilina, Svetlana; Yarotski, Dzmitry A.; Talin, A. Alec; Tretiak, Sergei; Taylor, Antoinette J.; Balatsky, Alexander V.

      2011-01-01

      We present a combined approach that relies on computational simulations and scanning tunneling microscopy (STM) measurements to reveal morphological properties and stability criteria of carbon nanotube-DNA (CNT-DNA) constructs. Application of STM allows direct observation of very stable CNT-DNA hybrid structures with the well-defined DNA wrapping angle of 63.4° and a coiling period of 3.3 nm. Using force field simulations, we determine how the DNA-CNT binding energy depends on the sequence and binding geometry of a single strand DNA. This dependence allows us to quantitatively characterize the stability of a hybrid structure with an optimal π-stacking between DNA nucleotides and the tube surface and better interpret STM data. Our simulations clearly demonstrate the existence of a very stable DNA binding geometry for (6,5) CNT as evidenced by the presence of a well-defined minimum in the binding energy as a function of an angle between DNA strand and the nanotube chiral vector. This novel approach demonstrates the feasibility of CNT-DNA geometry studies with subnanometer resolution and paves the way towards complete characterization of the structural and electronic properties of drug-delivering systems based on DNA-CNT hybrids as a function of DNA sequence and a nanotube chirality.

    14. COMPUTATIONAL AND EXPERIMENTAL MODELING OF THREE-PHASE SLURRY-BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Isaac K. Gamwo; Dimitri Gidaspow

      1999-09-01

      Considerable progress has been achieved in understanding three-phase reactors from the point of view of kinetic theory. In a paper in press for publication in Chemical Engineering Science (Wu and Gidaspow, 1999) we have obtained a complete numerical solution of bubble column reactors. In view of the complexity of the simulation, a better understanding of the processes using simplified analytical solutions is required. Such analytical solutions are presented in the attached paper, Large Scale Oscillations or Gravity Waves in Risers and Bubbling Beds. This paper presents analytical solutions for bubbling frequencies and standing wave flow patterns. The flow patterns in operating slurry bubble column reactors are not optimum. They involve upflow in the center and downflow at the walls. It may be possible to control flow patterns by proper redistribution of heat exchangers in slurry bubble column reactors. We also believe that the catalyst size in operating slurry bubble column reactors is not optimum. To obtain an optimum size we are following up on the observation of George Cody of Exxon, who reported a maximum granular temperature (random particle kinetic energy) for a particle size of 90 microns. The attached paper, Turbulence of Particles in a CFB and Slurry Bubble Columns Using Kinetic Theory, supports George Cody's observations. However, our explanation for the existence of the maximum in granular temperature differs from that proposed by George Cody. Further computer simulations and experiments involving measurements of granular temperature are needed to obtain a sound theoretical explanation for the possible existence of an optimum catalyst size.

    15. Computational Modeling of Fluid Flow through a Fracture in Permeable Rock

      SciTech Connect (OSTI)

      Crandall, Dustin; Ahmadi, Goodarz; Smith, Duane H

      2010-01-01

      Laminar, single-phase, finite-volume solutions to the Navier–Stokes equations of fluid flow through a fracture within permeable media have been obtained. The fracture geometry was acquired from computed tomography scans of a fracture in Berea sandstone, capturing the small-scale roughness of these natural fluid conduits. First, the roughness of the two-dimensional fracture profiles was analyzed and shown to be similar to Brownian fractal structures. The permeability and tortuosity of each fracture profile was determined from simulations of fluid flow through these geometries with impermeable fracture walls. A surrounding permeable medium, assumed to obey Darcy’s Law with permeabilities from 0.2 to 2,000 millidarcies, was then included in the analysis. A series of simulations for flows in fractured permeable rocks was performed, and the results were used to develop a relationship between the flow rate and pressure loss for fractures in porous rocks. The resulting friction factor, which accounts for the fracture geometric properties, is similar to the cubic law; it has the potential to be of use in discrete fracture reservoir-scale simulations of fluid flow through highly fractured geologic formations with appreciable matrix permeability. The observed fluid flow from the surrounding permeable medium to the fracture was significant when the resistance within the fracture and the medium were of the same order. An increase of more than 5% in the volumetric flow rate within the fracture profile was observed for flows within high-permeability fractured porous media.
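
      The cubic law referenced above is easy to state; the sketch below (illustrative values, not the study's data) computes the volumetric flow through a smooth parallel-plate fracture, Q = (w h³ / 12 μ) · ΔP / L.

      # Cubic-law flow through a smooth parallel-plate fracture (illustrative).
      aperture_m = 200e-6        # fracture aperture h (200 microns)
      width_m = 0.10             # fracture width w
      length_m = 0.50            # flow path length L
      viscosity = 1.0e-3         # water, Pa*s
      dp = 1.0e4                 # pressure drop over the path, Pa

      q_m3s = (width_m * aperture_m**3 / (12.0 * viscosity)) * dp / length_m
      print(f"Cubic-law flow rate: {q_m3s:.3e} m^3/s ({q_m3s*1e6:.2f} mL/s)")

      # Roughness and matrix inflow (as in the study) shift Q away from this
      # ideal value; the fitted friction factor captures that departure.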

    16. Mathematical and computational modeling of the diffraction problems by discrete singularities method

      SciTech Connect (OSTI)

      Nesvit, K. V.

      2014-11-12

      The main objective of this study is to reduce the boundary-value problems of scattering and diffraction of waves on plane-parallel structures to singular or hypersingular integral equations. For these cases we use a method of parametric representations of the integral and pseudo-differential operators. Numerical results of the model scattering problems on periodic and boundary gratings, and also on gratings above a flat screen reflector, are presented in this paper.

    17. The Use Of Computational Human Performance Modeling As Task Analysis Tool

      SciTech Connect (OSTI)

      Jacques Hugo; David Gertman

      2012-07-01

      During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
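
      A minimal stand-in for the task-network idea (synthetic durations and error rates, nothing from the ATR study): sample each step's duration and slip probability many times to estimate total task time and the chance of at least one error.

      import numpy as np

      rng = np.random.default_rng(4)

      # (mean duration s, std dev s, per-step error probability); synthetic.
      steps = [(120, 20, 0.001),   # walk to canal
               (300, 60, 0.005),   # position handling tool
               (600, 120, 0.010),  # move fuel element
               (240, 40, 0.002)]   # inspection

      n = 100_000
      total = np.zeros(n)
      any_error = np.zeros(n, dtype=bool)
      for mean, sd, p_err in steps:
          total += rng.normal(mean, sd, n).clip(min=0.0)
          any_error |= rng.random(n) < p_err

      print(f"Mean task time: {total.mean()/60:.1f} min "
            f"(95th pct {np.percentile(total, 95)/60:.1f} min)")
      print(f"P(at least one error): {any_error.mean():.4f}")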

    18. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    19. Computational Intelligence Based Data Fusion Algorithm for Dynamic sEMG and Skeletal Muscle Force Modelling

      SciTech Connect (OSTI)

      Chandrasekhar Potluri; Madhavi Anugolu; Marco P. Schoen; D. Subbaram Naidu

      2013-08-01

      In this work, an array of three surface electromyography (sEMG) sensors is used to acquire muscle extension and contraction signals from 18 healthy test subjects. The skeletal muscle force is estimated using the acquired sEMG signals and a nonlinear Wiener-Hammerstein model, relating the two signals in a dynamic fashion. The model is obtained using a system identification (SI) algorithm. The force models obtained for each sensor are fused using a proposed fuzzy logic concept, with the intent to improve the force estimation accuracy and resilience to sensor failure or misalignment. For the fuzzy logic inference system, the sEMG entropy, the relative error, and the correlation of the force signals are considered for defining the membership functions. The proposed fusion algorithm yields an average of 92.49% correlation between the actual force and the overall estimated force output. In addition, the proposed fusion-based approach is implemented on a test platform. Experiments indicate an improvement in finger/hand force estimation.
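
      The fusion step can be pictured as a convex, quality-weighted combination of the per-sensor force estimates. The sketch below substitutes a simple normalized-score weighting for the paper's fuzzy inference system (which reasons over sEMG entropy, relative error, and correlation); the signals and scores are synthetic.

        import numpy as np

        def fuse(forces, scores):
            """Convex, quality-weighted fusion of per-sensor force estimates.

            forces : (n_sensors, n_samples) array of per-sensor model outputs
            scores : per-sensor quality metric (stand-in for the paper's fuzzy
                     inference over entropy, relative error, and correlation)
            """
            w = np.asarray(scores, dtype=float)
            w = w / w.sum()                  # normalize to convex weights
            return w @ np.asarray(forces)    # weighted sum over sensors

        # Synthetic example: three sensors tracking the same force profile.
        t = np.linspace(0.0, 2.0, 400)
        truth = np.sin(2 * np.pi * t) ** 2
        rng = np.random.default_rng(0)
        forces = np.stack([truth + rng.normal(0, s, t.size)
                           for s in (0.05, 0.10, 0.30)])
        scores = [np.corrcoef(f, truth)[0, 1] for f in forces]
        fused = fuse(forces, scores)
        print("fused correlation:", round(np.corrcoef(fused, truth)[0, 1], 3))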

    20. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

      SciTech Connect (OSTI)

      Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

      2012-08-15

      The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human-generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected, and automated data analysis techniques ruled them out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz{sup -1} on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI, including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
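
      The quoted upper limit follows from the inverse-square law: an isotropic emitter of spectral power P at distance d produces a flux density S = P/(4πd²), so a non-detection at flux limit S implies P < 4πd²S. A minimal sketch, assuming a distance of about 6.3 pc for Gliese 581 and an illustrative 5σ flux-density limit of roughly 1.5 mJy (back-calculated for this example, not taken from the paper):

        import math

        PC_TO_M = 3.0857e16            # meters per parsec
        JY = 1e-26                     # W m^-2 Hz^-1 per jansky

        d = 6.3 * PC_TO_M              # Gliese 581 distance, ~6.3 pc (~20 ly)
        s_lim = 1.5e-3 * JY            # assumed 5-sigma flux limit (~1.5 mJy)

        p_lim = 4.0 * math.pi * d**2 * s_lim   # isotropic spectral power limit
        print(f"EIRP upper limit ~ {p_lim / 1e6:.1f} MW/Hz")   # ~7 MW/Hz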

    1. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

      SciTech Connect (OSTI)

      Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

      2013-05-15

      This report provides the results of the comprehensive annulus visual inspection for tanks 241-AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

    2. Technical Baseline Summary Description for the Tank Farm Contractor

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

    3. Computation & Simulation > Theory & Computation > Research >...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Extensive combinatorial results and ongoing basic...

    4. A Computational Model of the Mark-IV Electrorefiner: Phase I -- Fuel Basket/Salt Interface

      SciTech Connect (OSTI)

      Robert Hoover; Supathorn Phongikaroon; Shelly Li; Michael Simpson; Tae-Sic Yoo

      2009-09-01

      Spent driver fuel from the Experimental Breeder Reactor-II (EBR-II) is currently being treated in the Mk-IV electrorefiner (ER) in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory. The modeling approach to be presented here has been developed to help understand the effect of different parameters on the dynamics of this system. The first phase of this new modeling approach focuses on the fuel basket/salt interface involving the transport of various species found in the driver fuels (e.g., uranium and zirconium). This approach reduces the number of guessed parameters to one: the exchange current density (i0). U3+ and Zr4+ were the only species used for the current study. The results reveal that most of the total cell current is used for the oxidation of uranium, with little being used by zirconium. The dimensionless approach shows that the total potential is a strong function of i0 and a weak function of wt% of uranium in the salt system for initiation processes.
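
      The role of the single guessed parameter, the exchange current density i0, is easiest to see in a Butler-Volmer rate expression of the kind such electrorefiner interface models typically use. The sketch below is generic, and every parameter value in it is illustrative rather than taken from the Mk-IV model.

        import math

        F = 96485.0    # Faraday constant, C/mol
        R = 8.314      # gas constant, J/(mol K)

        def butler_volmer(i0, eta, T=773.0, alpha_a=0.5, alpha_c=0.5, n=3):
            """Net current density (A/m^2) at overpotential eta (V) for an
            n-electron couple such as U3+/U; i0 is the exchange current
            density, the single adjustable parameter noted in the abstract."""
            f = n * F / (R * T)
            return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

        # Illustrative only: current rises steeply with overpotential, scaled by i0.
        for eta in (0.01, 0.05, 0.10):
            print(f"eta = {eta:.2f} V -> i = {butler_volmer(100.0, eta):.1f} A/m^2")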

    5. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process...

      Energy Savers [EERE]

      This EVMS Training Snippet, sponsored by the Office of Project...

    6. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Research areas of the Mathematical and Computational Epidemiology (MCEpi) group at Los Alamos National Laboratory include agent-based modeling, mixing patterns and social networks, mathematical epidemiology, social Internet research, and uncertainty quantification, including quantifying model uncertainty in agent-based simulations for...

    7. Revised SRC-I project baseline. Volume 1

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky, has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to designing processes and to planning and formulating policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised, accurate, and meaningful schedules and, hence, of escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

    8. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G.; Helland, Åslaug; et al

      2014-02-01

      Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

    9. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

      SciTech Connect (OSTI)

      Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G.; Helland, Åslaug; Rye, Inga H.; Borresen-Dale, Anne-Lise; Maruyama, Reo; van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L.; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

      2014-02-01

      Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

    10. Computational Study of Bond Dissociation Enthalpies for Substituted β-O-4 Lignin Model Compounds

      SciTech Connect (OSTI)

      Younker, Jarod M; Beste, Ariana; Buchanan III, A C

      2011-01-01

      The biopolymer lignin is a potential source of valuable chemicals. Phenethyl phenyl ether (PPE) is representative of the dominant β-O-4 ether linkage. Density functional theory (DFT) is used to calculate the Boltzmann-weighted carbon-oxygen and carbon-carbon bond dissociation enthalpies (BDEs) of substituted PPE. These values are important in order to understand lignin decomposition. Exclusion of all conformers that have distributions of less than 5% at 298 K impacts the BDE by less than 1 kcal mol⁻¹. We find that aliphatic hydroxyl/methylhydroxyl substituents introduce only small changes to the BDEs (0-3 kcal mol⁻¹). Substitution on the phenyl ring at the ortho position substantially lowers the C-O BDE, except in combination with the hydroxyl/methylhydroxyl substituents, where the effect of methoxy substitution is reduced by hydrogen bonding. Hydrogen bonding between the aliphatic substituents and the ether oxygen in the PPE derivatives has a significant influence on the BDE. CCSD(T)-calculated BDEs and hydrogen bond strengths of ortho-substituted anisoles, when compared with M06-2X values, confirm that the latter method is sufficient to describe the molecules studied and provide an important benchmark for lignin model compounds.
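
      The Boltzmann weighting over conformers is the standard w_i = exp(-E_i/kT) / Σ_j exp(-E_j/kT); the sketch below also applies the 5% population cutoff discussed above. The conformer energies and BDEs are made-up illustrations, not values from the paper.

        import math

        KB = 0.0019872   # Boltzmann/gas constant, kcal/(mol K)

        def boltzmann_weighted_bde(rel_energies, bdes, T=298.0, cutoff=0.05):
            """Boltzmann-average conformer BDEs, dropping conformers whose
            population falls below `cutoff` (the study reports this changes
            the result by < 1 kcal/mol)."""
            w = [math.exp(-e / (KB * T)) for e in rel_energies]
            z = sum(w)
            kept = [(wi / z, b) for wi, b in zip(w, bdes) if wi / z >= cutoff]
            z_kept = sum(p for p, _ in kept)          # renormalize survivors
            return sum(p * b for p, b in kept) / z_kept

        # Made-up conformer data: relative energies and per-conformer BDEs (kcal/mol).
        print(boltzmann_weighted_bde([0.0, 0.4, 1.1, 2.5],
                                     [68.2, 68.9, 67.5, 70.0]))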

    11. Risk and Vulnerability Assessment Using Cybernomic Computational Models: Tailored for Industrial Control Systems

      SciTech Connect (OSTI)

      Abercrombie, Robert K; Sheldon, Federick T.; Schlicher, Bob G

      2015-01-01

      There are many influencing economic factors to weigh from the defender-practitioner stakeholder point of view that involve cost combined with development/deployment models. Some examples include the cost of the countermeasures themselves, the cost of training, and the cost of maintenance. Meanwhile, we must better anticipate the total cost of a compromise. The return on investment in countermeasures is essentially the avoided impact costs (i.e., the costs from violating availability, integrity, and confidentiality/privacy requirements). The natural question arises of which main risks must be mitigated/controlled and monitored when deciding where to focus security investments. To answer this question, we have investigated the costs/benefits to the attacker and defender to better estimate risk exposure. In doing so, it is important to develop a sound basis for estimating the factors that drive risk exposure, such as the likelihood that a threat will emerge and whether it will be thwarted. This impact assessment framework can provide key information for ranking cybersecurity threats and managing risk.
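
      In its simplest form, the framework reduces to an expected-loss (risk exposure) calculation: sum, over threats, the probability the threat emerges times the probability it is not thwarted times its impact cost, then rank threats by that product. A minimal sketch with entirely hypothetical numbers:

        # Hypothetical threat table for an industrial control system:
        # (name, P(threat emerges)/yr, P(countermeasure fails), impact cost $)
        THREATS = [
            ("hmi_compromise",   0.30, 0.10, 2_000_000),
            ("plc_logic_tamper", 0.05, 0.20, 8_000_000),
            ("historian_exfil",  0.50, 0.05,   500_000),
        ]

        def expected_loss(p_emerge, p_fail, impact):
            return p_emerge * p_fail * impact

        # Rank threats by expected loss to decide where to focus investment.
        for name, pe, pf, cost in sorted(
                THREATS, key=lambda r: expected_loss(*r[1:]), reverse=True):
            print(f"{name:18s} ${expected_loss(pe, pf, cost):>10,.0f}/yr")
        total = sum(expected_loss(*r[1:]) for r in THREATS)
        print(f"total risk exposure ${total:,.0f}/yr")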

    12. Estimating baseline risks from biouptake and food ingestion at a contaminated site

      SciTech Connect (OSTI)

      MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

      1993-11-01

      Biouptake of contaminants and subsequent human exposure via food ingestion represent a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area does not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.
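
      The modeled pathway has the standard intake-and-slope-factor structure used in baseline risk assessments: a biouptake (transfer) factor maps soil concentration to tissue concentration, an ingestion rate and exposure duration give a chronic daily intake, and a toxicity slope factor converts intake to excess risk. A sketch with entirely hypothetical inputs:

        def ingestion_risk(c_soil, transfer_factor, intake_rate, days_per_year,
                           years, body_weight, slope_factor):
            """Screening-level excess lifetime cancer risk from ingesting fish
            or game tied to site contamination (all inputs hypothetical).

            c_soil          : soil/sediment concentration (mg/kg)
            transfer_factor : tissue concentration per unit soil concentration
            intake_rate     : tissue ingested (kg/day)
            slope_factor    : cancer slope factor ((mg/kg-day)^-1)
            """
            c_tissue = c_soil * transfer_factor            # mg/kg in tissue
            lifetime_days = 70.0 * 365.0                   # averaging time
            dose = (c_tissue * intake_rate * days_per_year * years
                    / (body_weight * lifetime_days))       # mg/(kg day)
            return dose * slope_factor

        risk = ingestion_risk(c_soil=5.0, transfer_factor=0.1, intake_rate=0.03,
                              days_per_year=350, years=30, body_weight=70.0,
                              slope_factor=0.01)
        print(f"excess lifetime cancer risk ~ {risk:.1e}")   # ~9e-7 here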

    13. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

      SciTech Connect (OSTI)

      Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

      2012-01-01

      Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

    14. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNA® code, is...

    15. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

      SciTech Connect (OSTI)

      1995-04-01

      The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi, was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

    16. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

      SciTech Connect (OSTI)

      Wollenberg, H.A.; Smith, A.R.

      1991-10-01

      A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12--16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.

    17. Computational fluid dynamics modeling of two-phase flow in a BWR fuel assembly. Final CRADA Report.

      SciTech Connect (OSTI)

      Tentner, A.; Nuclear Engineering Division

      2009-10-13

      A direct numerical simulation capability for two-phase flows with heat transfer in complex geometries can considerably reduce the hardware development cycle, facilitate the optimization, and reduce the costs of testing of various industrial facilities, such as nuclear power plants, steam generators, steam condensers, liquid cooling systems, heat exchangers, distillers, and boilers. Specifically, the phenomena occurring in a two-phase coolant flow in a BWR (Boiling Water Reactor) fuel assembly include coolant phase changes and multiple flow regimes which directly influence the coolant interaction with the fuel assembly and, ultimately, the reactor performance. Traditionally, the best tools for analyzing two-phase flow phenomena inside the BWR fuel assembly have been sub-channel codes. However, the resolution of these codes is too coarse for analyzing the detailed intra-assembly flow patterns, such as flow around a spacer element. Advanced CFD (Computational Fluid Dynamics) codes provide a potential for detailed 3D simulations of coolant flow inside a fuel assembly, including flow around a spacer element, using more fundamental physical models of flow regimes and phase interactions than sub-channel codes. Such models can extend the code applicability to a wider range of situations, which is highly important for increasing efficiency and preventing accidents.

    18. Computational fluid dynamics modeling of chemical looping combustion process with calcium sulphate oxygen carrier - article no. A19

      SciTech Connect (OSTI)

      Baosheng Jin; Rui Xiao; Zhongyi Deng; Qilei Song

      2009-07-01

      Concentrating CO{sub 2} from combustion processes in efficient and energy-saving ways is a first and very important step toward its sequestration. Chemical looping combustion (CLC) could easily achieve this goal. A chemical-looping combustion system consists of a fuel reactor and an air reactor. Two reactors in the form of interconnected fluidized beds are used in the process: (1) a fuel reactor where the oxygen carrier is reduced by reaction with the fuel, and (2) an air reactor where the reduced oxygen carrier from the fuel reactor is oxidized with air. The outlet gas from the fuel reactor consists of CO{sub 2} and H{sub 2}O, while the outlet gas stream from the air reactor contains only N{sub 2} and some unused O{sub 2}. The water in the combustion products can be easily removed by condensation, and pure carbon dioxide is obtained without any loss of energy for separation. Until now, there has been little literature about mathematical modeling of chemical-looping combustion using the computational fluid dynamics (CFD) approach. In this work, the reaction kinetic model of the fuel reactor (CaSO{sub 4} + H{sub 2}) is developed by means of the commercial code FLUENT, and the effects of the partial pressure of H{sub 2} (concentration of H{sub 2}) on chemical looping combustion performance are also studied. The results show that increasing the concentration of H{sub 2} enhances the CLC performance.
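
      The reported sensitivity to H{sub 2} partial pressure can be illustrated with a generic power-law gas-solid rate expression; the form and all constants below are illustrative stand-ins, not the kinetic model the authors implemented in FLUENT.

        import math

        R = 8.314   # J/(mol K)

        def reduction_rate(p_h2, T=1173.0, k0=1.0e4, Ea=1.2e5, n=1.0):
            """Generic power-law rate for oxygen-carrier reduction by H2,
            r = k0 exp(-Ea/RT) (p_H2/p_ref)^n; every constant is illustrative."""
            return k0 * math.exp(-Ea / (R * T)) * (p_h2 / 1.0e5) ** n

        for x_h2 in (0.2, 0.5, 1.0):               # H2 mole fraction at ~1 atm
            r = reduction_rate(x_h2 * 1.013e5)
            print(f"H2 fraction {x_h2:.1f} -> rate {r:.3e} (arbitrary units)")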

    19. Modeling

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Widespread Hydrogen Fueling Infrastructure Is the Goal of H2FIRST Project...

    20. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

      SciTech Connect (OSTI)

      Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

      2013-11-15

      Purpose: Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Using Bayesian inference, the decomposition fractions and observation variance are estimated by the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. This transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. Accurate spectrum information about the source-detector system is also necessary; when dealing with experimental data, the spectrum can be predicted by a Monte Carlo simulator. For the materials between water and bone, less than 5% separation errors are observed on the estimated decomposition fractions. Conclusions: The proposed approach is a statistical reconstruction approach based on a nonlinear forward model that accounts for the full beam polychromaticity and is applied directly to the projections without taking the negative log. Compared to the approaches based on linear forward models and the BHA correction approaches, it has advantages in noise robustness and reconstruction accuracy.
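
      The heart of the method, a MAP estimate under a Gaussian noise model applied to raw (non-log) polychromatic projections, can be sketched on a toy one-pixel, two-material, two-spectrum problem. The spectra, attenuation values, and the Nelder-Mead optimizer below are simplified stand-ins for the paper's setup and its monotone conjugate gradient algorithm.

        import numpy as np
        from scipy.optimize import minimize

        # Toy dual-energy setup: two spectra x two energy bins x two materials.
        specs = np.array([[0.7, 0.3],    # low-kVp spectrum (assumed, normalized)
                          [0.2, 0.8]])   # high-kVp spectrum (assumed)
        mu = np.array([[0.25, 0.18],     # water attenuation per energy bin (assumed)
                       [0.90, 0.45]])    # bone attenuation per energy bin (assumed)

        def forward(a):
            """Nonlinear polychromatic model for thicknesses a = (water, bone);
            note that no negative log is taken, as in the paper."""
            return specs @ np.exp(-(a @ mu))

        a_true = np.array([10.0, 2.0])
        rng = np.random.default_rng(1)
        y = forward(a_true) + rng.normal(0.0, 1e-4, size=2)   # noisy projections

        def neg_log_posterior(a, lam=1e-10):
            r = y - forward(a)
            return 0.5 * float(r @ r) + lam * float(a @ a)    # Gaussian fit + weak prior

        res = minimize(neg_log_posterior, x0=np.array([5.0, 5.0]),
                       method="Nelder-Mead",
                       options={"xatol": 1e-8, "fatol": 1e-14, "maxiter": 5000})
        print("recovered (water, bone) thickness:", np.round(res.x, 2))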

    1. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    2. Electromagnetic analysis of forces and torques on the baseline and enhanced ITER shield modules due to plasma disruption.

      SciTech Connect (OSTI)

      Kotulski, Joseph Daniel; Coats, Rebecca Sue; Pasik, Michael Francis; Ulrickson, Michael Andrew

      2009-08-01

      An electromagnetic analysis is performed on the ITER shield modules under different plasma-disruption scenarios using the OPERA-3d software. The models considered include the baseline design as provided by the International Organization and an enhanced design that includes the more realistic geometrical features of a shield module. The modeling procedure is explained, electromagnetic torques are presented, and results of the modeling are discussed.

    3. EVMS Training Snippet: 4.6 Baseline Control Methods | Department of Energy

      Office of Environmental Management (EM)

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), discusses baseline revisions and the different baseline control vehicles used in DOE.

    4. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    5. DEVELOPMENT OF A COMPUTATIONAL MULTIPHASE FLOW MODEL FOR FISCHER TROPSCH SYNTHESIS IN A SLURRY BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Donna Post Guillen; Tami Grimmett; Anastasia M. Gribik; Steven P. Antal

      2010-09-01

      The Hybrid Energy Systems Testing (HYTEST) Laboratory is being established at the Idaho National Laboratory to develop and test hybrid energy systems with the principal objective to safeguard U.S. Energy Security by reducing dependence on foreign petroleum. A central component of the HYTEST is the slurry bubble column reactor (SBCR) in which the gas-to-liquid reactions will be performed to synthesize transportation fuels using the Fischer Tropsch (FT) process. SBCRs are cylindrical vessels in which a gaseous reactant (for example, synthesis gas, or syngas) is sparged into a slurry of liquid reaction products and finely dispersed catalyst particles. The catalyst particles are suspended in the slurry by the rising gas bubbles and serve to promote the chemical reaction that converts syngas to a spectrum of longer chain hydrocarbon products, which can be upgraded to gasoline, diesel or jet fuel. These SBCRs operate in the churn-turbulent flow regime which is characterized by complex hydrodynamics, coupled with reacting flow chemistry and heat transfer, that affect reactor performance. The purpose of this work is to develop a computational multiphase fluid dynamic (CMFD) model to aid in understanding the physico-chemical processes occurring in the SBCR. Our team is developing a robust methodology to couple reaction kinetics and mass transfer into a four-field model (consisting of the bulk liquid, small bubbles, large bubbles and solid catalyst particles) that includes twelve species: (1) CO reactant, (2) H2 reactant, (3) hydrocarbon product, and (4) H2O product in small bubbles, large bubbles, and the bulk fluid. Properties of the hydrocarbon product were specified by vapor liquid equilibrium calculations. The absorption and kinetic models, specifically changes in species concentrations, have been incorporated into the mass continuity equation. The reaction rate is determined based on the macrokinetic model for a cobalt catalyst developed by Yates and Satterfield [1]. The model includes heat generation due to the exothermic chemical reaction, as well as heat removal from a constant temperature heat exchanger. Results of the CMFD simulations (similar to those shown in Figure 1) will be presented.
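
      The interphase absorption step described above is conventionally written as a film-model flux driven by the difference between the equilibrium (Henry's-law) concentration and the bulk liquid concentration, N = kLa (C* - C_L). A sketch with illustrative constants (not the INL model's values):

        def absorption_rate(kLa, c_star, c_bulk):
            """Volumetric absorption rate (mol m^-3 s^-1) from an overall
            mass transfer coefficient kLa (1/s) and the driving force between
            the equilibrium and bulk liquid concentrations (mol m^-3)."""
            return kLa * (c_star - c_bulk)

        # Illustrative CO absorption values (not from the INL model):
        H_CO = 8.0e-4                 # pseudo-Henry constant, mol m^-3 Pa^-1 (assumed)
        p_CO = 1.5e6                  # CO partial pressure in large bubbles, Pa
        c_star = H_CO * p_CO          # equilibrium liquid concentration
        print(absorption_rate(kLa=0.2, c_star=c_star, c_bulk=0.6 * c_star))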

    6. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    7. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    8. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    9. NREL: MIDC/SRRL Baseline Measurement System (39.74 N, 105.18 W, 1829 m, GMT-7)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Solar Radiation Research Laboratory Baseline Measurement System

    10. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code. This visualization shows the temperature of ocean currents using a green and ...

    11. KINETIC MODELING OF A FISCHER-TROPSCH REACTION OVER A COBALT CATALYST IN A SLURRY BUBBLE COLUMN REACTOR FOR INCORPORATION INTO A COMPUTATIONAL MULTIPHASE FLUID DYNAMICS MODEL

      SciTech Connect (OSTI)

      Anastasia Gribik; Donna Guillen, PhD; Daniel Ginosar, PhD

      2008-09-01

      Currently multi-tubular fixed bed reactors, fluidized bed reactors, and slurry bubble column reactors (SBCRs) are used in commercial Fischer Tropsch (FT) synthesis. There are a number of advantages of the SBCR compared to fixed and fluidized bed reactors. The main advantage of the SBCR is that temperature control and heat recovery are more easily achieved. The SBCR is a multiphase chemical reactor where a synthesis gas, comprised mainly of H2 and CO, is bubbled through a liquid hydrocarbon wax containing solid catalyst particles to produce specialty chemicals, lubricants, or fuels. The FT synthesis reaction is the polymerization of methylene groups [-(CH2)-] forming mainly linear alkanes and alkenes, ranging from methane to high molecular weight waxes. The Idaho National Laboratory is developing a computational multiphase fluid dynamics (CMFD) model of the FT process in a SBCR. This paper discusses the incorporation of absorption and reaction kinetics into the current hydrodynamic model. A phased approach for incorporation of the reaction kinetics into a CMFD model is presented here. Initially, a simple kinetic model is coupled to the hydrodynamic model, with increasing levels of complexity added in stages. The first phase of the model includes incorporation of the absorption of gas species from both large and small bubbles into the bulk liquid phase. The driving force for the gas across the gas-liquid interface into the bulk liquid is dependent upon the interfacial gas concentration in both small and large bubbles. However, because it is difficult to measure the concentration at the gas-liquid interface, coefficients for convective mass transfer have been developed for the overall driving force between the bulk concentrations in the gas and liquid phases. It is assumed that there are no temperature effects from mass transfer of the gas phases to the bulk liquid phase, since there are only small amounts of dissolved gas in the liquid phase. The product from the incorporation of absorption is the steady-state concentration profile of the absorbed gas species in the bulk liquid phase. The second phase of the model incorporates a simplified macrokinetic model into the mass balance equation in the CMFD code. Initially, the model assumes that the catalyst particles are sufficiently small such that external and internal mass and heat transfer are not rate limiting. The model is developed utilizing the macrokinetic rate expression developed by Yates and Satterfield (1991). Initially, the model assumes that the only species formed other than water in the FT reaction is C27H56. Changes in the moles of the reacting species and the resulting temperatures of the catalyst and fluid phases are solved simultaneously. The macrokinetic model is solved in conjunction with the species transport equations in a separate module which is incorporated into the CMFD code.
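
      The Yates and Satterfield (1991) macrokinetic expression referenced above has a Langmuir-Hinshelwood form, -R_CO = a P_CO P_H2 / (1 + b P_CO)². The sketch below shows the shape of the rate law; the parameter values are placeholders, not the published fitted constants.

        def co_rate(p_co, p_h2, a=8.9e-9, b=2.2e-6):
            """Yates-Satterfield form for FT synthesis on cobalt:
            -R_CO = a * P_CO * P_H2 / (1 + b * P_CO)**2, pressures in Pa.
            a and b are temperature-dependent fitted parameters; the defaults
            here are placeholders, not the published values."""
            return a * p_co * p_h2 / (1.0 + b * p_co) ** 2

        # Syngas at 2:1 H2:CO and 20 bar total pressure (illustrative):
        p_co, p_h2 = 6.7e5, 1.33e6
        print(f"-R_CO = {co_rate(p_co, p_h2):.3e} (arbitrary units)")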

    12. Computational Analysis of the Pyrolysis of β-O-4 Lignin Model Compounds: Concerted vs. Homolytic Fragmentation

      SciTech Connect (OSTI)

      Clark, J. M.; Robichaud, D. J.; Nimlos, M. R.

      2012-01-01

      The thermochemical conversion of biomass to liquid transportation fuels is a very attractive technology for expanding the utilization of carbon neutral processes and reducing dependency on fossil fuel resources. As with all such emerging technologies, biomass conversion through gasification or pyrolysis has a number of obstacles that need to be overcome to make these processes cost competitive with the refining of fossil fuels. Our current efforts have focused on the investigation of the thermochemistry of the linkages between lignin units using ab initio calculations on dimeric lignin model compounds. All calculations were carried out using M06-2X density functional theory with the 6-311++G(d,p) basis set. The M06-2X method has been shown to be consistent with the CBS-QB3 method while being significantly less computationally expensive. To date we have only completed the study of the β-O-4 compounds. The theoretical calculations performed in the study indicate that concerted elimination pathways dominate over bond homolysis reactions under typical pyrolysis conditions. However, this does not mean that concerted elimination will be the dominant loss process for lignin. Bimolecular radical chemistry could very well dwarf the unimolecular pathways investigated in this study. These concerted pathways tend to form stable, reasonably non-reactive products that would be more suited to producing a fungible bio-oil for liquid transportation fuels.
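
      The concerted-versus-homolysis competition comes down to comparing Arrhenius rate constants, k = A exp(-Ea/RT): homolysis typically has the higher barrier but also the looser transition state (higher A), so it gains ground as temperature rises. The barriers and A-factors below are illustrative, not the computed M06-2X values.

        import math

        R = 8.314   # J/(mol K)

        def k_arrhenius(A, Ea_kcal, T):
            """Rate constant k = A exp(-Ea/RT), with Ea in kcal/mol."""
            return A * math.exp(-Ea_kcal * 4184.0 / (R * T))

        # Illustrative channel parameters (not the paper's computed values):
        concerted = dict(A=1e13, Ea_kcal=45.0)   # tight TS: lower A, lower Ea
        homolysis = dict(A=1e16, Ea_kcal=69.0)   # loose TS: higher A, higher Ea

        for T in (700.0, 900.0, 1100.0):         # typical pyrolysis range, K
            ratio = k_arrhenius(T=T, **concerted) / k_arrhenius(T=T, **homolysis)
            print(f"T = {T:.0f} K: k_concerted / k_homolysis = {ratio:.2e}")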

    13. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

      1994-03-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

    14. Baseline review of the U.S. LHC Accelerator project

      SciTech Connect (OSTI)

      1998-02-01

      The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23--26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. US scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the US LHC Accelerator project team. Cost estimates for each subsystem of the US LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of US deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as the baseline scope with the firm expectation that additional scope will be restored to the baseline as the project moves forward. The Committee supports the FY 1998 work plan and scope of deliverables but strongly recommends the reevaluation of costs and schedules with the goal of producing a plan for restoring the US deliverables to CERN. This plan should provide precise dates when scope decisions must be made.

    15. Integrated Baseline System (IBS) Version 1.03: Utilities guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

      1993-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

    16. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

      SciTech Connect (OSTI)

      Katya Le Blanc; Johanna Oxstrand

      2012-04-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a largely unexplored application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

    17. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

      SciTech Connect (OSTI)

      Saffer, Shelley I.

      2014-12-01

      This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science: the development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievement of those goals and the resulting research made possible by this award.

    18. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy, and fulfilling the potential of emerging computing systems and architectures beyond today's tools and techniques.

    19. Moving baseline for evaluation of advanced coal-extraction systems

      SciTech Connect (OSTI)

      Bickerton, C.R.; Westerfield, M.D.

      1981-04-15

      This document reports results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000. Systems used in this study were selected from contemporary coal mining technology and from conservative conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness. To be more beneficial to the program, the effort should be extended to other seam thicknesses. This document is one of a series that describes systems-level requirements for advanced underground coal mining equipment. Five areas of performance are discussed: production cost, miner safety, miner health, environmental impact, and recovery efficiency. The projections for cost and production capability comprise a so-called moving baseline which will be used to assess compliance with the systems requirement for production cost. Separate projections were prepared for room and pillar, longwall, and shortwall technology, all operating under comparable sets of mining conditions. This work is part of an effort to define and develop innovative coal extraction systems suitable for the significant resources remaining in the year 2000.

    20. Baseline Glass Development for Combined Fission Products Waste Streams

      SciTech Connect (OSTI)

      Crum, Jarrod V.; Billings, Amanda Y.; Lang, Jesse B.; Marra, James C.; Rodriguez, Carmen P.; Ryan, Joseph V.; Vienna, John D.

      2009-06-29

      Borosilicate glass was selected as the baseline technology for immobilization of the Cs/Sr/Ba/Rb (Cs), lanthanide (Ln) and transition metal fission product (TM) waste streams as part of a cost benefit analysis study.[1] Vitrification of the combined waste streams has several advantages: minimization of the number of waste forms, a proven technology, and similarity to waste forms currently accepted for repository disposal. A joint study was undertaken by Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to develop acceptable glasses for the combined Cs + Ln + TM waste streams (Option 1) and the combined Cs + Ln waste streams (Option 2) generated by the AFCI UREX+ set of processes. This study aims to develop baseline glasses for both combined waste stream options and to identify key waste components and their impact on waste loading. The elemental compositions of the four-corners study were used along with the available separations data to determine the effect of burnup, decay, and separations variability on estimated waste stream compositions.[2-5] Two different components/scenarios were identified that could limit waste loading of the combined Cs + Ln + TM waste streams, whereas the combined Cs + Ln waste stream has no single component that is perceived to limit waste loading. The combined Cs + Ln waste stream in a glass waste form will most likely be limited by heat due to the high activity of Cs and Sr isotopes.

    1. Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016.

      SciTech Connect (OSTI)

      Reno, Matthew J.; Riehm, Andrew Charles; Hoekstra, Robert John; Munoz-Ramirez, Karina; Stamp, Jason Edwin; Phillips, Laurence R.; Adams, Brian M.; Russo, Thomas V.; Oldfield, Ron A.; McLendon, William Clarence, III; Nelson, Jeffrey Scott; Hansen, Clifford W.; Richardson, Bryan T.; Stein, Joshua S.; Schoenwald, David Alan; Wolfenbarger, Paul R.

      2011-02-01

      Design and operation of the electric power grid (EPG) relies heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power flow models are used when analyses involving thousands of nodes are required, due to the computational demands of simulating large numbers of nodes. The level of complexity of the future EPG will dramatically increase due to large-scale deployment of variable renewable generation, active load and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to these three tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte-Carlo simulations of cyber attacks; and (3) development of models to predict variability of solar resources at locations where little or no ground-based measurements are available.
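
      Task (2) is at bottom a Monte Carlo estimate: sample random attack scenarios (sets of compromised components), evaluate a grid-security metric for each, and average. The sketch below uses a made-up closed-form "load shed" metric where a real study would run a power-flow or cascading-outage model.

        import random

        N_COMPONENTS = 100          # substations in a notional grid model

        def load_shed(n_compromised):
            """Stand-in metric: fraction of load lost given n compromised
            components. A real study would evaluate a power-flow or
            cascading-outage model here."""
            return min(1.0, 0.012 * n_compromised ** 1.3)

        def monte_carlo(p=0.02, trials=50000, threshold=0.10):
            total = exceed = 0.0
            for _ in range(trials):
                n = sum(random.random() < p for _ in range(N_COMPONENTS))
                shed = load_shed(n)
                total += shed
                exceed += shed > threshold
            print(f"mean load shed {total/trials:.3f}; "
                  f"P(shed > {threshold:.0%}) = {exceed/trials:.4f}")

        monte_carlo()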

    2. BLENDING STUDY FOR SRR SALT DISPOSITION INTEGRATION: TANK 50H SCALE-MODELING AND COMPUTER-MODELING FOR BLENDING PUMP DESIGN, PHASE 2

      SciTech Connect (OSTI)

      Leishear, R.; Poirier, M.; Fowley, M.

      2011-05-26

      The Salt Disposition Integration (SDI) portfolio of projects provides the infrastructure within existing Liquid Waste facilities to support the startup and long term operation of the Salt Waste Processing Facility (SWPF). Within SDI, the Blend and Feed Project will equip existing waste tanks in the Tank Farms to serve as Blend Tanks where 300,000-800,000 gallons of salt solution will be blended in 1.3 million gallon tanks and qualified for use as feedstock for SWPF. Blending requires the miscible salt solutions from potentially multiple source tanks per batch to be well mixed without disturbing settled sludge solids that may be present in a Blend Tank. Disturbing solids may be problematic both from a feed quality perspective as well as from a process safety perspective where hydrogen release from the sludge is a potential flammability concern. To develop the necessary technical basis for the design and operation of blending equipment, Savannah River National Laboratory (SRNL) completed scaled blending and transfer pump tests and computational fluid dynamics (CFD) modeling. A 94 inch diameter pilot-scale blending tank, including tank internals such as the blending pump, transfer pump, removable cooling coils, and center column, was used in this research. The test tank represents a 1/10.85 scaled version of an 85 foot diameter, Type IIIA, nuclear waste tank that may be typical of Blend Tanks used in SDI. Specifically, Tank 50 was selected as the tank to be modeled per the SRR Project Engineering Manager. SRNL blending tests investigated various fixed position, non-rotating, dual nozzle pump designs, including a blending pump model provided by the blend pump vendor, Curtiss Wright (CW). Primary research goals were to assess blending times and to evaluate incipient sludge disturbance for waste tanks. Incipient sludge disturbance was defined by SRR and SRNL as minor blending of settled sludge from the tank bottom into suspension due to blending pump operation, where the sludge level was shown to remain constant. To experimentally model the sludge layer, a very thin, pourable sludge simulant was conservatively used for all testing. To experimentally model the liquid supernate layer above the sludge in waste tanks, two salt solution simulants were used, which provided a bounding range of supernate properties. One solution was water (H{sub 2}O + NaOH), and the other was an inhibited, more viscous salt solution. The research performed and data obtained significantly advance the understanding of fluid mechanics, mixing theory and CFD modeling for nuclear waste tanks by benchmarking CFD results to actual experimental data. This research significantly bridges the gap between previous CFD models and actual field experiences in real waste tanks. A finding of the 2009 DOE Slurry Retrieval, Pipeline Transport and Plugging, and Mixing Workshop was that CFD models were inadequate to assess blending processes in nuclear waste tanks. One recommendation from that Workshop was that a validation, or benchmarking, program be performed for CFD modeling versus experiment. This research provided experimental data to validate and correct CFD models as they apply to mixing and blending in nuclear waste tanks. Extensive SDI research was a significant step toward benchmarking and applying CFD modeling.
This research showed that CFD models not only agreed with experiment, but also demonstrated that the large variance in actual experimental data accounts for misunderstood discrepancies between CFD models and experiments. Having documented this finding, SRNL was able to provide correction factors to be used with CFD models to statistically bound full-scale CFD results. Through the use of pilot-scale tests performed for both types of pumps and available engineering literature, SRNL demonstrated how to effectively apply CFD results to salt batch mixing in full-scale waste tanks. In other words, CFD models were in error prior to the development of the experimental correction factors determined during this research, which provided a technique to use CFD models for salt batch mixing and transfer pump operations. This major scientific advance in mixing technology resulted in multi-million dollar cost savings to SRR. New techniques were developed for both experiment and analysis to complete this research. Supporting this success, research findings are summarized in the Conclusions section of this report, and technical recommendations for design and operation are included in that section as well.

    3. Computational Fluid Dynamics Modeling of the Bonneville Project: Tailrace Spill Patterns for Low Flows and Corner Collector Smolt Egress

      SciTech Connect (OSTI)

      Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.; Perkins, William A.

      2010-12-01

      In 2003, an extension of the existing ice and trash sluiceway was added at Bonneville Powerhouse 2 (B2). This extension started at the existing corner collector for the ice and trash sluiceway adjacent to Bonneville Powerhouse 2, and the new sluiceway was extended to the downstream end of Cascade Island. The sluiceway was designed to improve juvenile salmon survival by bypassing turbine passage at B2 and placing these smolt in downstream-flowing water, minimizing their exposure to fish and avian predators. In this study, a previously developed computational fluid dynamics model was modified and used to characterize tailrace hydraulics and sluiceway egress conditions for low total river flows and low levels of spillway flow. STAR-CD v4.10 was used for seven scenarios of low total river flow and low spill discharges. The simulation results were examined for tailrace hydraulics at 5 ft below the tailwater elevation, and streamlines originating in and adjacent to the corner collector outfall were used to compare egress pathways. These streamlines indicated that for all higher spill percentage cases (25% and greater), streamlines from the corner collector did not approach the shoreline at the downstream end of Bradford Island. For the cases with much larger spill percentages, the streamlines from the corner collector were mid-channel or closer to the Washington shore as they moved downstream. Although at 25% spill at 75 kcfs total river the total spill volume was sufficient to "cushion" the flow from the corner collector from the Bradford Island shore, areas of recirculation were modeled in the spillway tailrace. However, at the lowest flows and spill percentages, the streamlines from the B2 corner collector pass very close to the Bradford Island shore. In addition, the very low flow velocities and large areas of recirculation greatly increase the potential predator exposure of spillway-passed smolt. If there is concern about egress issues for smolt passing through the spillway, the spill pattern and volume need to be revisited.

    4. DOE Announces Webinars on the Mid-Atlantic Baseline Study, EPA...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      used in the Mid-Atlantic Baseline Studies (MABS), a project intended to help inform the ... The Mid-Atlantic Baseline Studies (MABS) Project was carried out by the Biodiversity ...

    5. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      EVMS Training Snippet: 3.1A Integrated Master Schedule (IMS) Initial Baseline Review EVMS Training Snippet: 4.6 Baseline Control Methods EVMS Training Snippet: 4.9 High-level EVM...

    6. Computational design and analysis of flatback airfoil wind tunnel experiment.

      SciTech Connect (OSTI)

      Mayda, Edward A.; van Dam, C.P.; Chao, David D.; Berg, Dale E.

      2008-03-01

      A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.
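      As a quick sanity check on the test condition quoted above, the sketch below computes the tunnel speed implied by a chord Reynolds number of one million; the 0.457 m chord and sea-level air properties are assumptions for illustration, not values from the report.

```python
# Tunnel speed V satisfying Re = rho * V * c / mu (sea-level air assumed).
RHO = 1.225    # air density, kg/m^3
MU = 1.81e-5   # dynamic viscosity, kg/(m*s)

def tunnel_speed_m_s(re_chord: float, chord_m: float) -> float:
    return re_chord * MU / (RHO * chord_m)

# Hypothetical 0.457 m (18 in) chord model: roughly 32 m/s is required.
print(f"{tunnel_speed_m_s(1.0e6, 0.457):.1f} m/s")
```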

    7. Waste Assessment Baseline for the IPOC Second Floor, West Wing

      SciTech Connect (OSTI)

      McCord, Samuel A

      2015-04-01

      Following a building-wide waste assessment in September 2014 and a subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015 to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately 6 months.

    8. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

      SciTech Connect (OSTI)

      Catechis, Christopher Spyros

      2013-10-01

      Sandia National Laboratories, Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area comprises several parcels of land within SNL/CA, County of Alameda, California. The survey area is located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to a specific property.

    9. Compute nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      A more detailed hierarchical map of the topology of a compute node is available.

    10. Computing Information

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Here you can find information relating to: obtaining the right computer accounts; using NIC terminals; using BooNE's Computing Resources, including choosing your desktop....

    11. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      undergraduate summer institute http:isti.lanl.gov (Educational Prog) 2016 Computer System, Cluster, and Networking Summer Institute Purpose The Computer System,...

    12. TRIDAC host computer functional specification

      SciTech Connect (OSTI)

      Hilbert, S.M.; Hunter, S.L.

      1983-08-23

      The purpose of this document is to outline the baseline functional requirements for the Triton Data Acquisition and Control (TRIDAC) Host Computer Subsystem. The requirements presented in this document are based upon systems that currently support both the SIS and the Uranium Separator Technology Groups in the AVLIS Program at the Lawrence Livermore National Laboratory and upon the specific demands associated with the extended safe operation of the SIS Triton Facility.

    13. Integrated Baseline System (IBS). Version 1.03, System Management Guide

      SciTech Connect (OSTI)

      Williams, J.R.; Bailey, S.; Bower, J.C.

      1993-01-01

      This IBS System Management Guide explains how to install or upgrade the Integrated Baseline System (IBS) software package. The IBS is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This guide includes detailed instructions for installing the IBS software package on a Digital Equipment Corporation (DEC) VAX computer from the IBS distribution tapes. The installation instructions include procedures for both first-time installations and upgrades to existing IBS installations. To ensure that the system manager has the background necessary for successful installation of the IBS package, this guide also includes information on IBS computer requirements, software organization, and the generation of IBS distribution tapes. When special utility programs are used during IBS installation and setups, this guide refers you to the IBS Utilities Guide for specific instructions. This guide also refers you to the IBS Data Management Guide for detailed descriptions of some IBS data files and structures. Any special requirements for installation are not documented here but should be included in a set of installation notes that come with the distribution tapes.

    14. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

      SciTech Connect (OSTI)

      Singleton, Kristin M.

      2015-01-07

      The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under the Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical data from 13 shallow zone (0 to 15 ft below ground surface) sampling locations were used to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum contaminant levels was used to determine vadose zone contamination impacts on groundwater. Waste Management Area C is the first of the Hanford tank farms to begin the closure planning process. The current baseline risk assessment will provide valuable information for making corrective action and closure decisions for WMA C, and will also support the planning for future tank farm soil investigations and baseline risk assessments.

    15. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection, and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but addresses only emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since, in 1998, 1999, 2000, 2002, and 2004. A significant format change was made in the 2004 update of the BNA in that it was 'zero based': starting with the requirement documents, the 2004 BNA evaluated the requirements and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents: (1) the needs assessment and (2) the compliance assessment. The needs assessment will be referred to as the BNA, and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for the separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection provided and the remote location and low population density of some of the facilities. As such, the needs assessment contains equivalencies to the applicable requirements. The compliance assessment contains no such equivalencies; it simply assesses the existing emergency response resources against the requirements of the BNA and can be updated as compliance changes, independent of the BNA update schedule. There are numerous NFPA codes and standards and other requirements and guidance documents that address the subject of emergency response. These requirements documents are not always well coordinated and may contain duplicative or conflicting requirements, or even coverage gaps. Left unaddressed, this regulatory situation results in frequent interpretation of requirements documents. Different interpretations can then lead to inconsistent implementation. This BNA addresses this situation by compiling applicable requirements from all identified sources (see Section 5) and analyzing them collectively to address conflict and overlap as applicable to the hazards presented by the LLNL and Sandia/CA sites (see Section 7). The BNA also generates requirements when needed to fill any identified gaps in regulatory coverage.
Finally, the BNA produces a customized, simple set of requirements appropriate for the DOE protection goals (such as those defined in DOE O 420.1B), the hazard level, the population density, the topography, and the site layout at LLNL and Sandia/CA; this set will be used as the baseline requirements set - the 'baseline needs' - for emergency response at LLNL and Sandia/CA. A template approach is utilized to accomplish this evaluation for each of the nine topical areas that comprise the baseline needs for emergency response. The basis for the conclusions reached in determining the baseline needs for each of the topical areas is presented in Sections 7.1 through 7.9. This BNA identifies only mandatory requirements and establishes the minimum performance criteria. The minimum performance criteria may not be the level of performance desired by Lawrence Livermore National Laboratory or Sandia/CA.

    16. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-01-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines (HATTs). First, an HATT blade was designed using the blade element momentum method in conjunction with a genetic optimization algorithm. Several unstructured computational grids were generated using this blade geometry and steady CFD simulations were used to perform a grid resolution study. Transient simulations were then performed to determine the effect of time-dependent flow phenomena and the size of the computational timestep on the numerical solution. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an underprediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.
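      Grid-resolution studies of this kind are often quantified with Richardson extrapolation and a grid convergence index (GCI); the sketch below shows that standard calculation under the assumption of three systematically refined grids, with hypothetical force-coefficient values (the paper does not necessarily report a GCI).

```python
import math

def observed_order_and_gci(f_coarse: float, f_medium: float, f_fine: float,
                           r: float = 2.0, fs: float = 1.25) -> tuple[float, float]:
    """Observed order of accuracy p and fine-grid GCI for a constant
    grid refinement ratio r and safety factor fs (standard textbook form)."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    eps = abs((f_medium - f_fine) / f_fine)  # relative change on the finest pair
    return p, fs * eps / (r**p - 1.0)

# Hypothetical thrust coefficients on coarse/medium/fine grids:
p, gci = observed_order_and_gci(0.780, 0.805, 0.812)
print(f"observed order ~ {p:.2f}, fine-grid GCI ~ {100 * gci:.2f}%")
```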

    17. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2011-06-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at INL. Additionally, INL has a desire to see how its emissions compare with similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and distribution losses) is the largest contributor to INL's GHG inventory, with over 50% of the CO2e emissions; (2) Other sources with high emissions were stationary combustion (facility fuels), waste disposal (including fugitive emissions from the onsite landfill and contracted disposal), mobile combustion (fleet fuels), employee commuting, and business air travel; and (3) Sources with low emissions were wastewater treatment (onsite and contracted), fugitive emissions from refrigerants, and business ground travel (in personal and rental vehicles). This report details the methods behind quantifying INL's GHG inventory and discusses lessons learned about better practices by which information important to GHG tracking can be recorded. It is important to note that because this report differentiates between those portions of INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.
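      The Scope 1/2/3 roll-up behind an inventory like this reduces to a weighted sum over gases and sources; the sketch below uses the commonly cited AR4 100-year global warming potentials and hypothetical emission quantities, not INL's actual data.

```python
# CO2-equivalent roll-up across scopes (AR4 100-year GWPs; quantities hypothetical).
GWP_100YR = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2e_mt(emissions_mt: dict) -> float:
    """Metric tons CO2-equivalent for a {gas: metric tons} mapping."""
    return sum(GWP_100YR[gas] * mt for gas, mt in emissions_mt.items())

scopes = {
    "scope1_combustion_and_fugitives": {"CO2": 20_000.0, "CH4": 2.0, "N2O": 0.5},
    "scope2_purchased_electricity":    {"CO2": 60_000.0},
    "scope3_commuting_travel_waste":   {"CO2": 15_000.0, "CH4": 1.0},
}

print(f"{sum(co2e_mt(e) for e in scopes.values()):,.0f} MT CO2e")
```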

    18. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2010-09-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at the INL. Additionally, the INL has a desire to see how its emissions compare with similar institutions, including other DOE-sponsored national laboratories. Executive Order 13514 requires that federally sponsored agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL’s FY08 GHG inventory was calculated according to methodologies identified in Federal recommendations and an as-yet-unpublished Technical and Support Document (TSD), using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL’s organizational boundaries but are a consequence of INL’s activities). This inventory found that INL generated a total of 114,256 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL’s baseline GHG inventory: (1) electricity is the largest contributor to INL’s GHG inventory, with over 50% of the net anthropogenic CO2e emissions; (2) other sources with high emissions were stationary combustion, fugitive emissions from the onsite landfill, mobile combustion (fleet fuels), and the employee commute; and (3) sources with low emissions were contracted waste disposal, wastewater treatment (onsite and contracted), and fugitive emissions from refrigerants. This report details the methods behind quantifying INL’s GHG inventory and discusses lessons learned about better practices by which information important to GHG tracking can be recorded. It is important to stress that the methodology behind this inventory followed guidelines that have not yet been formally adopted. Thus, some modification of the conclusions may be necessary as additional guidance is received. Further, because this report differentiates between those portions of the INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.

    19. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-10-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an underprediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.

    20. Seizure control with thermal energy? Modeling of heat diffusivity in brain tissue and computer-based design of a prototype mini-cooler.

      SciTech Connect (OSTI)

      Osorio, I.; Chang, F.-C.; Gopalsami, N.; Nuclear Engineering Division; Univ. of Kansas

      2009-10-01

      Automated seizure blockage is a top priority in epileptology. Lowering nervous tissue temperature below a certain level suppresses abnormal neuronal activity, an approach with certain advantages over electrical stimulation, the preferred investigational therapy for pharmacoresistant seizures. A computer model was developed to identify an efficient probe design and parameters that would allow cooling of brain tissue by no less than 21 °C in at most 30 s. The Pennes equation and the computer code ABAQUS were used to investigate the spatiotemporal behavior of heat diffusivity in brain tissue. Arrays of distributed probes deliver sufficient thermal energy to decrease brain tissue temperature, inhomogeneously, from 37 to 20 °C in 30 s and from 37 to 15 °C in 60 s. Tissue disruption/loss caused by insertion of this probe is considerably less than that caused by ablative surgery. This model may be applied to the design and development of cooling devices for seizure control.
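      The abstract invokes the Pennes bioheat equation; its standard textbook form (shown here as commonly written, not necessarily the exact variant implemented in the study's ABAQUS model) is:

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \rho_b c_b \omega_b \left( T_a - T \right)
  + q_m
```

      Here \rho, c, and k are the tissue density, specific heat, and thermal conductivity; \rho_b, c_b, and \omega_b the corresponding blood properties and perfusion rate; T_a the arterial blood temperature; and q_m the metabolic heat source. A cooling probe enters the problem as a boundary condition or a local heat sink.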

    1. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents

      SciTech Connect (OSTI)

      Zhang, Jing Ghate, Sujata V.; Yoon, Sora C.; Lo, Joseph Y.; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

      2014-09-15

      Purpose: Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict the likelihood of the trainee missing each mass. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. Methods: The authors’ algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. Results: The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI, 0.564-0.650). This value was statistically significantly different from 0.5 (p < 0.0001). For the 7 residents only, the AUC performance of the models was 0.590 (95% CI, 0.537-0.642) and was also significantly higher than 0.5 (p = 0.0009). Therefore, generally the authors’ models were able to predict which masses were detected and which were missed better than chance. Conclusions: The authors proposed an algorithm that was able to predict which masses will be detected and which will be missed by each individual trainee. This confirms the existence of error-making patterns in the detection of masses among radiology trainees. Furthermore, the proposed methodology will allow for the optimized selection of difficult cases for the trainees in an automatic and efficient manner.
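      A schematic of the per-trainee pipeline described above, with synthetic features and labels standing in for the paper's 43 computer-extracted features and reader data; the random-forest classifier is an illustrative assumption, not necessarily the authors' choice.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 43))                        # 43 image features per mass (synthetic)
y = (X[:, 0] + rng.normal(size=200) > 0).astype(int)  # 1 = this trainee missed the mass

# One model per trainee, trained on that trainee's previous reading data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

p_miss = model.predict_proba(X_te)[:, 1]  # predicted likelihood of a miss per mass
print("AUC:", round(roc_auc_score(y_te, p_miss), 3))
```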

    2. Development Of Regional Climate Mitigation Baseline For A Dominant Agro-Ecological Zone Of Karnataka, India

      SciTech Connect (OSTI)

      Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

      2007-06-01

      Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility, and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.
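      A trivial worked check of the comparability claim (the per-hectare stocks below are hypothetical; only the reported ratio of about 1.02 is from the paper):

```python
def baseline_ratio(project_tC_per_ha: float, regional_tC_per_ha: float) -> float:
    """Ratio of project-specific to regional baseline carbon stock."""
    return project_tC_per_ha / regional_tC_per_ha

# Hypothetical wasteland stocks of 10.2 vs 10.0 tC/ha reproduce a ~1.02 ratio.
print(round(baseline_ratio(10.2, 10.0), 2))
```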

    3. Scientific Opportunities with the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Adams, C.; et al.,

      2013-07-28

      In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained 'near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well-developed, world-class experiment whose relevance, importance, and probability of unearthing critical and exciting physics have increased with time.

    4. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters, while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    5. Ultra-high pressure water jet: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky™ pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems.

    6. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide tipped bits. The bits are designed to remove concrete in 3/8 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended. Because of the outdoor environment where the testing demonstration took place, results may be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    7. LTC vacuum blasting machine (metal) baseline report: Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    8. LTC vacuum blasting machine (concrete): Baseline report: Greenbook (Chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties, dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because of the outdoor environment where the testing demonstration took place. This may cause the results to be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    9. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

      SciTech Connect (OSTI)

      Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A. E-mail: russo@strw.leidenuniv.nl

      2013-12-01

      Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Data System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in 'astronomical development' with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.

    10. Computing and Computational Sciences Directorate - Computer Science...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      AWARD Winners: Jess Gehin; Jackie Isaacs; Douglas Kothe; Debbie McCoy; Bonnie Nestor; John Turner; Gilbert Weigand Organization(s): Nuclear Technology Program; Computing and...

    11. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

      SciTech Connect (OSTI)

      Joseph, Earl C.; Conway, Steve; Dekate, Chirag

      2013-09-30

      This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.
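      In the spirit of the first macroeconomic model, a toy ROI calculation (all figures hypothetical):

```python
def hpc_roi(investment_usd: float, revenue_gain_usd: float,
            cost_savings_usd: float) -> float:
    """Dollars of revenue gain plus cost savings returned per dollar of HPC investment."""
    return (revenue_gain_usd + cost_savings_usd) / investment_usd

# A hypothetical $1M HPC investment linked to $40M revenue and $3M savings:
print(f"${hpc_roi(1e6, 40e6, 3e6):.0f} returned per HPC dollar")
```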

    12. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Sciences and Engineering The Computational Sciences and Engineering Division (CSED) is ORNL's premier source of basic and applied research in the field of data sciences and knowledge discovery. CSED's science agenda is focused on research and development related to knowledge discovery enabled by the explosive growth in the availability, size, and variability of dynamic and disparate data sources. This science agenda encompasses data sciences as well as advanced modeling and

    13. India's baseline plan for nuclear energy self-sufficiency.

      SciTech Connect (OSTI)

      Bucher, R .G.; Nuclear Engineering Division

      2009-01-01

      India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits, and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to ensure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second United Nations Conference on the Peaceful Uses of Atomic Energy in 1958. The paper described a three stage plan for a sustainable nuclear energy program consistent with India's limited uranium but abundant thorium natural resources. In the first stage, natural uranium would be used to fuel graphite or heavy water moderated reactors. Plutonium extracted from the spent fuel of these thermal reactors would drive fast reactors in the second stage that would contain thorium blankets for breeding uranium-233 (U-233). In the final stage, this U-233 would fuel thorium burning reactors that would breed and fission U-233 in situ. This three stage blueprint still reigns as the core of India's civil nuclear power program. India's progress in the development of nuclear power, however, has been impacted by its isolation from the international nuclear community for its development of nuclear weapons and consequent refusal to sign the Nuclear Nonproliferation Treaty (NPT). Initially, India was engaged in numerous cooperative research programs with foreign countries; for example, under the 'Atoms for Peace' program, India acquired the Cirus reactor, a 40 MWt research reactor from Canada moderated with heavy water from the United States. India was also actively engaged in negotiations for the NPT. But, on May 18, 1974, India conducted a 'peaceful nuclear explosion' at Pokharan using plutonium produced by the Cirus reactor, abruptly ending the era of international collaboration.
India then refused to sign the NPT, which it viewed as discriminatory since it would be required to join as a non-nuclear weapons state. As a result of India's actions, the Nuclear Suppliers Group (NSG) was created in 1975 to establish guidelines 'to apply to nuclear transfers for peaceful purposes to help ensure that such transfers would not be diverted to unsafeguarded nuclear fuel cycle or nuclear explosive activities.' These nuclear export controls have forced India to be largely self-sufficient in all nuclear-related technologies.

    14. Long-Term Stewardship Baseline Report and Transition Guidance

      SciTech Connect (OSTI)

      Kristofferson, Keith

      2001-11-01

      Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy’s (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE’s long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complexwide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE’s cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these documents were the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

    15. Climate Models: Rob Jacob | Argonne National Laboratory

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      science & technology Environmental modeling tools Programs Mathematics, computing, & computer science Modeling, simulation, & visualization Rob Jacob, Computational Climate...

    16. Modeling

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Diffuse interface methods in the ALE-AMR code with application in modeling NDCX-II experiments. Wangyi Liu (LBNL, USA); John Barnard, Alex Friedman, Nathan Masters, Aaron Fisher, Alice Koniges, and David Eder (LLNL, USA). This work was part of the Petascale Initiative in Computational Science at NERSC, supported by the Director, Office of Science, Advanced Scientific Computing Research, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. This work was performed

    17. Computational Tools to Assess Turbine Biological Performance

      SciTech Connect (OSTI)

      Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

      2014-07-24

      Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
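      A condensed sketch of a BioPA-style indicator: combine a CFD-derived exposure distribution for one injury mechanism with a laboratory dose-response curve to estimate injury probability. The bins, exposure fractions, and logistic curve below are hypothetical stand-ins, not PRD data.

```python
import numpy as np

dose_bins = np.array([10.0, 30.0, 50.0, 70.0])   # dose of an injury mechanism (hypothetical)
p_exposure = np.array([0.70, 0.20, 0.08, 0.02])  # fraction of fish exposed in each bin

def injury_frequency(dose: np.ndarray) -> np.ndarray:
    """Hypothetical logistic dose-response fitted to laboratory data."""
    return 1.0 / (1.0 + np.exp(-(dose - 60.0) / 8.0))

# Expected injury probability: sum over bins of exposure fraction x dose-response.
p_injury = float(np.dot(p_exposure, injury_frequency(dose_bins)))
print(f"estimated injury probability: {p_injury:.3f}")
```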

    18. Methods and computer executable instructions for rapidly calculating simulated particle transport through geometrically modeled treatment volumes having uniform volume elements for use in radiotherapy

      DOE Patents [OSTI]

      Frandsen, Michael W.; Wessol, Daniel E.; Wheeler, Floyd J.

      2001-01-16

      Methods and computer executable instructions are disclosed for ultimately developing a dosimetry plan for a treatment volume targeted for irradiation during cancer therapy. The dosimetry plan is available in "real-time" which especially enhances clinical use for in vivo applications. The real-time is achieved because of the novel geometric model constructed for the planned treatment volume which, in turn, allows for rapid calculations to be performed for simulated movements of particles along particle tracks therethrough. The particles are exemplary representations of neutrons emanating from a neutron source during BNCT. In a preferred embodiment, a medical image having a plurality of pixels of information representative of a treatment volume is obtained. The pixels are: (i) converted into a plurality of substantially uniform volume elements having substantially the same shape and volume of the pixels; and (ii) arranged into a geometric model of the treatment volume. An anatomical material associated with each uniform volume element is defined and stored. Thereafter, a movement of a particle along a particle track is defined through the geometric model along a primary direction of movement that begins in a starting element of the uniform volume elements and traverses to a next element of the uniform volume elements. The particle movement along the particle track is effectuated in integer based increments along the primary direction of movement until a position of intersection occurs that represents a condition where the anatomical material of the next element is substantially different from the anatomical material of the starting element. This position of intersection is then useful for indicating whether a neutron has been captured, scattered or exited from the geometric model. From this intersection, a distribution of radiation doses can be computed for use in the cancer therapy. The foregoing represents an advance in computational time by multiple orders of magnitude.
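      An illustrative voxel-stepping routine in the spirit of the abstract (a simplified sketch, not the patented algorithm): march a particle track in integer increments through a uniform-voxel model until the anatomical material changes.

```python
import numpy as np

# Hypothetical 64^3 model: each voxel stores an integer anatomical-material ID.
model = np.zeros((64, 64, 64), dtype=np.int8)
model[:, :, 32:] = 1  # a material boundary halfway along z

def first_material_change(model, start, step):
    """Step voxel-by-voxel from `start` along integer `step`; return the first
    position whose material differs from the starting voxel, or None on exit."""
    pos = np.array(start, dtype=int)
    material0 = model[tuple(pos)]
    while True:
        pos = pos + np.array(step, dtype=int)
        if not all(0 <= p < n for p, n in zip(pos, model.shape)):
            return None  # particle exited the geometric model
        if model[tuple(pos)] != material0:
            return tuple(int(p) for p in pos)  # position of intersection

print(first_material_change(model, start=(10, 10, 0), step=(0, 0, 1)))  # (10, 10, 32)
```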

    19. Analytical and computational study of the ideal full two-fluid plasma model and asymptotic approximations for Hall-magnetohydrodynamics

      SciTech Connect (OSTI)

      Srinivasan, B.; Shumlak, U.

      2011-09-15

      The 5-moment two-fluid plasma model uses Euler equations to describe the ion and electron fluids and Maxwell's equations to describe the electric and magnetic fields. Two-fluid physics becomes significant when the characteristic spatial scales are on the order of the ion skin depth and characteristic time scales are on the order of the ion cyclotron period. The full two-fluid plasma model has disparate characteristic speeds ranging from the ion and electron speeds of sound to the speed of light. Two asymptotic approximations are applied to the full two-fluid plasma to arrive at the Hall-MHD model, namely negligible electron inertia and infinite speed of light. The full two-fluid plasma model and the Hall-MHD model are studied for applications to an electromagnetic plasma shock, geospace environmental modeling (GEM challenge) magnetic reconnection, an axisymmetric Z-pinch, and an axisymmetric field reversed configuration (FRC).
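      For context, the two asymptotic limits named above act on the generalized Ohm's law of the two-fluid model, written schematically in its standard form (not quoted from the paper):

```latex
\mathbf{E} + \mathbf{v}\times\mathbf{B}
  = \eta\,\mathbf{J}
  + \frac{1}{n e}\left(\mathbf{J}\times\mathbf{B} - \nabla p_e\right)
  + \frac{m_e}{n e^{2}}\frac{\partial \mathbf{J}}{\partial t}
```

      Dropping the electron-inertia term (the last term) and letting the speed of light go to infinity, so that the displacement current vanishes and \mathbf{J} = \nabla\times\mathbf{B}/\mu_0, recovers the Hall-MHD model; for the ideal case considered in the paper the resistive term \eta\,\mathbf{J} is zero.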

    20. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2006-11-01

      Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10-teraflop/s computers with 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100-teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing, and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together researchers in these areas and to provide a focal point for the development of computational expertise at the Laboratory. These efforts will connect to and support the Department of Energy's long-range plans to provide Leadership-class computing to researchers throughout the Nation. Recruitment for six new positions at Stony Brook to strengthen its computational science programs is underway. We expect some of these to be held jointly with BNL.

    1. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCS-7 Applied Computer Science: innovative co-design of applications, algorithms, and architectures in order to enable scientific simulations at extreme scale. Group Leader: Linn Collins; Deputy Group Leader (Acting): Bryan Lally. Figure: results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code, visualizing the temperature of ocean currents on a green and blue color scale.

    2. 2008 CHP Baseline Assessment and Action Plan for the California Market | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This 2008 report provides an updated baseline assessment and action plan for combined heat and power (CHP) in California and identifies hurdles that prevent the expanded use of CHP systems. The report was prepared by the Pacific Region CHP Application Center (RAC). (PDF: chp_california_2008.pdf)

    3. Optimization of the CLIC Baseline Collimation System (Conference) | SciTech Connect

      Office of Scientific and Technical Information (OSTI)

      Title: Optimization of the CLIC Baseline Collimation System. Important efforts have recently been dedicated to the improvement of the design of the baseline collimation system of the Compact Linear Collider (CLIC). Different aspects of the design have been optimized: the transverse collimation depths have been recalculated in order to reduce the collimator wakefield effects while maintaining a…

    4. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), covers Over Target Baseline and Over Target Schedule implementations.

    5. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations | Department of Energy

      Office of Environmental Management (EM)

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), covers Over Target Baseline and Over Target Schedule implementations. Link to Video Presentation | Prior Snippet (3.3) | Next Snippet (4.2) | Return to Index.

    6. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process | Department of Energy

      Office of Environmental Management (EM)

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), covers the Integrated Baseline Review (IBR) process. Link to Video Presentation | Prior Snippet (4.1) | Next Snippet (4.3) | Return to Index.

    7. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Quad-core AMD Opteron processor. Compute Node Configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB DDR3 800 MHz memory per node. Peak Gflop rate: 9.2 Gflops/core, 36.8 Gflops/node, 352 Tflops for the entire machine. Each core has its own L1 and L2 caches, of 64 KB and 512 KB respectively; a 2 MB L3 cache is shared among the 4 cores. Compute Node Software: by default the compute nodes run a restricted low-overhead…

    8. LEDSGP/Transportation Toolkit/Key Actions/Create a Baseline ...

      Open Energy Info (EERE)

      LEDSGP Transportation Toolkit: Key Actions for Low-Emission Development in Transportation (Create a Baseline)…

    9. Microsoft PowerPoint - Snippet 4.6 Baseline Control Methods 20140723 [Compatibility Mode]

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      …baseline revisions and the different baseline control vehicles used in DOE. It is a given that during the life of the project, the performance measurement baseline (PMB) will change for a variety of reasons. These changes may affect the technical scope, schedule, and/or budget of the project. Revisions to the baseline may be necessary to maintain a valid work plan. In accordance with the DOE Acquisition Guide Chapter 43.3 (March 2013), certain changes cannot be made to the PMB, such as…

    10. U.S. Department of Energy Performance Baseline Guide - DOE Directives...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      5A, U.S. Department of Energy Performance Baseline Guide, by Brian Kong. Functional areas: Program Management, Project Management, Work Processes. This guide identifies key PB…

    11. Mechanism and computational model for Lyman-α-radiation generation by high-intensity-laser four-wave mixing in Kr-Ar gas

      SciTech Connect (OSTI)

      Louchev, Oleg A.; Saito, Norihito; Wada, Satoshi; Bakule, Pavel; Yokoyama, Koji; Ishida, Katsuhiko; Iwasaki, Masahiko

      2011-09-15

      We present a theoretical model combined with a computational study of a laser four-wave mixing process under optical discharge in which the non-steady-state four-wave amplitude equations are integrated with the kinetic equations of initial optical discharge and electron avalanche ionization in Kr-Ar gas. The model is validated by earlier experimental data showing strong inhibition of the generation of pulsed, tunable Lyman-α (Ly-α) radiation when using sum-difference frequency mixing of 212.6 nm and tunable infrared radiation (820-850 nm). The rigorous computational approach to the problem reveals the possibility and mechanism of strong auto-oscillations in sum-difference resonant Ly-α generation due to the combined effect of (i) 212.6-nm (2+1)-photon ionization producing initial electrons, followed by (ii) the electron avalanche dominated by 843-nm radiation, and (iii) the final breakdown of the phase matching condition. The model shows that the final efficiency of Ly-α radiation generation can achieve a value of ≈5×10⁻⁴, which is restricted by the total combined absorption of the fundamental and generated radiation.

    12. Inter-comparison of Computer Codes for TRISO-based Fuel Micro-Modeling and Performance Assessment

      SciTech Connect (OSTI)

      Brian Boer; Chang Keun Jo; Wen Wu; Abderrafi M. Ougouag; Donald McEachren; Francesco Venneri

      2010-10-01

      The Next Generation Nuclear Plant (NGNP), the Deep Burn Pebble Bed Reactor (DB-PBR) and the Deep Burn Prismatic Block Reactor (DB-PMR) are all based on fuels that use TRISO particles as their fundamental constituent. The TRISO particle properties include very high durability in radiation environments, hence the designs' reliance on the TRISO particle to form the principal barrier to radioactive materials release. This durability forms the basis for the selection of this fuel type for applications such as Deep Burn (DB), which require exposures up to four times those expected for light water reactors. It follows that the study and prediction of the durability of TRISO particles must be carried out as part of the safety and overall performance characterization of all the designs mentioned above. Such evaluations have been carried out independently by the performers of the DB project using independently developed codes. These codes, PASTA, PISA and COPA, incorporate models for stress analysis on the various layers of the TRISO particle (and, for some of them, of the intervening matrix material), models for fission product release, migration, and accumulation within the SiC layer of the TRISO particle, models for free oxygen and CO formation and migration to the same location, models for the temperature field within the various layers of the TRISO particle, and models for the prediction of failure rates. All these models may be either internal or external to the code. This large number of models, the possibility of different constitutive data and model formulations, and the variety of possible solution techniques make it highly unlikely that the codes would give identical results when modeling identical situations. The purpose of this paper is to present the results of an inter-comparison between the codes and to identify areas of agreement and areas that need reconciliation. The inter-comparison has been carried out by the cooperating institutions using a set of pre-defined TRISO conditions (burnup levels, temperature or power levels, etc.), and the outcome is tabulated in the full-length paper. The areas of agreement are pointed out and the areas that require further modeling or reconciliation are shown. In general, the codes agree to within less than one order of magnitude in their predictions of TRISO failure rates.

    13. General Merchandise 2009 TSD Miami Low Plug Load Baseline | Open...

      Open Energy Info (EERE)

      90.1-2004; model year 2009. IDF file: http://apps1.eere.energy.gov/buildings/energyplus/models/Miami2009TSDGeneralMerchLPLbaseline.idf; XML file: http://apps1.eere.energy.gov/…

    14. General Merchandise 2009 TSD Chicago Low Plug Load Baseline ...

      Open Energy Info (EERE)

      90.1-2004; model year 2009. IDF file: http://apps1.eere.energy.gov/buildings/energyplus/models/Miami2009TSDGeneralMerchLPLBaseline.idf; XML file: http://apps1.eere.energy.gov/…

    15. General Merchandise 2009 TSD Chicago High Plug Load Baseline...

      Open Energy Info (EERE)

      90.1-2004; model year 2009. IDF file: http://apps1.eere.energy.gov/buildings/energyplus/models/Chicago2009TSDGeneralMerchHPLBaseline.idf; XML file: http://apps1.eere.energy.gov/…

    16. General Merchandise 2009 TSD Miami High Plug Load Baseline |...

      Open Energy Info (EERE)

      90.1-2004; model year 2009. IDF file: http://apps1.eere.energy.gov/buildings/energyplus/models/Miami2009TSDGeneralMerchHPLbaseline.idf; XML file: http://apps1.eere.energy.gov/…

    17. Reformulated Gasoline Complex Model

      Gasoline and Diesel Fuel Update (EIA)

      Refiners Switch to Reformulated Gasoline Complex Model. Contents: Summary; Introduction (Table 1, Comparison of Simple Model and Complex Model RFG Per Gallon Requirements); Statutory, Individual Refinery, and Compliance Baselines (Table 2, Statutory Baseline Fuel Compositions); Simple Model; Complex Model (Table 3, Complex Model Variables); Endnotes. Related EIA Short-Term Forecast Analysis Products: RFG Simple and Complex Model Spreadsheets; Areas Participating in the Reformulated Gasoline…

    18. Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CiteSeer: Department of Energy-provided open-access science research citations in chemistry, physics, materials, engineering, and computer science. IEEE Xplore: full text…

    19. Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      All JLF participants must fully comply with all LLNL computer security regulations and procedures. A laptop entering or leaving B-174 for the sole use of a US citizen, so configured and requiring no IP address, need not be registered for use in the JLF. By September 2009, it is expected that computers for use by Foreign National Investigators will have no special provisions. Notify maricle1@llnl.gov of all other computers entering, leaving, or being moved within B-174. Use…

    20. Computed solid phases limiting the concentration of dissolved constituents in basalt aquifers of the Columbia Plateau in eastern Washington. Geochemical modeling and nuclide/rock/groundwater interaction studies

      SciTech Connect (OSTI)

      Deutsch, W.J.; Jenne, E.A.; Krupka, K.M.

      1982-08-01

      A speciation-solubility geochemical model, WATEQ2, was used to analyze geographically diverse ground-water samples from the aquifers of the Columbia Plateau basalts in eastern Washington. The ground-water samples compute to be at equilibrium with calcite, which provides both a solubility control for dissolved calcium and a pH buffer. Amorphic ferric hydroxide, Fe(OH)₃(A), is at saturation or modestly oversaturated in the few water samples with measured redox potentials. Most of the ground-water samples compute to be at equilibrium with amorphic silica (glass) and wairakite, a zeolite, and are saturated to oversaturated with respect to allophane, an amorphic aluminosilicate. The water samples are saturated to undersaturated with halloysite, a clay, and are variably oversaturated with regard to other secondary clay minerals. Equilibrium between the ground water and amorphic silica presumably results from the dissolution of the glassy matrix of the basalt. The oversaturation of the clay minerals other than halloysite indicates that their rate of formation lags the dissolution rate of the basaltic glass. The modeling results indicate that metastable amorphic solids limit the concentration of dissolved silicon and suggest the same possibility for aluminum and iron, and that the processes of dissolution of basaltic glass and formation of metastable secondary minerals are continuing even though the basalts are of Miocene age. The computed solubility relations are found to agree with the known assemblages of alteration minerals in the basalt fractures and vesicles. Because the chemical reactivity of the bedrock will influence the transport of solutes in ground water, the observed solubility equilibria are important factors with regard to chemical-retention processes associated with the possible migration of nuclear waste stored in the earth's crust.
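
      The quantity behind statements like "at equilibrium with calcite" or "oversaturated with respect to allophane" is the saturation index, SI = log10(IAP/Ksp), which speciation-solubility codes such as WATEQ2 report for each mineral. A minimal sketch of that classification (the numerical IAP and Ksp values below are hypothetical, not from the study):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp): ~0 at equilibrium, > 0 oversaturated,
    < 0 undersaturated with respect to the mineral."""
    return math.log10(iap / ksp)

def classify(si, tol=0.1):
    if abs(si) <= tol:
        return "at equilibrium"
    return "oversaturated" if si > 0 else "undersaturated"

# Hypothetical ion-activity and solubility products for calcite.
si = saturation_index(iap=4.2e-9, ksp=3.3e-9)
print(f"SI = {si:+.2f} ({classify(si)})")
```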

    1. Atomic-Scale Design of Iron Fischer-Tropsch Catalysts; A Combined Computational Chemistry, Experimental, and Microkinetic Modeling Approach

      SciTech Connect (OSTI)

      Manos Mavrikakis; James Dumesic; Rahul Nabar; Calvin Bartholomew; Hu Zou; Uchenna Paul

      2008-09-29

      This work focuses on (1) searching and summarizing published Fischer-Tropsch synthesis (FTS) mechanistic and kinetic studies of FTS reactions on iron catalysts; (2) preparation and characterization of unsupported iron catalysts with/without potassium/platinum promoters; (3) measurement of H₂ and CO adsorption/dissociation kinetics on iron catalysts using transient methods; (4) analysis of the transient rate data to calculate kinetic parameters of early elementary steps in FTS; (5) construction of a microkinetic model of FTS on iron; and (6) validation of the model from collection of steady-state rate data for FTS on iron catalysts. Three unsupported iron catalysts and three alumina-supported iron catalysts were prepared by non-aqueous-evaporative deposition (NED) or aqueous impregnation (AI) and characterized by chemisorption, BET, temperature-programmed reduction (TPR), extent-of-reduction, XRD, and TEM methods. These catalysts, covering a wide range of dispersions and metal loadings, are well-reduced and relatively thermally stable up to 500-600 C in H₂ and thus ideal for kinetic and mechanistic studies. Kinetic parameters for CO adsorption, CO dissociation, and surface carbon hydrogenation on these catalysts were determined from temperature-programmed desorption (TPD) of CO, temperature-programmed surface hydrogenation (TPSR), temperature-programmed hydrogenation (TPH), and isothermal transient hydrogenation (ITH). A microkinetic model was constructed for the early steps in FTS on polycrystalline iron from the kinetic parameters of elementary steps determined experimentally in this work and from literature values. Steady-state rate data were collected in a Berty reactor and used for validation of the microkinetic model. These rate data were fitted to 'smart' Langmuir-Hinshelwood rate expressions derived from a sequence of elementary steps, using a combination of fitted steady-state parameters and parameters specified from the transient measurements. The results provide a platform for further development of microkinetic models of FTS on Fe and a basis for more precise modeling of the FTS activity of Fe catalysts. Calculations using periodic, self-consistent Density Functional Theory (DFT) methods were performed on various realistic models of industrial, Fe-based FTS catalysts. The close-packed, most stable Fe(110) facet was analyzed; carbide formation was found to be facile, leading to the choice of the FeC(110) model representing an Fe facet with a sub-surface C atom. The Pt adatom (Fe^Pt(110)) was found to be the most stable model for the studies of Pt promotion, and finally the role of steps was elucidated by recourse to the defected Fe(211) facet. Binding energies (BEs), preferred adsorption sites, and geometries for all FTS-relevant stable species and intermediates were evaluated on each model catalyst facet. A mechanistic model (comprising 32 elementary steps involving 19 species) was constructed and each elementary step therein was fully characterized with respect to its thermochemistry and kinetics. Kinetic calculations involved evaluation of the Minimum Energy Pathways (MEPs) and activation energies (barriers) for each step. Vibrational frequencies were evaluated for the preferred adsorption configuration of each species with the aim of evaluating entropy changes and pre-exponential factors, and serving as a useful connection with experimental surface science techniques. Comparative analysis among these four facets revealed important trends in their relative behavior and roles in FTS catalysis. Overall, the First Principles calculations afforded new insight into FTS catalysis on Fe and modified-Fe catalysts.
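
      The 'smart' Langmuir-Hinshelwood expressions themselves are not reproduced in the abstract. As a hedged illustration of the general form such fits take, a rate law assuming carbon hydrogenation is rate-determining with competitive CO and dissociative H₂ adsorption might look like:

```latex
r_{\mathrm{CO}} =
  \frac{k\,K_{\mathrm{CO}}\,P_{\mathrm{CO}}\,\sqrt{K_{\mathrm{H_2}} P_{\mathrm{H_2}}}}
       {\left(1 + K_{\mathrm{CO}} P_{\mathrm{CO}} + \sqrt{K_{\mathrm{H_2}} P_{\mathrm{H_2}}}\right)^{2}}
```

      where k is the rate constant of the rate-determining step and K_CO and K_H2 are adsorption equilibrium constants, with some parameters fitted to the steady-state data and others fixed from the transient measurements.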

    2. Computing and Computational Sciences Directorate - Divisions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCSD Divisions: Computational Sciences and Engineering; Computer Sciences and Mathematics; Information Technology Services; Joint Institute for Computational Sciences; National Center for Computational Sciences

    3. Baseline System Costs for 50.0 MW Enhanced Geothermal System--A Function of: Working Fluid, Technology, and Location, Location, Location

      Broader source: Energy.gov [DOE]

      Project objectives: Develop a baseline cost model of a 50.0 MW Enhanced Geothermal System, including all aspects of the project, from finding the resource through to operation, for a particularly challenging scenario: the deep, radioactively decaying granitic rock of the Pioneer Valley in Western Massachusetts.

    4. Cloud-Based Model Calibration Using OpenStudio: Preprint

      SciTech Connect (OSTI)

      Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

      2014-03-01

      OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Compute Cloud (EC2) service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
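
      The outer logic of such a calibration, vary parameters, simulate, compare against the bills, keep the best set, can be sketched independently of any simulation engine. The sketch below only mimics that loop with a toy stand-in model and a naive random search; all names, bounds, and numbers are hypothetical, and the real workflow drives full OpenStudio/EnergyPlus simulations in parallel.

```python
import random

def rmse(actual, simulated):
    """Root-mean-square error between monthly bills and simulated use."""
    return (sum((a - s) ** 2 for a, s in zip(actual, simulated)) / len(actual)) ** 0.5

def calibrate(run_model, bills, bounds, iterations=200, seed=0):
    """Toy random-search calibration loop: vary parameters over multiple
    iterations and keep the set whose simulation best reproduces the bills.
    `run_model` stands in for a full simulation round-trip."""
    rng = random.Random(seed)
    best_params, best_err = None, float("inf")
    for _ in range(iterations):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}
        err = rmse(bills, run_model(params))
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err

# Hypothetical stand-in: monthly kWh driven by two calibration measures.
bills = [900, 850, 800, 700, 650, 700, 800, 850, 820, 760, 840, 910]
model = lambda p: [b * p["infiltration"] + 50.0 * p["lpd"] for b in bills]
print(calibrate(model, bills, {"infiltration": (0.8, 1.2), "lpd": (0.0, 1.0)}))
```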

    5. Cost and performance baseline for fossil energy plants

      SciTech Connect (OSTI)

      2007-05-15

      The objective of this report is to present performance and cost data for fossil energy power systems, specifically integrated gasification combined cycle (IGCC), pulverized coal (PC), and natural gas combined cycle (NGCC) plants, in a consistent technical and economic manner that accurately reflects current market conditions for plants starting operation in 2010. This is Volume 1 of the three-volume report. Twelve different power plant design configurations were analyzed. These include six IGCC cases utilizing the General Electric Energy (GEE), ConocoPhillips (CoP), and Shell gasifiers, each with and without CO₂ capture, and six cases representing conventional technologies: PC-subcritical, PC-supercritical, and NGCC plants, both with and without CO₂ capture. Cases 7 and 8 were originally included in this study and involve production of synthetic natural gas (SNG) and the repowering of an existing NGCC facility using SNG. The two SNG cases were subsequently moved to Volume 2 of this report, resulting in the discontinuity of case numbers (1-6 and 9-14). Chapter 2 provides the basis for technical, environmental, and cost evaluations. Chapter 3 describes the IGCC technologies modeled and presents the results for the six IGCC cases. Chapter 4 describes the PC technologies modeled and presents the results for the four PC cases. Chapter 5 describes the NGCC technologies modeled and presents the results for the two NGCC cases. Chapter 6 contains the reference list. 64 refs., 253 exhibits.

    6. Properties of a soft-core model of methanol: An integral equation theory and computer simulation study

      SciTech Connect (OSTI)

      Huš, Matej; Urbic, Tomaz; Munaò, Gianmarco

      2014-10-28

      Thermodynamic and structural properties of a coarse-grained model of methanol are examined by Monte Carlo simulations and reference interaction site model (RISM) integral equation theory. Methanol particles are described as dimers formed from an apolar Lennard-Jones sphere, mimicking the methyl group, and a sphere with a core-softened potential as the hydroxyl group. Different closure approximations of the RISM theory are compared and discussed. The liquid structure of methanol is investigated by calculating site-site radial distribution functions and static structure factors for a wide range of temperatures and densities. Results obtained show a good agreement between RISM and Monte Carlo simulations. The phase behavior of methanol is investigated by employing different thermodynamic routes for the calculation of the RISM free energy, drawing gas-liquid coexistence curves that match the simulation data. Preliminary indications for a putative second critical point between two different liquid phases of methanol are also discussed.

    7. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Physics and Methods: performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms. Figures: growth and emissivity of a young galaxy hosting a supermassive black hole, as calculated in the cosmological code ENZO and post-processed with the radiative transfer code AURORA; Rayleigh-Taylor turbulence imaging, the largest turbulence simulations to date. Topics: advanced multi-scale modeling; turbulence datasets; density iso-surfaces…

    8. Development of an Extensible Computational Framework for Centralized Storage and Distributed Curation and Analysis of Genomic Data Genome-scale Metabolic Models

      SciTech Connect (OSTI)

      Stevens, Rick

      2010-08-01

      The DOE funded KBase project of the Stevens group at the University of Chicago was focused on four high-level goals: (i) improve extensibility, accessibility, and scalability of the SEED framework for genome annotation, curation, and analysis; (ii) extend the SEED infrastructure to support transcription regulatory network reconstructions (2.1), metabolic model reconstruction and analysis (2.2), assertions linked to data (2.3), eukaryotic annotation (2.4), and growth phenotype prediction (2.5); (iii) develop a web-API for programmatic remote access to SEED data and services; and (iv) application of all tools to bioenergy-related genomes and organisms. In response to these goals, we enhanced and improved the ModelSEED resource within the SEED to enable new modeling analyses, including improved model reconstruction and phenotype simulation. We also constructed a new website and web-API for the ModelSEED. Further, we constructed a comprehensive web-API for the SEED as a whole. We also made significant strides in building infrastructure in the SEED to support the reconstruction of transcriptional regulatory networks by developing a pipeline to identify sets of consistently expressed genes based on gene expression data. We applied this pipeline to 29 organisms, computing regulons which were subsequently stored in the SEED database and made available on the SEED website (http://pubseed.theseed.org). We developed a new pipeline and database for the use of kmers, or short 8-residue oligomer sequences, to annotate genomes at high speed. Finally, we developed the PlantSEED, or a new pipeline for annotating primary metabolism in plant genomes. All of the work performed within this project formed the early building blocks for the current DOE Knowledgebase system, and the kmer annotation pipeline, plant annotation pipeline, and modeling tools are all still in use in KBase today.

    9. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      There are currently 2632 nodes available on PDSF. The compute (batch) nodes at PDSF are heterogeneous, reflecting the periodic procurement of new nodes (and the eventual retirement of old nodes). From the user's perspective they are essentially all equivalent, except that some have more memory per job slot. If your jobs have memory requirements beyond the default maximum of 1.1 GB, you should specify that in your job submission and the batch system will run your job on an…

    10. Compositional modeling in porous media using constant volume flash and flux computation without the need for phase identification

      SciTech Connect (OSTI)

      Polívka, Ondřej; Mikyška, Jiří

      2014-09-01

      The paper deals with the numerical solution of a compositional model describing compressible two-phase flow of a mixture composed of several components in porous media with species transfer between the phases. The mathematical model is formulated by means of the extended Darcy's laws for all phases, components continuity equations, constitutive relations, and appropriate initial and boundary conditions. The splitting of components among the phases is described using a new formulation of the local thermodynamic equilibrium which uses volume, temperature, and moles as specification variables. The problem is solved numerically using a combination of the mixed-hybrid finite element method for the total flux discretization and the finite volume method for the discretization of transport equations. A new approach to numerical flux approximation is proposed, which does not require the phase identification and determination of correspondence between the phases on adjacent elements. The time discretization is carried out by the backward Euler method. The resulting large system of nonlinear algebraic equations is solved by the Newton–Raphson iterative method. We provide eight examples of different complexity to show reliability and robustness of our approach.
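
      The outer nonlinear solve described here is a standard Newton-Raphson iteration over the discretized residuals at each backward Euler time step. A generic sketch, using a finite-difference Jacobian and a toy two-equation system rather than the paper's flow equations:

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-10, max_iter=50):
    """Generic Newton-Raphson driver of the kind used to close a large
    nonlinear algebraic system at each implicit time step; the Jacobian
    is approximated by forward differences for brevity."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = residual(x)
        if np.linalg.norm(f) < tol:
            return x
        jac = np.empty((x.size, x.size))
        h = 1e-7
        for j in range(x.size):
            xp = x.copy()
            xp[j] += h
            jac[:, j] = (residual(xp) - f) / h   # finite-difference column
        x = x - np.linalg.solve(jac, f)          # Newton update
    raise RuntimeError("Newton-Raphson did not converge")

# Toy system: x0^2 + x1 = 3, x0 + x1^2 = 5 (solution x0 = 1, x1 = 2).
print(newton_raphson(lambda x: np.array([x[0]**2 + x[1] - 3,
                                         x[0] + x[1]**2 - 5]), [1.0, 1.0]))
```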

    11. SRNL RADIONUCLIDE FIELD LYSIMETER EXPERIMENT: BASELINE CONSTRUCTION AND IMPLEMENTATION

      SciTech Connect (OSTI)

      Roberts, K.; Kaplan, D.; Bagwell, L.; Powell, B.; Almond, P.; Emerson, H.; Hixon, A.; Jablonski, J.; Buchanan, C.; Waterhouse, T.

      2012-10-17

      The purpose of this document is to compile information regarding experimental design, facility design, construction, radionuclide source preparation, and path forward for the ten-year Savannah River National Laboratory (SRNL) Radionuclide Field Lysimeter Experiment at the Savannah River Site (SRS). This is a collaborative effort by researchers at SRNL and Clemson University. The scientific objectives of this study are to: Study long-term radionuclide transport under conditions more representative of vadose zone conditions than laboratory experiments; Provide more realistic quantification of radionuclide transport and geochemistry in the vadose zone, providing better information pertinent to radioactive waste storage solutions than presently exists; Reduce uncertainty and improve justification for geochemical models such as those used in performance assessments and composite analyses.

    12. Baseline ecological footprint of Sandia National Laboratories, New Mexico.

      SciTech Connect (OSTI)

      Coplen, Amy K.; Mizner, Jack Harry; Ubechel, Norion M.

      2009-01-01

      The Ecological Footprint Model is a mechanism for measuring the environmental effects of operations at Sandia National Laboratories in Albuquerque, New Mexico (SNL/NM). This analysis quantifies environmental impact associated with energy use, transportation, waste, land use, and water consumption at SNL/NM for fiscal year 2005 (FY05). Since SNL/NM's total ecological footprint (96,434 gha) is greater than the waste absorption capacity of its landholdings (338 gha), it created an ecological deficit of 96,096 gha. This deficit is equal to 886,470 ha, or about 3,423 square miles, of Pinyon-Juniper woodlands and desert grassland. Because 89% of the ecological footprint can be attributed to energy use, efforts to mitigate environmental impact should be focused on energy efficiency, energy reduction, and the incorporation of additional renewable energy alternatives at SNL/NM.

    13. ABB SCADA/EMS System INEEL Baseline Summary Test Report (November 2004) | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This document covers the security evaluation of the "baseline" or "as delivered" system performed in the Idaho National Engineering and Environmental Laboratory (INEEL) SCADA test bed as part of the Critical Infrastructure Test Range Development Program, which is funded by the U.S. Department of Energy, Office of…

    14. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Quad-core AMD Opteron processor. Compute Node Configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB…

    15. Atomic-Scale Design of Iron Fischer-Tropsch Catalysts: A Combined Computational Chemistry, Experimental, and Microkinetic Modeling Approach

      SciTech Connect (OSTI)

      Manos Mavrikakis; James A. Dumesic; Rahul P. Nabar

      2006-09-29

      Work continued on the development of a microkinetic model of Fischer-Tropsch synthesis (FTS) on supported and unsupported Fe catalysts. The following aspects of the FT mechanism on unsupported iron catalysts were investigated during this third year: (1) the collection of rate data in a Berty CSTR reactor based on sequential design of experiments; (2) CO adsorption and CO-TPD for obtaining the heat of adsorption of CO on polycrystalline iron; and (3) isothermal hydrogenation (IH) after Fischer-Tropsch reaction to identify and quantify surface carbonaceous species. Rates of C₂₊ formation on unsupported iron catalysts at 220 C and 20 atm correlated well to a Langmuir-Hinshelwood type expression, derived assuming carbon hydrogenation to CH and OH recombination to water to be rate-determining steps. From desorption of molecularly adsorbed CO at different temperatures, the heat of adsorption of CO on polycrystalline iron was determined to be 100 kJ/mol. Amounts and types of carbonaceous species formed after FT reaction for 5-10 minutes at 150, 175, 200, and 285 C vary significantly with temperature. Mr. Brian Critchfield completed his M.S. thesis work on a statistically designed study of the kinetics of FTS on 20% Fe/alumina. Preparation of a paper describing this work is in progress. Results of these studies were reported at the Annual Meeting of the Western States Catalysis Club and at the San Francisco AIChE meeting. In the coming period, studies will focus on quantitative determination of the rates of kinetically relevant elementary steps on unsupported Fe catalysts with/without K and Pt promoters by the SSITKA method. This study will help us to (1) understand effects of promoter and support on elementary kinetic parameters and (2) build a microkinetic model for FTS on iron. Calculations using periodic, self-consistent Density Functional Theory (DFT) methods were performed on models of defected Fe surfaces, most significantly the stepped Fe(211) surface. Binding energies (BEs), preferred adsorption sites, and geometries of all the FTS-relevant stable species and intermediates were evaluated. Each elementary step of our reaction model was fully characterized with respect to its thermochemistry, and comparisons between the stepped Fe(211) facet and the most-stable Fe(110) facet were established. In most cases the BEs on Fe(211) reflected the trends observed earlier on Fe(110), yet there were significant variations imposed on the underlying trends. Vibrational frequencies were evaluated for the preferred adsorption configurations of each species with the aim of evaluating the entropy changes and pre-exponential factors for each elementary step. Kinetic studies were performed for the early steps of FTS (up to CH₄ formation) and CO dissociation. This involved evaluation of the Minimum Energy Pathway (MEP) and activation energy barrier for the steps involved. We concluded that Fe(211) would allow for far more facile CO dissociation in comparison to other Fe catalysts studied so far, but the other FTS steps studied remained mostly unchanged.
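
      For context, a common way to turn a TPD peak temperature into a desorption (adsorption) energy is the first-order Redhead relation below. The abstract does not state that this exact analysis was used, so it is offered only as the standard technique for such measurements:

```latex
E_d = R\,T_p\!\left[\ln\!\left(\frac{\nu\,T_p}{\beta}\right) - 3.46\right]
```

      where T_p is the peak desorption temperature, β the heating rate, and ν an assumed pre-exponential factor (typically on the order of 10¹³ s⁻¹).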

    16. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CoDEx Project: A Hardware/Software Codesign Environment for the Exascale Era. The next decade will see a rapid evolution of HPC node architectures, as power and cooling constraints are limiting increases in microprocessor clock speeds and constraining data movement. Applications and algorithms will need to change and adapt as node architectures evolve. A key element of the strategy as we move forward is the co-design of applications, architectures, and programming…

    17. LHC Computing

      SciTech Connect (OSTI)

      Lincoln, Don

      2015-07-28

      The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

    18. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal-energy storage coupled with district-heating or cooling systems. Volume II. Appendices

      SciTech Connect (OSTI)

      Huber, H.D.; Brown, D.R.; Reilly, R.W.

      1982-04-01

      A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains all the appendices, including supply and distribution system cost equations and models, descriptions of predefined residential districts, key equations for the cooling degree-hour methodology, a listing of the sample case output, and Appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.
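
      At its core, this kind of life-cycle costing is discounted-cash-flow accounting: discount every cost, discount every unit of delivered heat, and divide. A deliberately minimal sketch of that identity (the input numbers are hypothetical, and the detailed AQUASTOR inputs such as well costs, taxes, and pumping energy are collapsed into two aggregates):

```python
def unit_cost_of_heat(capital, annual_om, annual_gj, years, rate):
    """Present-value all cash flows and all delivered energy, then divide
    to get a unit cost of heat (chill). Two aggregate inputs stand in for
    the model's detailed supply- and distribution-system cost items."""
    pv_cost = capital
    pv_energy = 0.0
    for t in range(1, years + 1):
        df = (1.0 + rate) ** -t      # discount factor for year t
        pv_cost += annual_om * df
        pv_energy += annual_gj * df
    return pv_cost / pv_energy       # $/GJ delivered

# Hypothetical ATES district-heating system: $12M capital, $450k/yr O&M,
# 180,000 GJ/yr delivered, 30-year life, 5% discount rate.
print(f"{unit_cost_of_heat(12e6, 450e3, 180e3, 30, 0.05):.2f} $/GJ")
```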

    19. Light output measurements and computational models of microcolumnar CsI scintillators for x-ray imaging

      SciTech Connect (OSTI)

      Nillius, Peter; Klamra, Wlodek; Danielsson, Mats; Sibczynski, Pawel; Sharma, Diksha; Badano, Aldo

      2015-02-15

      Purpose: The authors report on measurements of light output and spatial resolution of microcolumnar CsI:Tl scintillator detectors for x-ray imaging. In addition, the authors discuss the results of simulations aimed at analyzing the results of synchrotron and sealed-source exposures with respect to the contributions of light transport to the total light output. Methods: The authors measured light output from a 490-μm CsI:Tl scintillator screen using two setups. First, the authors used a photomultiplier tube (PMT) to measure the response of the scintillator to sealed-source exposures. Second, the authors performed imaging experiments with a 27-keV monoenergetic synchrotron beam and a slit to calculate the total signal generated in terms of optical photons per keV. The results of both methods are compared to simulations obtained with hybridMANTIS, a coupled x-ray, electron, and optical photon Monte Carlo transport package. The authors report line response (LR) and light output for a range of linear absorption coefficients and describe a model that simultaneously fits the light output and the blur measurements. Comparing the experimental results with the simulations, the authors obtained an estimate of the absorption coefficient for the model that provides good agreement with the experimentally measured LR. Finally, the authors report light output simulation results and their dependence on scintillator thickness and reflectivity of the backing surface. Results: The slit images from the synchrotron were analyzed to obtain a total light output of 48 keV⁻¹, while measurements using the fast PMT instrument setup and sealed sources reported a light output of 28 keV⁻¹. The authors attribute the difference in light output estimates between the two methods to the difference in time constants between the camera and PMT measurements. Simulation structures were designed to match the light output measured with the camera while providing good agreement with the measured LR, resulting in a bulk absorption coefficient of 5 × 10⁻⁵ μm⁻¹. Conclusions: The combination of experimental measurements for microcolumnar CsI:Tl scintillators using sealed sources and synchrotron exposures with results obtained via simulation suggests that the time course of the emission might play a role in experimental estimates. The procedure yielded an experimentally derived linear absorption coefficient for microcolumnar CsI:Tl of 5 × 10⁻⁵ μm⁻¹. To the authors' knowledge, this is the first time this parameter has been validated against experimental observations. The measurements also offer insight into the relative role of optical transport on the effective optical yield of the scintillator with microcolumnar structure.

    20. Free energy of RNA-counterion interactions in a tight-binding model computed by a discrete space mapping

      SciTech Connect (OSTI)

      Henke, Paul S.; Mak, Chi H.

      2014-08-14

      The thermodynamic stability of a folded RNA is intricately tied to the counterions, and the free energy of this interaction must be accounted for in any realistic RNA simulations. Extending a tight-binding model published previously, in this paper we investigate the fundamental structure of charges arising from the interaction between small functional RNA molecules and divalent ions such as Mg²⁺ that are especially conducive to stabilizing folded conformations. The characteristic nature of these charges is utilized to construct a discretely connected energy landscape that is then traversed via a novel application of a deterministic graph search technique. This search method can be incorporated into larger simulations of small RNA molecules and provides a fast and accurate way to calculate the free energy arising from the interactions between an RNA and divalent counterions. The utility of this algorithm is demonstrated within a fully atomistic Monte Carlo simulation of the P4-P6 domain of the Tetrahymena group I intron, in which it is shown that the counterion-mediated free energy conclusively directs folding into a compact structure.
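
      As a generic illustration of traversing a discretely connected energy landscape with a deterministic graph search (not the paper's algorithm), a best-first search over ion-binding configurations might look like the sketch below; the toy states, neighbor rule, and energy function are all hypothetical.

```python
import heapq

def lowest_energy_state(start, neighbors, energy):
    """Deterministic best-first search: expand states in order of increasing
    energy and return the lowest-energy state reachable from `start`.
    `neighbors` yields adjacent charge configurations; `energy` scores one."""
    best_state, best_e = start, energy(start)
    seen = {start}
    heap = [(best_e, start)]
    while heap:
        e, state = heapq.heappop(heap)
        if e < best_e:
            best_state, best_e = state, e
        for nxt in neighbors(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(heap, (energy(nxt), nxt))
    return best_state, best_e

# Toy landscape: states are tuples of bound-ion counts at three sites.
energy = lambda s: (s[0] - 2) ** 2 + (s[1] - 1) ** 2 + s[2] ** 2
neighbors = lambda s: [tuple(min(3, max(0, v + d)) if i == j else v
                             for j, v in enumerate(s))
                       for i in range(3) for d in (-1, 1)]
print(lowest_energy_state((0, 0, 0), neighbors, energy))  # -> ((2, 1, 0), 0)
```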

    1. Baseline Design of a Hurricane-Resilient Wind Turbine (Poster)

      SciTech Connect (OSTI)

      Damiani, R.; Robertson, A.; Schreck, S.; Maples, B.; Anderson, M.; Finucane, Z.; Raina, A.

      2014-10-01

      Under U.S. Department of Energy-sponsored research FOA 415, the National Renewable Energy Laboratory led a team of research groups to produce a complete design of a large wind turbine system deployable in the western Gulf of Mexico region. As such, the turbine and its support structure would be subjected to hurricane loading conditions. Among the goals of this research were the exploration of advanced and innovative configurations that would help decrease the levelized cost of energy (LCOE) of the design, and the expansion of the basic IEC design load cases (DLCs) to include hurricane environmental conditions. The wind turbine chosen was a three-bladed, downwind, direct-drive, 10-MW rated machine. The rotor blade was optimized based on an IEC load suite analysis. The drivetrain and nacelle components were scaled up from a smaller sized turbine using industry best practices. The tubular steel tower was sized using ultimate load values derived from the rotor optimization analysis. The substructure is an innovative battered and raked jacket structure. The turbine has also been modeled within an aero-servo-hydro-elastic tool, and future papers will discuss results of the dynamic response analysis for select DLCs. Although multiple design iterations could not be performed because of limited resources in this study and are left to future research, the obtained data will offer a good indication of the expected LCOE for large offshore wind turbines to be deployed in subtropical U.S. waters, and the impact design innovations can have on this value.

    2. An evaluation of baseline conditions at lease tract C-a, Rio Blanco County, Colorado

      SciTech Connect (OSTI)

      Barteaux, W.L.; Biezugbe, G.

      1987-09-01

      An analysis was made of baseline groundwater quality data from oil shale lease tract C-a, managed by Rio Blanco Oil Shale Company. The data are limited in several respects, and all conclusions drawn from them must be qualified with these limitations. Baseline conditions were determined by analyzing data from wells in the upper and lower bedrock aquifers and from the alluvial wells. All data collected before mining operations began were considered baseline data. The water quality was then evaluated using the 1987 Colorado State Basic Standards for Ground Water as a basis. The maximum baseline values for several parameters in each aquifer exceed the standard values. The water quality of the upper and lower bedrock aquifers varies from region to region within the site. Data on the lower bedrock aquifer are insufficient for speculation on the cause of the variations. Variations in the upper bedrock aquifer are possibly caused by leakage from the lower bedrock aquifer. 16 refs., 9 figs., 9 tabs.

    3. 2008 CHP Baseline Assessment and Action Plan for the Hawaii Market | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      The purpose of this 2008 report is to provide an updated baseline assessment and action plan for combined heat and power (CHP) in Hawaii and to identify the hurdles that prevent the expanded use of CHP systems. The report was prepared by the Pacific Region CHP Application Center (RAC). (PDF: chp_hawaii_2008.pdf)

    4. Optimization of the CLIC Baseline Collimation System (Conference) | SciTech Connect

      Office of Scientific and Technical Information (OSTI)

      Title: Optimization of the CLIC Baseline Collimation System.

    5. EA-1943: Construction and Operation of the Long Baseline Neutrino Facility and Deep Underground Neutrino Experiment at Fermilab, Batavia, Illinois, and Sanford Underground Research Facility, Lead, South Dakota | Department of Energy

      Office of Environmental Management (EM)


    6. Computing and Computational Sciences Directorate - Contacts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Home › About Us › Contacts. Jeff Nichols, Associate Laboratory Director, Computing and Computational Sciences; Becky Verastegui, Directorate Operations Manager, Computing and Computational Sciences Directorate; Michael Bartell, Chief Information Officer, Information Technology Services Division; Jim Hack, Director, Climate Science Institute, National Center for Computational Sciences; Shaun Gleason, Division Director, Computational Sciences and Engineering; Barney Maccabe, Division Director, Computer Science…

    7. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Node Configuration: 6,384 nodes; 2 twelve-core AMD 'MagnyCours' 2.1-GHz processors per node; 24 cores per node (153,216 total cores); 32 GB DDR3 1333-MHz memory per node (6,000 nodes); 64 GB DDR3 1333-MHz memory per node (384 nodes). Peak Gflop/s rate: 8.4 Gflops/core, 201.6 Gflops/node, 1.28 Peta-flops for the entire machine. Each core has its own L1 and L2 caches, of 64 KB and 512 KB respectively. One 6-MB…

    8. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      This page is the repository for sundry items of information relevant to general computing on BooNE. If you have a question or problem that isn't answered here, or a suggestion for improving this page or the information on it, please mail boone-computing@fnal.gov and we'll do our best to address any issues. Note about this page: some links on this page point to www.everything2.com and are meant to give an idea about a concept or thing without necessarily wading through a whole website…

    9. M & V Shootout: Setting the Stage For Testing the Performance of New Energy Baseline

      SciTech Connect (OSTI)

      Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Granderson, Jessica; Jump, David; Taylor, Cody

      2015-07-01

      Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming data acquisition and often do not deliver results until years after the program period has ended. A spectrum of savings calculation approaches are used, with some relying more heavily on measured data and others relying more heavily on estimated or modeled data, or stipulated information. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. Energy management and information systems (EMIS) technologies, not only enable significant site energy savings, but are also beginning to offer M&V capabilities. This paper expands recent analyses of public-domain, whole-building M&V methods, focusing on more novel baseline modeling approaches that leverage interval meter data. We detail a testing procedure and metrics to assess the performance of these new approaches using a large test dataset. We also provide conclusions regarding the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods. Finally, we discuss the potential evolution of M&V to better support the energy efficiency industry through low-cost approaches, and the long-term agenda for validation of building energy analytics.
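
      Two goodness-of-fit statistics that commonly anchor such baseline-model testing procedures are the normalized mean bias error and the coefficient of variation of the RMSE. The paper's exact metric suite is not reproduced here, so the definitions below are offered as the standard forms, with hypothetical hold-out data:

```python
def nmbe(actual, predicted):
    """Normalized mean bias error (%): systematic over- or under-prediction
    of the baseline model relative to metered consumption."""
    return 100.0 * sum(p - a for a, p in zip(actual, predicted)) / sum(actual)

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE (%): scatter of the model's
    predictions about the metered data, normalized by the metered mean."""
    n = len(actual)
    rmse = (sum((p - a) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5
    return 100.0 * rmse / (sum(actual) / n)

# Hypothetical hold-out test: metered vs. model-predicted daily kWh.
metered = [120.0, 135.0, 128.0, 150.0, 142.0, 138.0, 125.0]
predicted = [118.0, 140.0, 130.0, 145.0, 147.0, 133.0, 127.0]
print(f"NMBE {nmbe(metered, predicted):+.2f}%  CV(RMSE) {cv_rmse(metered, predicted):.2f}%")
```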

    10. Magnetic resonance imaging and computational fluid dynamics (CFD) simulations of rabbit nasal airflows for the development of hybrid CFD/PBPK models

      SciTech Connect (OSTI)

      Corley, Richard A.; Minard, Kevin R.; Kabilan, Senthil; Einstein, Daniel R.; Kuprat, Andrew P.; Harkema, J. R.; Kimbell, Julia; Gargas, M. L.; Kinzell, John H.

      2009-06-01

      The percentages of total airflows over the nasal respiratory and olfactory epithelium of female rabbits were calculated from computational fluid dynamics (CFD) simulations of steady-state inhalation. These airflow calculations, along with nasal airway geometry determinations, are critical parameters for hybrid CFD/physiologically based pharmacokinetic models that describe the nasal dosimetry of water-soluble or reactive gases and vapors in rabbits. CFD simulations were based upon three-dimensional computational meshes derived from magnetic resonance images of three adult female New Zealand White (NZW) rabbits. In the anterior portion of the nose, the maxillary turbinates of rabbits are considerably more complex than comparable regions in rats, mice, monkeys, or humans. This leads to a greater surface area to volume ratio in this region and thus the potential for increased extraction of water-soluble or reactive gases and vapors in the anterior portion of the nose compared to many other species. Although there was considerable interanimal variability in the fine structures of the nasal turbinates and airflows in the anterior portions of the nose, there was remarkable consistency between rabbits in the percentage of total inspired airflows that reached the ethmoid turbinate region (~50%) that is presumably lined with olfactory epithelium. These latter results (airflows reaching the ethmoid turbinate region) were higher than previously published estimates for the male F344 rat (19%) and human (7%). These differences in regional airflows can have significant implications in interspecies extrapolations of nasal dosimetry.

    11. Appendix A: GPRA08 benefits estimates: NEMS and MARKAL Model Baseline Cases

      SciTech Connect (OSTI)

      None, None

      2009-01-18

      Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

    12. Sandia Energy - Computational Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    13. Baseline Library

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    14. Technology for Increasing Geothermal Energy Productivity. Computer Models to Characterize the Chemical Interactions of Geothermal Fluids and Injectates with Reservoir Rocks, Wells, Surface Equipment

      SciTech Connect (OSTI)

      Nancy Moller Weare

      2006-07-25

      This final report describes the results of a research program carried out over a five-year period (3/1999-9/2004) with funding from a Department of Energy geothermal FDP grant (DE-FG07-99ID13745) and from other agencies. The goal of the research projects in this program was to develop modeling technologies that can increase the understanding of geothermal reservoir chemistry and chemistry-related energy production processes. The ability of computer models to handle many chemical variables and complex interactions makes them an essential tool for building a fundamental understanding of a wide variety of complex geothermal resource and production chemistry problems. With careful choice of methodology and parameterization, the research objectives were to show that chemical models can correctly simulate behavior for the ranges of fluid compositions, formation minerals, temperatures, and pressures associated with present and near-future geothermal systems, as well as for the very high pressure-temperature (PT) chemistry of deep resources that is intractable with traditional experimental methods. Our research results successfully met these objectives. We demonstrated that advances in physical chemistry theory can be used to accurately describe the thermodynamics of solid-liquid-gas systems via their free energies for wide ranges of composition (X), temperature (T), and pressure (P). Eight articles on this work were published in peer-reviewed journals and conference proceedings; four more are in preparation. Our work has been presented at many workshops and conferences. We also considerably improved our interactive web site (geotherm.ucsd.edu), which was in preliminary form prior to the grant. This site, which includes several model codes treating different XPT conditions, is an effective means of transferring our technologies and is used by the geothermal community and other researchers worldwide. Our models have wide application to many energy-related and other important problems (e.g., scaling prediction in petroleum production systems, stripping towers for mineral production processes, nuclear waste storage, CO2 sequestration strategies, global warming). Although funding decreases cut short several research activities, we made significant progress on these abbreviated projects.
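
      A flavor of the solid-liquid equilibrium calculations such model codes perform is the mineral saturation index, SI = log10(IAP/Ksp), which flags scaling risk when positive. The sketch below is a minimal illustration: activity coefficients are set to 1 for brevity, whereas models of the kind described use specific-interaction (Pitzer-type) theory to compute them at high ionic strength, and the brine composition and solubility constant are assumed round numbers.

```python
# Illustrative mineral saturation-index calculation, a basic quantity a
# geochemical model evaluates when predicting scaling. Real codes of the
# kind described compute activity coefficients from Pitzer-type theory;
# here they are set to 1, and all numbers are illustrative.
import math

def saturation_index(ion_molalities, activity_coeffs, log_ksp):
    """SI > 0: supersaturated (scaling risk); SI < 0: undersaturated."""
    log_iap = sum(math.log10(m * g)
                  for m, g in zip(ion_molalities, activity_coeffs))
    return log_iap - log_ksp

# Anhydrite (CaSO4) example with an assumed brine composition:
si = saturation_index(ion_molalities=[0.02, 0.015],  # [Ca2+], [SO4 2-], mol/kg
                      activity_coeffs=[1.0, 1.0],    # ideal, for brevity
                      log_ksp=-4.36)                 # approximate 25 C value
print(f"anhydrite SI = {si:.2f}")
```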

    15. Sandia Energy » Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    16. Nuclear Hybrid Energy System Modeling: RELAP5 Dynamic Coupling Capabilities

      SciTech Connect (OSTI)

      Piyush Sabharwall; Nolan Anderson; Haihua Zhao; Shannon Bragg-Sitton; George Mesina

      2012-09-01

      The nuclear hybrid energy systems (NHES) research team is currently developing a dynamic simulation of an integrated hybrid energy system. A detailed simulation of proposed NHES architectures will allow initial computational demonstration of a tightly coupled NHES to identify key reactor subsystem requirements, identify candidate reactor technologies for a hybrid system, and identify key challenges to operation of the coupled system. This work will provide a baseline for later coupling of design-specific reactor models through industry collaboration. The modeling capability addressed in this report focuses on the reactor subsystem simulation.
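
      The tight coupling described, in which a reactor systems code exchanges boundary conditions with other subsystem models at every time step, can be sketched as a simple explicit co-simulation loop. The sketch below is a toy illustration under stated assumptions: the two first-order subsystem models, the exchanged variables (thermal power and inlet temperature), and all constants are invented for illustration and do not represent the RELAP5 coupling interface.

```python
# Toy explicit co-simulation loop: each step, a "reactor" model passes
# thermal power to a "balance-of-plant" model, which returns an inlet
# (cold-leg) temperature. Both models and all constants are illustrative
# assumptions, not the RELAP5 interface.
def reactor_step(power_MW, t_inlet_C, dt_s):
    """First-order lag of reactor power toward a demand that rises as
    the inlet temperature falls (toy reactivity feedback)."""
    demand = 100.0 + 0.5 * (290.0 - t_inlet_C)
    tau = 20.0
    return power_MW + dt_s * (demand - power_MW) / tau

def bop_step(t_inlet_C, power_MW, dt_s):
    """Balance-of-plant: inlet temperature relaxes toward a value that
    rises with delivered power (toy heat balance)."""
    target = 280.0 + 0.1 * power_MW
    tau = 60.0
    return t_inlet_C + dt_s * (target - t_inlet_C) / tau

power, t_inlet, dt = 100.0, 290.0, 1.0
for step in range(600):                       # ten simulated minutes
    power = reactor_step(power, t_inlet, dt)  # uses last exchanged T
    t_inlet = bop_step(t_inlet, power, dt)    # uses updated power
print(f"power = {power:.1f} MW, inlet T = {t_inlet:.1f} C")
```

      A design-specific coupling would replace each toy function with a call into the corresponding code and add convergence checks on the exchanged variables, but the step-exchange-step structure is the same.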

    17. Computer System, Cluster, and Networking Summer Institute

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute, New Mexico Consortium and Los Alamos National Laboratory. HOW TO APPLY: Applications will be accepted JANUARY 5 - FEBRUARY 13, 2016. Computing and Information Technology undergraduate students are encouraged to apply; must be a U.S. citizen. Submit: a current resume; an official university transcript (with spring courses posted and/or a copy of the spring 2016 schedule), 3.0 GPA minimum; one letter of recommendation from a faculty member; and a letter of...

    18. Computing Events

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Events: Spotlighting the most advanced scientific and technical applications in the world! Featuring exhibits of the latest and greatest technologies from industry, academia, and government research organizations; many of these technologies will be seen for the first time in Denver. Supercomputing Conference 13 (SC13), Denver, Colorado, November 17-22, 2013, will bring together the international...

    19. Vandenberg Air Force Base integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Halverson, M.A.; Richman, E.E.; Dagle, J.E.; Hickman, B.J.; Daellenbach, K.K.; Sullivan, G.P.

      1993-06-01

      The US Air Force Space Command has tasked the Pacific Northwest Laboratory (PNL), as the lead laboratory supporting the US Department of Energy Federal Energy Management Program, to identify, evaluate, and assist in acquiring all cost-effective energy projects at Vandenberg Air Force Base (VAFB). This is a model program that PNL is designing for federal customers served by the Pacific Gas and Electric Company (PG&E). The primary goal of the VAFB project is to identify all electric energy efficiency opportunities and to negotiate with PG&E to acquire those resources through a customized demand-side management program for its federal clients. That customized program should have three major characteristics: (1) 100% up-front financing; (2) substantial utility cost-sharing; and (3) utility implementation through energy service companies under contract to the utility. A similar arrangement will be pursued with Southern California Gas for non-electric resource opportunities if that is deemed desirable by the site and if the gas utility is open to such an approach. This report documents the assessment of baseline energy use at VAFB, located near Lompoc, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Resource Assessment. The analysis examines the characteristics of electric, natural gas, fuel oil, and propane use for fiscal year 1991. It records energy-use intensities for the facilities at VAFB by building type and energy end use, and breaks down building energy consumption by fuel type, energy end use, and building type. A more complete energy consumption reconciliation is also presented that accounts for all energy use among buildings, utilities, and applicable losses.
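
      The energy-use-intensity (EUI) bookkeeping described, consumption by building type and fuel converted to a common unit and normalized by floor area, can be sketched with a small aggregation routine. The records, floor areas, and building types below are illustrative placeholders, not data from the VAFB assessment; the unit conversion factors are standard approximate values.

```python
# Sketch of energy-use-intensity (EUI) bookkeeping: annual consumption by
# building type and fuel, converted to kBtu and normalized by floor area.
# Sample records and areas are illustrative, not VAFB data.
from collections import defaultdict

KBTU_PER_UNIT = {"kWh": 3.412, "therm": 100.0, "gal_fuel_oil": 138.5}

records = [  # (building_type, fuel_quantity_unit, annual_quantity)
    ("administration", "kWh", 1_200_000),
    ("administration", "therm", 15_000),
    ("housing", "kWh", 800_000),
    ("housing", "gal_fuel_oil", 9_000),
]
floor_area_sqft = {"administration": 250_000, "housing": 400_000}

kbtu = defaultdict(float)
for btype, unit, qty in records:
    kbtu[btype] += qty * KBTU_PER_UNIT[unit]

for btype, total in kbtu.items():
    print(f"{btype}: {total / floor_area_sqft[btype]:.1f} kBtu/sqft-yr")
```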

    20. Baseline biological risk assessment for aquatic populations occurring near Eielson Air Force Base, Alaska

      SciTech Connect (OSTI)

      Dauble, D.; Brandt, C.; Lewis, R.; Smith, R.

      1995-12-31

      Eielson Air Force Base (AFB), Alaska, was listed as a Superfund site in November 1989 with 64 potential source areas of contamination. As part of a sitewide remedial investigation, baseline risk assessments were conducted in 1993 and 1994 to evaluate hazards posed to biological receptors and to human health. Fish tissue, aquatic invertebrates, aquatic vegetation, sediment, and surface water data were collected from several on-site and off-site surface water bodies. An initial screening risk assessment indicated that several surface water sites along two major tributary creeks flowing through the base posed unacceptable risks to both aquatic receptors and human health because of DDTs. Other contaminants of concern (i.e., PCBs and PAHs) were below screening risk levels for aquatic organisms but contributed to an unacceptable risk to human health. Additional samples were taken in 1994 to characterize the site-wide distribution of PAHs, DDTs, and PCBs in aquatic biota and sediments. Concentrations of PAHs followed the order invertebrates > aquatic vegetation > fish, but were sufficiently low that they posed no significant risk to biological receptors. Pesticides were detected in all fish tissue samples. Polychlorinated biphenyls (PCBs) were also detected in most fish from Garrison Slough. The pattern of PCB concentrations in Arctic grayling (Thymallus arcticus) was related to their proximity to a sediment source in lower Garrison Slough. Ingestion of PCB-contaminated fish is the primary human-health risk driver for surface water bodies on Eielson AFB, resulting in carcinogenic risks greater than 1 × 10⁻⁴ for future recreational land use at some sites. Principal considerations affecting uncertainty in the risk assessment process included spatial and temporal variability in media contaminant concentrations and inconsistencies between modeled and measured body burdens.
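
      The fish-ingestion pathway identified as the primary risk driver can be illustrated with the standard Superfund-style calculation, excess lifetime cancer risk = chronic daily intake (CDI) × cancer slope factor (CSF). The sketch below uses assumed exposure parameters chosen only to show the arithmetic; none are values from the Eielson assessment.

```python
# Illustrative carcinogenic risk calculation for a fish-ingestion pathway,
# following the standard Superfund form: risk = CDI x CSF, where CDI is
# the chronic daily intake (mg/kg-day). All parameter values are assumed
# for illustration, not inputs from the Eielson AFB assessment.
def fish_ingestion_risk(c_fish_mg_kg, ir_kg_day, ef_day_yr, ed_yr,
                        bw_kg, at_days, csf_per_mg_kg_day):
    cdi = (c_fish_mg_kg * ir_kg_day * ef_day_yr * ed_yr) / (bw_kg * at_days)
    return cdi * csf_per_mg_kg_day

risk = fish_ingestion_risk(
    c_fish_mg_kg=1.0,          # assumed PCB concentration in fish tissue
    ir_kg_day=0.025,           # assumed recreational-angler ingestion rate
    ef_day_yr=350, ed_yr=30,   # assumed exposure frequency and duration
    bw_kg=70, at_days=70*365,  # adult body weight; lifetime averaging time
    csf_per_mg_kg_day=2.0,     # assumed cancer slope factor for PCBs
)
print(f"excess lifetime cancer risk = {risk:.1e}")  # ~2.9e-4, above 1e-4
```

      With these assumed inputs the computed risk exceeds the 1 × 10⁻⁴ threshold cited in the abstract, which is the kind of comparison that drives remedial decisions at such sites.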