National Library of Energy BETA

Sample records for baseline computational model

  1. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

    SciTech Connect (OSTI)

    Deru, M.

    2011-02-01

    This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

  2. Integrated Baseline System (IBS) Version 1.03: Models guide

    SciTech Connect (OSTI)

    Not Available

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers, planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  3. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  4. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    SciTech Connect (OSTI)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over a period of interest, such as a month or a year. The procedures follow the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how its software works, while stakeholders can still assess its performance.
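    To make the flavor of such a test concrete, here is a minimal Python sketch of the prediction-quality checks this kind of protocol relies on. The metric set (NMBE, CV(RMSE), total-period error) is a common choice in M&V practice and is illustrative, not the report's exact specification.

```python
import numpy as np

def prediction_metrics(actual, predicted):
    """Accuracy metrics for baseline predictions vs. metered use.

    `actual` and `predicted` are equal-length arrays of energy use
    (e.g., hourly kWh) over the period of interest.
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    resid = predicted - actual
    n = len(actual)
    mean_actual = actual.mean()
    return {
        # Normalized mean bias error: systematic over/under-prediction.
        "nmbe_pct": 100.0 * resid.sum() / (n * mean_actual),
        # Coefficient of variation of RMSE: overall scatter of errors.
        "cvrmse_pct": 100.0 * np.sqrt((resid ** 2).mean()) / mean_actual,
        # Error in total consumption over the whole period.
        "total_error_pct": 100.0 * resid.sum() / actual.sum(),
    }
```

    A vendor's tool can be scored this way on held-out metered data without revealing anything about its internal algorithms.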

  5. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    SciTech Connect (OSTI)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
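    As an illustration of the approach, the following is a hedged Python sketch of a regression-based baseline of the general kind described (hour-of-week indicators plus an outdoor-temperature term; the paper's exact model specification is not reproduced here), with the DR shed then estimated as the baseline prediction minus the observed load.

```python
import numpy as np

def design_matrix(hour_of_week, temp):
    """hour_of_week: ints in 0..167 (day-of-week * 24 + hour); temp in deg C."""
    X = np.zeros((len(temp), 168 + 1))
    X[np.arange(len(temp)), hour_of_week] = 1.0  # time-of-week indicators
    X[:, -1] = temp                              # linear temperature term
    return X

def fit_baseline(hour_of_week, temp, load):
    # Ordinary least squares on non-event (baseline) data.
    coef, *_ = np.linalg.lstsq(design_matrix(hour_of_week, temp), load,
                               rcond=None)
    return coef

def predict_baseline(coef, hour_of_week, temp):
    return design_matrix(hour_of_week, temp) @ coef

# DR shed on an event day = predicted baseline minus observed load.
# Repeating the prediction on many non-event "proxy" days, where the
# true load is known, yields an empirical error distribution that can
# be compared against the observed event-to-event variability.
```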

  6. Theory, Modeling and Computation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The sophistication of modeling and simulation will be enhanced not only by the wealth of data available from MaRIE but by the increased computational capacity made possible by the advent of extreme computing. Extreme Computing to Power Accurate Atomistic Simulations: Advances in high-performance computing and theory allow longer and larger atomistic simulations than currently possible.

  7. Computational Modeling | Bioenergy | NREL

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NREL uses computational modeling to increase the efficiency of biomass conversion by rational design using multiscale modeling, applying theoretical approaches, and testing scientific hypotheses. [Image: coarse-grain model of enzymes wrapping on cellulose; colorful circular structures entwined through blue strands.] Cellulosomes are complexes of protein scaffolds and enzymes that are highly effective in decomposing biomass; the image is a snapshot of a coarse-grain model of a complex cellulosome.

  8. Baseline Assessment of TREAT for Modeling and Analysis Needs

    SciTech Connect (OSTI)

    Bess, John Darrell; DeHart, Mark David

    2015-10-01

    TREAT is an air-cooled, graphite-moderated, thermal, heterogeneous test facility designed to evaluate reactor fuels and structural materials under conditions simulating various types of nuclear excursions and transient undercooling situations that could occur in a nuclear reactor. After 21 years in a standby mode, TREAT is being re-activated to revive transient testing capabilities. Given the time elapsed and the concurrent loss of operating experience, current-generation and advanced computational methods are being applied to begin TREAT modeling and simulation prior to renewed at-power operations. Such methods have limited value in predicting the behavior of TREAT without proper validation. Hence, the U.S. DOE has developed a number of programs to support development of benchmarks for both critical and transient operations. Extensive effort has been expended at INL to collect detailed descriptions, drawings, and specifications for all aspects of TREAT, and to resolve conflicting data found through this process. This report provides a collection of these data, with updated figures that are significantly more readable than historic drawings and illustrations, and with compositions and dimensions based on the best available sources. This document is not, nor should it be considered, a benchmark report. Rather, it is intended to provide one-stop shopping, to the extent possible, for other work that seeks to prepare detailed, accurate models of the core and its components. Given the variety of historic documents available and the loss of institutional memory, the only completely accurate database of TREAT data is TREAT itself. Unfortunately, disassembly of TREAT for inspection, assay, and measurement is highly unlikely. Hence the data provided herein are intended to serve as a best-estimate substitute.

  9. Baseline requirements of the proposed action for the Transportation Management Division routing models

    SciTech Connect (OSTI)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations, particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements of the models and databases to expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report. The discussions pertaining to the different models are contained in separate sections.

  10. Statistical Analysis of Baseline Load Models for Non-Residential Buildings

    SciTech Connect (OSTI)

    Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila

    2008-11-10

    Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
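    The proxy-event-day evaluation can be sketched in a few lines of Python; the interface below (a `model_predict` callable and a mapping of proxy days to actual hourly loads) is illustrative, not taken from the paper.

```python
import numpy as np

def evaluate_on_proxy_days(model_predict, proxy_days):
    """Score a baseline model on non-event days where true load is known.

    `model_predict(day)` returns the model's predicted hourly load
    profile for `day`; `proxy_days` maps each proxy event day to its
    actual hourly load array.
    """
    bias, accuracy = [], []
    for day, actual in proxy_days.items():
        predicted = model_predict(day)
        err = predicted - np.asarray(actual, dtype=float)
        bias.append(err.mean())                      # signed error -> bias
        accuracy.append(np.sqrt((err ** 2).mean()))  # RMSE -> accuracy
    return np.mean(bias), np.mean(accuracy)
```

    Running this once with and once without a morning adjustment applied inside `model_predict` reproduces the kind of comparison the study reports.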

  11. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

    SciTech Connect (OSTI)

    University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

    2012-06-13

    Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist, and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and to data filtering methods, as well as immediate potential to automate the methods used to determine building occupancy modes.

  12. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    SciTech Connect (OSTI)

    Iovenitti, Joe

    2013-05-15

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  13. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  14. Results from baseline tests of the SPRE I and comparison with code model predictions

    SciTech Connect (OSTI)

    Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

    1994-09-01

    The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high-capacity space power. This paper presents results of baseline engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

  15. Computer modeling helps manage wildfires

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computer modeling helps manage wildfires. Technology increases preparedness, improves firefighting strategies. September 1, 2016. [Image: smoke over the Jemez Mountains during the 2011 Las Conchas wildfire.]

  16. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. The overall project area is 2500km2, with the Calibration Area (Dixie Valley Geothermal Wellfield) being about 170km2. The project was subdivided into five tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area at 0.5km intervals to identify EGS drilling targets at a scale of 5km x 5km; (4) collect new geophysical and geochemical data; and (5) repeat Task 3 for the enhanced (baseline + new) data. Favorability maps were based on the integrated assessment of the three critical EGS exploration parameters of interest: rock type, temperature, and stress. A complementary trust map was generated to complement the favorability maps by graphically illustrating the cumulative confidence in the data used in the favorability mapping. The Final Scientific Report (FSR) is submitted in two parts, with Part I describing the results of project Tasks 1 through 3 and Part II covering the results of project Tasks 4 through 5 plus answering nine questions posed in the proposal for the overall project. FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping.
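    As a purely illustrative sketch of coupled favorability/trust mapping, the following Python combines normalized gridded scores for the three EGS parameters named in the report (rock type, temperature, stress); the equal weights and the data-coverage "trust" proxy are placeholder assumptions, not the project's actual scheme.

```python
import numpy as np

def favorability(rock_score, temp_score, stress_score,
                 weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combine three normalized (0-1) gridded parameter scores into a
    single EGS favorability grid. NaN cells mark missing data and
    simply drop out of the weighted sum."""
    stacked = np.stack([np.asarray(layer, dtype=float)
                        for layer in (rock_score, temp_score, stress_score)])
    w = np.asarray(weights)[:, None, None]
    return np.nansum(stacked * w, axis=0)

def trust(coverage_counts, n_layers=3):
    """Toy 'trust' companion grid: the fraction of input layers with
    data in each cell, standing in for cumulative data confidence."""
    return np.asarray(coverage_counts, dtype=float) / n_layers
```

    In practice each depth slice (+1km asl to -4km asl at 0.5km intervals, per the report) would get its own favorability and trust grids.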

  17. LANL computer model boosts engine efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The KIVA model has been instrumental in helping researchers and manufacturers understand...

  18. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  19. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  20. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

    SciTech Connect (OSTI)

    Maurakis, Eugene G

    2010-10-01

    Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts of climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity, that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people's awareness, knowledge, and understanding of climate change and its impacts on watersheds through a laboratory experience and interactive exhibits, internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity accurately during most of the six thermal seasons in which sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately teasing apart local human influences (e.g., urbanization and watershed alteration) from global anthropogenic impacts (e.g., climate change) on watersheds. Effective and skillful translation of the results of scientific investigations (e.g., annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) is a valuable way of communicating information to the general public to enhance their understanding of climate change and its effects on watersheds.

  1. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Simple computer models unravel genetic stress reactions in cells. Integrated biological and...

  2. Development of baseline water quality stormwater detention pond model for Chesapeake Bay catchments

    SciTech Connect (OSTI)

    Musico, W.J.; Yoon, J.

    1999-07-01

    An environmental impact assessment is required for every proposed development in the Commonwealth of Virginia to help identify areas of potential concern. The purpose of the Chesapeake Bay Local Assistance Department's (CBLAD) Guidance Calculation Procedures is to ensure that development of previously constructed areas does not further exacerbate current problems of stormwater-induced eutrophication and downstream flooding. The methodology is based on post-development conditions that will not generate greater peak flows and will result in a 10% overall reduction of total phosphorus. Currently, several well-known models can develop hydrographs and pollutographs that accurately model the real response of a given watershed to any given rainfall event. However, the conventional method of achieving the desired peak flow reduction and pollutant removal is not a deterministic procedure, and is inherently a trial-and-error process. A method of quickly and accurately determining the required size of stormwater easements was developed to evaluate the effectiveness of alternative stormwater collection and treatment systems. In this method, predevelopment conditions were modeled first to estimate the peak flows and subsequent pollutant generation that can be used as a baseline for the post-development plan. Resulting stormwater easement estimates facilitate decision-making processes during the planning and development phase of a project. The design can be optimized for the minimum cost or the smallest possible pond size required for peak flow reduction and detention time, given the most basic data such as the inflow hydrograph and maximum allowable pond depth.
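    A toy Python sketch of the sizing idea follows: replace the trial-and-error search with a bisection over pond volume, routing the inflow hydrograph through a crude linear reservoir until the routed peak meets the predevelopment target. The routing rule and all parameters are illustrative assumptions, not the paper's method.

```python
import numpy as np

def route(inflow, volume, dt=300.0):
    """Route an inflow hydrograph (m^3/s, time step dt in seconds)
    through a toy linear reservoir whose release rate slows as the
    design volume grows."""
    inflow = np.asarray(inflow, dtype=float)
    k = volume / max(inflow.max(), 1e-9)   # crude storage constant (s)
    storage, out = 0.0, np.zeros_like(inflow)
    for i, q_in in enumerate(inflow):
        q_out = storage / k                # linear reservoir outflow
        storage = max(storage + (q_in - q_out) * dt, 0.0)
        out[i] = q_out
    return out

def size_pond(inflow, target_peak, lo=1e2, hi=1e6, iters=40):
    """Bisect on pond volume (m^3) until the routed peak outflow is
    at or below the predevelopment target peak (m^3/s)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if route(inflow, mid).max() <= target_peak:
            hi = mid   # this volume works; try a smaller pond
        else:
            lo = mid
    return hi
```

    Because more storage always attenuates the peak more in this toy model, the search is monotone and bisection converges to the smallest adequate volume.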

  3. Parallel computing in enterprise modeling.

    SciTech Connect (OSTI)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic, and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
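    The report's plugin is Java and its API is not shown here. As a language-neutral illustration of the core idea (stepping many entities in parallel when the result is emergent from the population rather than from spatial structure), here is a minimal Python sketch using a process pool; the entity state and update rule are invented for the example, and entities are treated as independent within a tick.

```python
from multiprocessing import Pool

def step_entity(state):
    """Advance one entity's state by one tick (toy update rule)."""
    wealth, risk = state
    return (wealth * (1.0 + 0.01 * (0.5 - risk)), risk)

def simulate(entities, ticks, workers=4):
    """Entity-based simulation: the outcome is emergent from many
    entities, so each tick maps the update over the whole population.
    With no spatial organizing principle, entities are partitioned
    arbitrarily across worker processes."""
    with Pool(workers) as pool:
        for _ in range(ticks):
            entities = pool.map(step_entity, entities)
    return entities

if __name__ == "__main__":
    population = [(100.0, r / 1000.0) for r in range(1000)]
    final = simulate(population, ticks=10)
```

    A real entity-based framework must also exchange interactions between entities each tick, which is exactly where the decomposition problem the report describes arises.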

  4. Modeling of Electric Water Heaters for Demand Response: A Baseline PDE Model

    SciTech Connect (OSTI)

    Xu, Zhijie; Diao, Ruisheng; Lu, Shuai; Lian, Jianming; Zhang, Yu

    2014-09-05

    Demand response (DR) control can effectively relieve balancing and frequency regulation burdens on conventional generators, facilitate integrating more renewable energy, and reduce generation and transmission investments needed to meet peak demands. Electric water heaters (EWHs) have a great potential in implementing DR control strategies because: (a) the EWH power consumption has a high correlation with daily load patterns; (b) they constitute a significant percentage of domestic electrical load; (c) the heating element is a resistor, without reactive power consumption; and (d) they can be used as energy storage devices when needed. Accurately modeling the dynamic behavior of EWHs is essential for designing DR controls. Various water heater models, simplified to different extents, have been published in the literature; however, few of them were validated against field measurements, which may result in inaccuracy when implementing DR controls. In this paper, a partial differential equation physics-based model, developed to capture detailed temperature profiles at different tank locations, is validated against field test data for more than 10 days. The developed model shows very good performance in capturing water thermal dynamics for benchmark testing purposes.
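    For a flavor of what a PDE tank model resolves, here is a heavily simplified Python sketch: a 1-D vertical temperature profile advanced by explicit finite differences, with diffusion, draw-induced plug flow, and a bottom heating element. The equation form, coefficients, and boundary treatment are illustrative assumptions, not the paper's validated model.

```python
import numpy as np

def step_tank(T, heater_on, dz=0.05, dt=1.0,
              alpha=1.5e-7, q_heat=0.005, t_inlet=15.0, draw_vel=0.0):
    """One explicit time step of a toy 1-D tank temperature profile
    T(z) [deg C]: vertical diffusion + draw-induced advection + a
    heating-element source near the bottom node."""
    T = T.copy()
    # Thermal diffusion at interior nodes (explicit FTCS; the choice
    # of dt, dz keeps alpha*dt/dz^2 well below the 0.5 stability limit).
    T[1:-1] += dt * alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    # Upward plug flow during a hot-water draw: cold inlet at bottom.
    if draw_vel > 0:
        c = draw_vel * dt / dz             # Courant number, keep < 1
        T[1:] -= c * (T[1:] - T[:-1])
        T[0] += c * (t_inlet - T[0])
    # Resistive heating element near the tank bottom.
    if heater_on:
        T[1] += dt * q_heat                # deg C per second, toy value
    return T

# Simulate one hour with the heater on and no draw.
profile = np.full(20, 50.0)                # 20 vertical nodes at 50 deg C
for _ in range(3600):
    profile = step_tank(profile, heater_on=True)
```

    Tracking the full vertical profile, rather than a single mixed-tank temperature, is what lets a PDE model reproduce the stratification seen in field data.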

  5. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

    SciTech Connect (OSTI)

    Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

    2012-06-01

    The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, combined with lower energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.

  6. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    SciTech Connect (OSTI)

    Granderson, Jessica; Price, Phillip N.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
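    A minimal Python sketch of the train/predict workflow follows; the `fit`/`predict` model interface is hypothetical, and CV(RMSE) stands in for the paper's fuller metric set, computed at the daily, weekly, and monthly aggregations the study compares.

```python
import numpy as np

def evaluate(model, daily_kwh, train_end, horizon):
    """Fit a baseline model on a training period, predict a subsequent
    prediction period, and score at several aggregation levels.

    `model` exposes illustrative fit(train)/predict(n_days) methods;
    `daily_kwh` is a 1-D array of daily electricity consumption.
    """
    train = daily_kwh[:train_end]
    test = daily_kwh[train_end:train_end + horizon]
    model.fit(train)
    pred = model.predict(len(test))
    scores = {}
    for label, width in [("daily", 1), ("weekly", 7), ("monthly", 30)]:
        n = (len(test) // width) * width   # trim to whole periods
        a = test[:n].reshape(-1, width).sum(axis=1)
        p = pred[:n].reshape(-1, width).sum(axis=1)
        # CV(RMSE) in percent; one of several metrics one might use.
        scores[label] = 100 * np.sqrt(((p - a) ** 2).mean()) / a.mean()
    return scores
```

    Sweeping `train_end` and `horizon` across many buildings and models, as the paper does, exposes cases where metric choice changes the model ranking.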

  7. Predictive Capability Maturity Model for computational modeling...

    Office of Scientific and Technical Information (OSTI)

    Sponsoring Org: USDOE. Country of Publication: United States. Language: English. Subject: 97 Mathematical Methods and Computing; 99 General and Miscellaneous/Mathematics, Computing, ...

  8. Cupola Furnace Computer Process Model

    SciTech Connect (OSTI)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "expert system" to permit optimization in real time. The program has been combined with "neural network" programs to enable very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook," Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  9. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    SciTech Connect (OSTI)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those

  10. Computable General Equilibrium Models for Sustainability Impact...

    Open Energy Info (EERE)

    Publications, Software/modeling tools. User Interface: Other. Website: iatools.jrc.ec.europa.eu/docs/ecolecon2006.pdf. Computable General Equilibrium Models for Sustainability...

  11. Climate Modeling using High-Performance Computing

    SciTech Connect (OSTI)

    Mirin, A A

    2007-02-05

    The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

  12. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Simple computer models unravel genetic stress reactions in cells. Integrated biological and computational methods provide insight into why genes are activated. February 8, 2013. [Image: molecular structure of a yeast transfer ribonucleic acid (tRNA), which carries a single amino acid to the ribosome during protein construction.] A combined ...

  13. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Simple computer models unravel genetic stress reactions in cells. Integrated biological and computational methods provide insight into why genes are activated. February 8, 2013. [Image: molecular structure of a yeast transfer ribonucleic acid (tRNA), which carries a single amino acid to the ribosome during protein construction.] A combined experimental and ...

  14. Low Mach Number Models in Computational Astrophysics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Low Mach Number Models in Computational Astrophysics. Ann Almgren, Berkeley Lab. February 4, 2014. Downloads: Almgren-nug2014.pdf (Adobe Acrobat PDF file).

  15. Appendix A - GPRA06 benefits estimates: MARKAL and NEMS model baseline cases

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    NEMS is an integrated energy model of the U.S. energy system developed by the Energy Information Administration (EIA) for forecasting and policy analysis purposes.

  16. LANL computer model boosts engine efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The KIVA model has been instrumental in helping researchers and manufacturers understand combustion processes, accelerate engine development, and improve engine design and efficiency. September 25, 2012. [Image: KIVA simulation of an experimental engine with DOHC quasi-symmetric pent-roof combustion chamber and 4 valves.]

  17. Section 23: Models and Computer Codes

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Compliance Recertification Application 2014 for the Waste Isolation Pilot Plant, Models and Computer Codes (40 CFR § 194.23). United States Department of Energy, Waste Isolation Pilot Plant, Carlsbad Field Office, Carlsbad, New Mexico. Contents: 23.0 Models and Computer Codes (40 CFR § 194.23); 23.1 Requirements; 23.2 40 CFR § 194.23(a)(1); 23.2.1 Background; 23.2.2 1998 Certification Decision; 23.2.3 Changes in the CRA-2004; 23.2.4 ...

  18. Computer Model Buildings Contaminated with Radioactive Material

    Energy Science and Technology Software Center (OSTI)

    1998-05-19

    The RESRAD-BUILD computer code is a pathway analysis model designed to evaluate the potential radiological dose incurred by an individual who works or lives in a building contaminated with radioactive material.

  19. Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering

    SciTech Connect (OSTI)

    Smith, Kandler; Graf, Peter; Jun, Myungsoo; Yang, Chuanbo; Li, Genong; Li, Shaoping; Hochman, Amit; Tselepidakis, Dimitrios

    2015-06-09

    This presentation provides an update on improvements in computational efficiency in a nonlinear multiscale battery model for computer aided engineering.

  20. Estimating Demand Response Load Impacts: Evaluation of Baseline Load Models for Non-Residential Buildings in California

    SciTech Connect (OSTI)

    Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila

    2008-01-01

    Both Federal and California state policymakers are increasingly interested in developing more standardized and consistent approaches to estimate and verify the load impacts of demand response programs and dynamic pricing tariffs. This study describes a statistical analysis of the performance of different models used to calculate the baseline electric load for commercial buildings participating in a demand-response (DR) program, with emphasis on the importance of weather effects. During a DR event, a variety of adjustments may be made to building operation, with the goal of reducing the building peak electric load. In order to determine the actual peak load reduction, an estimate of what the load would have been on the day of the event without any DR actions is needed. This baseline load profile (BLP) is key to accurately assessing the load impacts from event-based DR programs and may also impact payment settlements for certain types of DR programs. We tested seven baseline models on a sample of 33 buildings located in California. These models can be loosely categorized into two groups: (1) averaging methods, which use some linear combination of hourly load values from previous days to predict the load on the event day, and (2) explicit weather models, which use a formula based on local hourly temperature to predict the load. The models were tested both with and without morning adjustments, which use data from the day of the event to adjust the estimated BLP up or down. Key findings from this study are: - The accuracy of the BLP model currently used by California utilities to estimate load reductions in several DR programs (i.e., hourly usage in the highest 3 out of 10 previous days) could be improved substantially if a morning adjustment factor were applied for weather-sensitive commercial and institutional buildings. - Applying a morning adjustment factor significantly reduces the bias and improves the accuracy of all BLP models examined in our sample of buildings. - For buildings with low load
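    The utility model named above is concrete enough to sketch in Python: average the 3 highest-usage of the 10 previous eligible days, then optionally rescale to match the event morning. The pre-event window and the multiplicative form of the adjustment are assumptions for illustration; additive adjustments are also used in practice.

```python
import numpy as np

def baseline_3of10(prior_days, event_morning, adjust=True):
    """'Highest 3 in 10' averaging baseline with optional morning adjustment.

    `prior_days`: array of shape (10, 24), hourly kWh for the 10
    preceding eligible (non-event, non-holiday) days.
    `event_morning`: observed kWh for the event day's pre-event hours
    (assumed here to be the first hours of the day).
    """
    prior_days = np.asarray(prior_days, dtype=float)
    # Pick the 3 days with the highest total usage, then average hourly.
    top3 = prior_days[np.argsort(prior_days.sum(axis=1))[-3:]]
    blp = top3.mean(axis=0)                       # unadjusted baseline
    if adjust:
        # Scale the whole profile so the baseline matches observed
        # usage over the event morning (multiplicative adjustment).
        ratio = np.sum(event_morning) / np.sum(blp[:len(event_morning)])
        blp = blp * ratio
    return blp
```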

  1. Modeling spatial patterns in soil arsenic to estimate natural baseline concentrations

    SciTech Connect (OSTI)

    Venteris, Erik R.; Basta, Nicolas T.; Bigham, Jerry M.; Rea, Ron

    2014-05-09

    Arsenic in soil is an important public health concern. Toxicity guidelines and models based on laboratory studies (i.e., U.S. EPA's Integrated Risk Information System) should consider natural soil As concentrations to avoid unnecessary remediation burdens on society. We used soil and stream sediment samples from the USGS National Geochemical Survey database to assess the spatial distribution of natural As in a 1.16E+5 km2 area. Samples were collected at 348 soil and 144 stream locations, providing approximately one sample for every 290 km2. Sample sites were selected to minimize the potential influence of anthropogenic inputs. Samples were processed using acid digestion of whole samples (concentrated HCl and ascorbic acid), and concentrations were measured using hydride-generation atomic absorption spectrometry. Soil As ranged from 2.0 to 45.6 mg kg-1. Geostatistical techniques were used to model and map the spatial variability of As. The mean and variance at unsampled locations were estimated using sequential Gaussian simulation. Five areas of elevated concentration (above the median of 10 mg kg-1) were identified, and the relationships to geologic parent materials, glacial sedimentation patterns, and soil conditions were interpreted. Our results showed As concentrations >10 mg kg-1 were common, and >20 mg kg-1 were not unusual, for the central and west-central portions of Ohio (USA). In contrast, concentrations <4 mg kg-1 were rare. Measured concentrations typically exceeded the generic human soil As screening level of 0.39 mg/kg (1), the calculated value that corresponds to a cancer risk level of 1 in 1,000,000 for soil ingestion. Because the As content of Ohio soils is similar to many world soils, the USEPA generic soil screening level of 0.39 mg/kg is of little utility. A more useful and practical approach would be the use of natural background levels. Regional soil As patterns based on geology and biogeochemistry, not political boundaries, should be used.
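    For readers unfamiliar with the technique, here is a toy Python sketch of sequential Gaussian simulation with simple kriging under an exponential covariance model; it assumes the data have already been normal-score transformed, and the covariance parameters are placeholders rather than the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

def cov(h, sill=1.0, a=30.0):
    """Exponential covariance model; separation h and range a in km."""
    return sill * np.exp(-3.0 * np.asarray(h) / a)

def sgs(known_xy, known_z, grid_xy, max_nbrs=12):
    """Toy sequential Gaussian simulation on normal-score data: visit
    grid nodes in random order, simple-krige each node from its nearest
    conditioning points (data plus previously simulated nodes), then
    draw from the resulting conditional normal distribution."""
    pts = [tuple(p) for p in known_xy]
    vals = list(known_z)
    sim = np.empty(len(grid_xy))
    for i in rng.permutation(len(grid_xy)):
        p = np.asarray(grid_xy[i], dtype=float)
        P, V = np.asarray(pts, dtype=float), np.asarray(vals, dtype=float)
        d = np.hypot(P[:, 0] - p[0], P[:, 1] - p[1])
        near = np.argsort(d)[:max_nbrs]
        dmat = np.hypot(P[near, 0][:, None] - P[near, 0][None, :],
                        P[near, 1][:, None] - P[near, 1][None, :])
        C = cov(dmat) + 1e-9 * np.eye(len(near))   # conditioning covariances
        c0 = cov(d[near])                          # point-to-node covariances
        w = np.linalg.solve(C, c0)
        mean = w @ V[near]                    # simple kriging mean (zero trend)
        var = max(cov(0.0) - w @ c0, 1e-9)    # simple kriging variance
        sim[i] = rng.normal(mean, np.sqrt(var))
        pts.append(tuple(p)); vals.append(sim[i])  # condition later nodes
    return sim
```

    Averaging many such realizations (after back-transforming from normal scores) gives the per-cell mean and variance estimates described in the abstract.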

  2. Baseline design/economics for advanced Fischer-Tropsch technology

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to (1) develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology, (2) prepare the capital and operating costs for the baseline design, and (3) develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that the Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  3. CDF computing and event data models

    SciTech Connect (OSTI)

    Snider, F.D.; /Fermilab

    2005-12-01

    The authors discuss the computing systems, usage patterns and event data models used to analyze Run II data from the CDF-II experiment at the Tevatron collider. A critical analysis of the current implementation and design reveals some of the stronger and weaker elements of the system, which serve as lessons for future experiments. They highlight a need to maintain simplicity for users in the face of an increasingly complex computing environment.

  4. Hydropower Baseline Cost Modeling

    SciTech Connect (OSTI)

    O'Connor, Patrick W.; Zhang, Qin Fen; DeNeale, Scott T.; Chalise, Dol Raj; Centurion, Emma E.

    2015-01-01

    Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in powering existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and constructing new pumped-storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost-estimating tools that can support the national-scale evaluation of hydropower resources.

  5. Computational Tools for Predictive Modeling of Properties in...

    Office of Scientific and Technical Information (OSTI)

    Book: Computational Tools for Predictive Modeling of Properties in Complex Actinide Systems (citation details) ...

  6. Computational Fluid Dynamics Modeling of Diesel Engine Combustion...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Computational Fluid Dynamics Modeling of Diesel Engine Combustion and Emissions. 2005 Diesel Engine ...

  7. MaRIE theory, modeling and computation roadmap executive summary...

    Office of Scientific and Technical Information (OSTI)

    Conference: MaRIE theory, modeling and computation roadmap executive summary (citation details) ...

  8. Computer Modeling of Chemical and Geochemical Processes in High...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computer modeling of chemical and geochemical processes in high ionic strength solutions ... in brine ...

  9. Towards a Computational Model of a Methane Producing Archaeum...

    Office of Scientific and Technical Information (OSTI)

    Towards a Computational Model of a Methane Producing Archaeum (citation details). Authors: ...

  10. Computationally Efficient Modeling of High-Efficiency Clean Combustion...

    Broader source: Energy.gov (indexed) [DOE]

    More Documents & Publications: Computationally Efficient Modeling of High-Efficiency Clean Combustion Engines ...

  11. Computer modeling of the global warming effect

    SciTech Connect (OSTI)

    Washington, W.M.

    1993-12-31

    The state of knowledge of global warming will be presented and two aspects examined: observational evidence and a review of the state of computer modeling of climate change due to anthropogenic increases in greenhouse gases. Observational evidence, indeed, shows global warming, but it is difficult to prove that the changes are unequivocally due to the greenhouse-gas effect. Although observational measurements of global warming are subject to "correction," researchers are showing consistent patterns in their interpretation of the data. Since the 1960s, climate scientists have been making their computer models of the climate system more realistic. Models started as atmospheric models and, through the addition of oceans, surface hydrology, and sea-ice components, they then became climate-system models. Because of computer limitations and the limited understanding of the degree of interaction of the various components, present models require substantial simplification. Nevertheless, in their present state of development climate models can reproduce most of the observed large-scale features of the real system, such as wind, temperature, precipitation, ocean current, and sea-ice distribution. The use of supercomputers to advance the spatial resolution and realism of earth-system models will also be discussed.

  12. Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering (Presentation)

    SciTech Connect (OSTI)

    Kim, G.; Pesaran, A.; Smith, K.; Graf, P.; Jun, M.; Yang, C.; Li, G.; Li, S.; Hochman, A.; Tselepidakis, D.; White, J.

    2014-06-01

    This presentation discusses the significant enhancement of computational efficiency in nonlinear multiscale battery model for computer aided engineering in current research at NREL.

  13. Wild Fire Computer Model Helps Firefighters

    ScienceCinema (OSTI)

    Canfield, Jesse

    2014-06-02

    A high-tech computer model called HIGRAD/FIRETEC, the cornerstone of a collaborative effort between U.S. Forest Service Rocky Mountain Research Station and Los Alamos National Laboratory, provides insights that are essential for front-line fire fighters. The science team is looking into levels of bark beetle-induced conditions that lead to drastic changes in fire behavior and how variable or erratic the behavior is likely to be.

  14. COMPUTATIONAL MODELING OF CIRCULATING FLUIDIZED BED REACTORS

    SciTech Connect (OSTI)

    Ibrahim, Essam A

    2013-01-09

    Details of numerical simulations of two-phase gas-solid turbulent flow in the riser section of a Circulating Fluidized Bed Reactor (CFBR) using the Computational Fluid Dynamics (CFD) technique are reported. Two CFBR riser configurations are considered and modeled. Each of these two riser models consists of an inlet, an exit, connecting elbows, and a main pipe. Both riser configurations are cylindrical and have the same diameter but differ in their inlet lengths and main pipe heights to enable investigation of riser geometrical scaling effects. In addition, two types of solid particles are used in the solid phase of the two-phase gas-solid riser flow simulations to study the influence of the solid loading ratio on flow patterns. The gaseous phase in the two-phase flow is represented by standard atmospheric air. The CFD-based FLUENT software is employed to obtain steady-state and transient solutions for flow modulations in the riser. The physical dimensions, types and numbers of computational meshes, and solution methodology utilized in the present work are stated. Flow parameters, such as static and dynamic pressure, species velocity, and volume fractions, are monitored and analyzed. The computational results of the two models, under steady and transient conditions, are compared, contrasted, and discussed.
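
    A back-of-the-envelope check on one of the inputs mentioned above: the solid loading ratio can be converted into an approximate solids volume fraction. The Python sketch below assumes a no-slip, equal-velocity mixture and illustrative densities; these assumptions are ours, not the report's.

        # Estimate the solids volume fraction in a riser from the mass loading
        # ratio, assuming the gas and solid phases move at the same velocity.
        def solids_volume_fraction(loading_ratio, rho_gas=1.2, rho_solid=2500.0):
            """loading_ratio: kg solids per kg gas; densities in kg/m^3."""
            vol_solid = loading_ratio / rho_solid  # solids volume per kg of gas
            vol_gas = 1.0 / rho_gas                # gas volume per kg of gas
            return vol_solid / (vol_solid + vol_gas)

        print(solids_volume_fraction(5.0))  # ~0.0024 for air and sand-like solids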

  15. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  16. Computational social dynamic modeling of group recruitment.

    SciTech Connect (OSTI)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken; Smrcka, Julianne D.; Ko, Teresa H.; Moy, Timothy David; Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model of group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with the abstract agents, which permit the model to include social concepts (a gang) or institutional concepts (a school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents and abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to scenario development for inner-city gang recruitment.
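
    A minimal sketch of the three-level idea described above, in Python with hypothetical class names (the Seldon toolkit itself is Java-based, and its actual API is not shown here):

        import random

        class SimpleAgent:                  # level one: individuals
            def __init__(self, name):
                self.name = name
                self.affiliations = []      # network links to abstract agents

        class AbstractAgent:                # level two: social/institutional concepts
            def __init__(self, concept):    # e.g. "gang" or "school"
                self.concept = concept
                self.members = []

            def try_recruit(self, agent, p_recruit):
                # a simple agent-abstract agent interaction: stochastic recruitment
                if random.random() < p_recruit:
                    self.members.append(agent)
                    agent.affiliations.append(self)

        # Level three (cognitive agents) is future work per the abstract above.
        gang = AbstractAgent("gang")
        people = [SimpleAgent(f"agent{i}") for i in range(100)]
        for person in people:
            gang.try_recruit(person, p_recruit=0.1)
        print(len(gang.members), "recruited")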

  17. NASA technical baseline

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Curiosity's multi-mission ...

  18. Computational models of intergroup competition and warfare.

    SciTech Connect (OSTI)

    Letendre, Kenneth; Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease is an important predictor of the frequency of civil conflict, and we tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  19. ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION (Conference...

    Office of Scientific and Technical Information (OSTI)

    Clearly, if this happens in a quantum computer, it may lead to a destruction of the ... Numerical analysis of a simple model of quantum computer (2D model of 12 spins with ...

  20. Modeling-Computer Simulations At Northern Basin & Range Region...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Northern Basin & Range Region (Pritchett, 2004) ...

  1. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Pritchett, 2004) ...

  2. Modeling-Computer Simulations At Geysers Area (Goff & Decker...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Geysers Area (Goff & Decker, 1983) ...

  3. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Wisian & Blackwell, 2004) ...

  4. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1980) ...

  5. Modeling-Computer Simulations (Lewicki & Oldenburg, 2004) | Open...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations (Lewicki & Oldenburg, 2004) ...

  6. Modeling-Computer Simulations At Desert Peak Area (Wisian & Blackwell...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Desert Peak Area (Wisian & Blackwell, 2004) ...

  7. Modeling-Computer Simulations (Combs, Et Al., 1999) | Open Energy...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations (Combs, Et Al., 1999). Location: Unspecified ...

  8. Modeling-Computer Simulations At Yellowstone Region (Laney, 2005...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Yellowstone Region (Laney, 2005) ...

  9. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1979) ...

  10. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1977) ...

  11. Modeling-Computer Simulations (Ozkocak, 1985) | Open Energy Informatio...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations (Ozkocak, 1985). Location: Unspecified ...

  12. Modeling-Computer Simulations At White Mountains Area (Goff ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At White Mountains Area (Goff & Decker, 1983) ...

  13. Modeling-Computer Simulations At Stillwater Area (Wisian & Blackwell...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Stillwater Area (Wisian & Blackwell, 2004) ...

  14. Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal Area (Wilt & Haar, 1986) ...

  15. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Kennedy & Soest, 2006) ...

  16. Modeling-Computer Simulations (Ranalli & Rybach, 2005) | Open...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations (Ranalli & Rybach, 2005) ...

  17. Modeling-Computer Simulations At Raft River Geothermal Area ...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Raft River Geothermal Area (1983) ...

  18. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof of concept to investigate major-loop effects of single versus polycrystalline bulk iron and the effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single-crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in ...
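
    For reference, the Landau-Lifshitz-Gilbert equation that these phase-field models solve numerically is commonly written (in one standard form; the report's exact convention may differ) as

        \frac{\partial \mathbf{M}}{\partial t}
          = -\gamma \, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
            + \frac{\alpha}{M_s} \, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}

    where M is the magnetization, gamma the gyromagnetic ratio, alpha the dimensionless Gilbert damping parameter, M_s the saturation magnetization, and H_eff the effective field obtained from the variational derivative of the total free energy (exchange, anisotropy, magnetostatic, and applied-field contributions).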

  19. Hazard Baseline Documentation

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1995-12-04

    This standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and non-radiological hazards for all EM facilities.

  20. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    SciTech Connect (OSTI)

    Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo

    2014-04-01

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.

  1. Predictive Capability Maturity Model for computational modeling and simulation.

    SciTech Connect (OSTI)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
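
    As an illustration of how an assessment against the six elements might be recorded, the Python sketch below uses the element names from the abstract; the 0-3 level encoding and the reporting of the weakest element are our assumptions, since the PCMM itself does not prescribe an aggregate score.

        # Hypothetical PCMM assessment record; levels 0-3 stand in for the four
        # increasing maturity levels mentioned in the abstract.
        PCMM_ELEMENTS = (
            "representation and geometric fidelity",
            "physics and material model fidelity",
            "code verification",
            "solution verification",
            "model validation",
            "uncertainty quantification and sensitivity analysis",
        )

        def assess(scores):
            """scores: dict mapping every element to a maturity level 0-3."""
            missing = set(PCMM_ELEMENTS) - set(scores)
            if missing or any(not 0 <= v <= 3 for v in scores.values()):
                raise ValueError("every element needs a level in 0..3")
            weakest = min(scores, key=scores.get)
            return weakest, scores[weakest]

        example = {e: 2 for e in PCMM_ELEMENTS}
        example["model validation"] = 1
        print(assess(example))  # -> ('model validation', 1)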

  2. Review of computational thermal-hydraulic modeling

    SciTech Connect (OSTI)

    Keefer, R.H.; Keeton, L.W.

    1995-12-31

    Corrosion of heat transfer tubing in nuclear steam generators has been a persistent problem in the power generation industry, assuming many different forms over the years depending on chemistry and operating conditions. Whatever the corrosion mechanism, a fundamental understanding of the process is essential to establish effective management strategies. To gain this fundamental understanding requires an integrated investigative approach that merges technology from many diverse scientific disciplines. An important aspect of an integrated approach is characterization of the corrosive environment at high temperature. This begins with a thorough understanding of local thermal-hydraulic conditions, since they affect deposit formation, chemical concentration, and ultimately corrosion. Computational Fluid Dynamics (CFD) can and should play an important role in characterizing the thermal-hydraulic environment and in predicting the consequences of that environment. The evolution of CFD technology now allows accurate calculation of steam generator thermal-hydraulic conditions and the resulting sludge deposit profiles. Similar calculations are also possible for model boilers, so that tests can be designed to be prototypic of the heat exchanger environment they are supposed to simulate. This paper illustrates the utility of CFD technology by way of examples in each of these two areas. This technology can be further extended to produce more detailed local calculations of the chemical environment in support plate crevices, beneath thick deposits on tubes, and deep in tubesheet sludge piles. Knowledge of this local chemical environment will provide the foundation for development of mechanistic corrosion models, which can be used to optimize inspection and cleaning schedules and focus the search for a viable fix.

  3. Grocery 2009 TSD Miami Baseline | Open Energy Information

    Open Energy Info (EERE)

    Model Name: Grocery 2009 TSD Miami Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1-2004; Model Year: 2009; IDF file ...

  4. Grocery 2009 TSD Chicago Baseline | Open Energy Information

    Open Energy Info (EERE)

    Model Name: Grocery 2009 TSD Chicago Baseline; Building Type: Food Sales; Model Type: Baseline Model; Target Type: ASHRAE 90.1-2004; Model Year: 2009; IDF file ...

  5. Modeling of Geothermal Reservoirs: Fundamental Processes, Computer...

    Open Energy Info (EERE)

    Journal Article: Modeling of Geothermal Reservoirs: Fundamental Processes, Computer Simulation and Field Applications ...

  6. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Dixie Valley Geothermal Area (Wannamaker, Et Al., 2006) ...

  7. Modeling-Computer Simulations At Obsidian Cliff Area (Hulen,...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Obsidian Cliff Area (Hulen, Et Al., 2003) ...

  8. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Laney, 2005) ...

  9. Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal Area (Roberts, Et Al., 1995) ...

  10. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Pribnow, Et Al., 2003) ...

  11. Modeling-Computer Simulations At Hawthorne Area (Lazaro, Et Al...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Hawthorne Area (Lazaro, Et Al., 2010) ...

  12. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Pritchett, 2004) ...

  13. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area (Brown & DuTeaux, 1997) ...

  14. Modeling-Computer Simulations At Coso Geothermal Area (1980)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (1980). Location: Coso ...

  15. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Newman, Et Al., 2006) ...

  16. Scientists use world's fastest computer to model materials under...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Materials scientists are for the first time attempting to ...

  17. Modeling-Computer Simulations At The Needles Area (Bell & Ramelli...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At The Needles Area (Bell & Ramelli, 2009) ...

  18. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area (Goff & Decker, 1983) ...

  19. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Farrar, Et Al., 2003) ...

  20. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Biasi, Et Al., 2009) ...

  1. Modeling-Computer Simulations At Valles Caldera - Sulphur Springs...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Sulphur Springs Geothermal Area (Roberts, Et Al., ...

  2. Modeling-Computer Simulations At Nw Basin & Range Region (Pritchett...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Nw Basin & Range Region (Pritchett, 2004) ...

  3. LANL researchers use computer modeling to study HIV | National...

    National Nuclear Security Administration (NNSA)


  4. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Long Valley Caldera Geothermal Area (Tempel, Et Al., 2011) ...

  5. Modeling-Computer Simulations At Nw Basin & Range Region (Biasi...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Nw Basin & Range Region (Biasi, Et Al., 2009) ...

  6. Modeling-Computer Simulations At Coso Geothermal Area (2000)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (2000). Location: Coso ...

  7. Modeling-Computer Simulations At Northern Basin & Range Region...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Northern Basin & Range Region (Biasi, Et Al., 2009) ...

  8. Modeling-Computer Simulations At Valles Caldera - Sulphur Springs...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Valles Caldera - Sulphur Springs Geothermal Area (Wilt & Haar, 1986) ...

  9. Modeling-Computer Simulations At Akutan Fumaroles Area (Kolker...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Akutan Fumaroles Area (Kolker, Et Al., 2010) ...

  10. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Walker-Lane Transitional Zone Region (Biasi, Et Al., 2009) ...

  11. Modeling-Computer Simulations At Coso Geothermal Area (1999)...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Coso Geothermal Area (1999). Location: Coso ...

  12. Modeling-Computer Simulations At Fish Lake Valley Area (Deymonaz...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Fish Lake Valley Area (Deymonaz, Et Al., 2008) ...

  13. Modeling-Computer Simulations At Nevada Test And Training Range...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At Nevada Test And Training Range Area (Sabin, Et Al., 2004) ...

  14. Martin Karplus and Computer Modeling for Chemical Systems

    Office of Scientific and Technical Information (OSTI)

    Additional information about Martin Karplus, computer modeling, and chemical systems is available in electronic documents and on the Web. Documents: Comparison of 3D ...

  15. New partnership uses advanced computer science modeling to address...

    National Nuclear Security Administration (NNSA)

    Several national laboratories and institutions have joined ...

  16. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1991

    SciTech Connect (OSTI)

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  17. Unsolicited Projects in 2012: Research in Computer Architecture, Modeling, and Evolving MPI for Exascale

    Office of Science (SC) Website


  18. Ambient temperature modelling with soft computing techniques

    SciTech Connect (OSTI)

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni; De Felice, Matteo

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
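
    A minimal, self-contained sketch of the hybrid scheme the abstract describes: a genetic algorithm whose population is partly initialised by backpropagation-style gradient refinement. The toy data, network size, and GA settings are assumptions for illustration, not the authors' configuration.

        import numpy as np

        rng = np.random.default_rng(0)

        # toy stand-in for temperature records: 2 inputs -> 1 output (assumption)
        X = rng.uniform(-1.0, 1.0, (64, 2))
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

        N_W = 2 * 8 + 8 + 8 + 1  # weights of a 2-8-1 network, flattened (33)

        def predict(w, X):
            W1, b1 = w[:16].reshape(2, 8), w[16:24]
            W2, b2 = w[24:32], w[32]
            return np.tanh(X @ W1 + b1) @ W2 + b2

        def mse(w):
            return float(np.mean((predict(w, X) - y) ** 2))

        def backprop_refine(w, lr=0.05, steps=100):
            # numerical gradients keep the sketch short; real BP is analytic
            for _ in range(steps):
                g = np.zeros_like(w)
                for i in range(w.size):
                    e = np.zeros_like(w); e[i] = 1e-5
                    g[i] = (mse(w + e) - mse(w - e)) / 2e-5
                w = w - lr * g
            return w

        # GA population: the first few individuals are BP-initialised
        pop = [rng.normal(0.0, 0.5, N_W) for _ in range(20)]
        pop[:3] = [backprop_refine(p) for p in pop[:3]]

        for _ in range(30):                    # simple GA generations
            pop.sort(key=mse)
            parents, children = pop[:10], []
            for _ in range(10):
                a, b = rng.choice(10, size=2, replace=False)
                mask = rng.random(N_W) < 0.5   # uniform crossover
                child = np.where(mask, parents[a], parents[b])
                children.append(child + rng.normal(0.0, 0.02, N_W))  # mutation
            pop = parents + children

        print("best MSE:", mse(min(pop, key=mse)))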

  19. Develop baseline computational model for proactive welding stress management to suppress helium induced cracking during weld repair

    Broader source: Energy.gov [DOE]

    There are over 100 nuclear power plants operating in the U.S., which generate approximately 20% of the nation’s electricity. These plants range from 15 to 40 years old. Extending the service lives...

  20. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  1. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  2. Computational Fluid Dynamics Modeling of Diesel Engine Combustion and Emissions

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2005 Diesel Engine Emissions Reduction (DEER) Conference Presentations and Posters: 2005_deer_reitz.pdf (682.47 KB)

  3. Modeling-Computer Simulations | Open Energy Information

    Open Energy Info (EERE)

    ... the risk of inaccurate predictions. Potential Pitfalls: Uncertainties in initial reservoir conditions and other model inputs can cause inaccuracies in simulations, which ...

  4. Computational Model of Magnesium Deposition and Dissolution for Property Determination via Cyclic Voltammetry

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Joint Center for Energy Storage Research, June 23, 2016, Research Highlights. Figure: (top) example distributions of the charge transfer coefficient and standard heterogeneous rate constant, obtained from fitting; (bottom) comparison between experimental and simulated voltammograms, demonstrating good agreement. Scientific Achievement: A computationally ...

  5. Computational Modeling for the American Chemical Society | GE...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  6. Scientists model brain structure to help computers recognize...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The team tried developing a computer model based on human neural structure and function, ... Introspectively, we know that the human brain solves this problem very well. We only have ...

  7. Computational model of miniature pulsating heat pipes.

    SciTech Connect (OSTI)

    Martinez, Mario J.; Givler, Richard C.

    2013-01-01

    The modeling work described herein represents Sandia National Laboratories' (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground-plane (TGP), a device of planar configuration that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, which was demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  8. Bayesian approaches for combining computational model output and physical observations (Conference)

    Office of Scientific and Technical Information (OSTI)

    Authors: Higdon, David M.; Lawrence, Earl; Heitmann, Katrin; Habib, Salman (Los Alamos National Laboratory; ANL). Publication Date: 2011-07-25. OSTI Identifier: 1084581 ...

  9. HIV virus spread and evolution studied through computer modeling

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    This approach distinguishes between susceptible and infected individuals to capture the full infection history, including contact tracing data for infected individuals. November 19, 2013. Scanning electron micrograph of HIV-1 budding (in green) from cultured lymphocytes; the image has been colored to highlight important features.

  10. Computer modeling reveals how surprisingly potent hepatitis C drug works

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    A study reveals how daclatasvir targets one of the hepatitis C virus's proteins and causes the fastest viral decline ever seen with anti-HCV drugs, within 12 hours of treatment. February 19, 2013.

  11. Use Computational Model to Design and Optimize Welding Conditions to Suppress Helium Cracking during Welding

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Today, welding is widely used for repair, maintenance and upgrade of nuclear reactor components. As a critical technology to extend the service life of nuclear power plants beyond 60 years, weld technology must be ...

  12. Computational social network modeling of terrorist recruitment.

    SciTech Connect (OSTI)

    Berry, Nina M.; Turnley, Jessica Glicken; Smrcka, Julianne D.; Ko, Teresa H.; Moy, Timothy David; Wu, Benjamin C.

    2004-10-01

    The Seldon terrorist model represents a multi-disciplinary approach to developing modeling software for the study of terrorist recruitment and group formation. The need to incorporate aspects of social science contributed significantly to the vision of the resulting Seldon toolkit. The unique addition of an abstract agent category provided a means of capturing social concepts (cliques, mosques, etc.) in a manner that represents their social conceptualization and not simply a physical or economic institution. This paper provides an overview of the Seldon terrorist model developed to study the formation of cliques, which are used as the major recruitment entity for terrorist organizations.

  13. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this report provides a summary overview of DOE's projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS), which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  14. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1992

    SciTech Connect (OSTI)

    Not Available

    1992-12-31

    Bechtel, with Amoco as the main subcontractor, initiated a study on September 26, 1991, for the US Department of Energy's (DOE's) Pittsburgh Energy Technology Center (PETC) to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology. This 24-month study, with an approved budget of $2.3 million, is being performed under DOE Contract Number AC22-91PC90027. (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  15. New Computer Model Pinpoints Prime Materials for Carbon Capture

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    July 17, 2012. NERSC Contact: Linda Vu, lvu@lbl.gov, +1 510 495 2402; UC Berkeley Contact: Robert Sanders, rsanders@berkeley.edu. Figure: one of the 50 best zeolite structures for capturing carbon dioxide. Zeolite is a porous solid made of silicon dioxide, or quartz. In the model, the red balls are oxygen, the tan balls are silicon. The blue-green area is where carbon ...

  16. Integrated Multiscale Modeling of Molecular Computing Devices

    SciTech Connect (OSTI)

    Weinan E

    2012-03-29

    The main bottleneck in modeling transport in molecular devices is to develop the correct formulation of the problem and efficient algorithms for analyzing the electronic structure and dynamics using, for example, time-dependent density functional theory. We have divided this task into several steps. The first step is developing the right mathematical formulation and numerical algorithms for analyzing the electronic structure using density functional theory. The second step is to study time-dependent density functional theory, particularly the far-field boundary conditions. The third step is to study electronic transport in molecular devices. We are now at the end of the first step. Under DOE support, we have made substantial progress in developing linear scaling and sub-linear scaling algorithms for electronic structure analysis. Although there has been a huge amount of effort in the past on developing linear scaling algorithms, most of the algorithms developed suffer from a lack of robustness and controllable accuracy. We have made the following progress: (1) We have analyzed thoroughly the localization properties of the wave-functions. We have developed a clear understanding of the physical as well as mathematical origin of the decay properties. One important conclusion is that even for metals, one can choose wavefunctions that decay faster than any algebraic power. (2) We have developed algorithms that make use of these localization properties. Our algorithms are based on non-orthogonal formulations of the density functional theory. Our key contribution is to add a localization step into the algorithm. The addition of this localization step makes the algorithm quite robust and much more accurate. Moreover, we can control the accuracy of these algorithms by changing the numerical parameters. (3) We have considerably improved the Fermi operator expansion (FOE) approach. Through pole expansion, we have developed the optimal scaling FOE algorithm.
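
    Schematically, the Fermi operator expansion mentioned above writes the density matrix as a smooth function of the Hamiltonian and approximates that function by a short rational (pole) expansion; in one common form (our notation, not necessarily the authors'),

        \rho = f_\beta(H - \mu I), \qquad
        f_\beta(x) = \frac{1}{1 + e^{\beta x}}, \qquad
        f_\beta(H - \mu I) \approx \sum_{l=1}^{P} \frac{\omega_l}{H - z_l I}

    with complex poles z_l and weights omega_l. Each term requires only sparse linear solves with H, which is what lets the expansion exploit the locality (decay) properties of the wavefunctions discussed in the abstract.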

  17. Annual Technology Baseline

    Broader source: Energy.gov [DOE]

    The National Renewable Energy Laboratory is conducting a study sponsored by the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs) and a diverse set of potential futures (standard scenarios), initially for electric-sector analysis. The primary product of the Annual Technology Baseline (ATB) project includes detailed cost and performance data (both current and projected) for both renewable and conventional technologies. The data are presented in MS Excel.

  18. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
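
    A toy sketch of the Bayesian-inference ingredient: a grid posterior for one model parameter given noisy observations, with a cheap stand-in for an expensive simulation. Everything here (data, noise level, model form) is an illustrative assumption, not the report's methodology.

        import numpy as np

        # toy observations of a process the computational model approximates
        x = np.array([1.0, 2.0, 3.0, 4.0])
        y_obs = np.array([2.1, 3.9, 6.2, 7.8])

        def model(theta, x):
            return theta * x            # stand-in for an expensive simulation

        theta = np.linspace(0.0, 4.0, 401)
        sigma = 0.3                     # assumed observation-error std. dev.

        # flat prior: the posterior is proportional to the Gaussian likelihood
        loglik = np.array([-0.5 * np.sum((y_obs - model(t, x)) ** 2) / sigma ** 2
                           for t in theta])
        post = np.exp(loglik - loglik.max())
        post /= np.trapz(post, theta)

        mean = np.trapz(theta * post, theta)
        sd = np.sqrt(np.trapz((theta - mean) ** 2 * post, theta))
        print(f"posterior mean {mean:.3f} +/- {sd:.3f}")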

  19. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1994

    SciTech Connect (OSTI)

    1994-01-01

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. During the reporting period, work progressed on Tasks 1, 4, 5, 6 and 7. This report covers work done during the period and consists of six sections: introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; Task 5, sensitivity studies using the PFS model; Task 6, documentation of the PFS model and development of a DOE training session on its use; and the project management and staffing report.

  20. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect (OSTI)

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
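
    For concreteness, one frugal local method of the kind referred to above is a central-difference, one-parameter-at-a-time sensitivity analysis, needing only 2k + 1 model runs for k parameters. The stand-in model and the scaling convention below are our assumptions, not anything prescribed by the paper.

        import numpy as np

        def scaled_sensitivities(model, p0, rel_step=0.01):
            """Central differences: 2k extra model runs for k parameters."""
            p0 = np.asarray(p0, dtype=float)
            base = model(p0)                    # one baseline run
            out = np.empty(p0.size)
            for i in range(p0.size):
                dp = rel_step * (abs(p0[i]) if p0[i] != 0 else 1.0)
                up, dn = p0.copy(), p0.copy()
                up[i] += dp
                dn[i] -= dp
                dydp = (model(up) - model(dn)) / (2.0 * dp)
                out[i] = dydp * p0[i] / base    # dimensionless scaled sensitivity
            return out

        # stand-in for an expensive groundwater model (assumption)
        f = lambda p: p[0] ** 2 + 3.0 * p[1] + 0.1 * p[0] * p[1]
        print(scaled_sensitivities(f, [2.0, 5.0]))  # -> [0.45, 0.8]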

  1. Practical Use of Computationally Frugal Model Analysis Methods

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  2. Computer Modeling VRF Heat Pumps in Commercial Buildings using EnergyPlus

    SciTech Connect (OSTI)

    Raustad, Richard

    2013-06-01

    Variable Refrigerant Flow (VRF) heat pumps are increasingly used in commercial buildings in the United States. Monitored energy use of field installations have shown, in some cases, savings exceeding 30% compared to conventional heating, ventilating, and air-conditioning (HVAC) systems. A simulation study was conducted to identify the installation or operational characteristics that lead to energy savings for VRF systems. The study used the Department of Energy EnergyPlus building simulation software and four reference building models. Computer simulations were performed in eight U.S. climate zones. The baseline reference HVAC system incorporated packaged single-zone direct-expansion cooling with gas heating (PSZ-AC) or variable-air-volume systems (VAV with reheat). An alternate baseline HVAC system using a heat pump (PSZ-HP) was included for some buildings to directly compare gas and electric heating results. These baseline systems were compared to a VRF heat pump model to identify differences in energy use. VRF systems combine multiple indoor units with one or more outdoor unit(s). These systems move refrigerant between the outdoor and indoor units, which eliminates the need for duct work in most cases. Since many applications install duct work in unconditioned spaces, this leads to installation differences between VRF systems and conventional HVAC systems. To characterize installation differences, a duct heat gain model was included to identify the energy impacts of installing ducts in unconditioned spaces. The configuration of variable refrigerant flow heat pumps will ultimately eliminate or significantly reduce energy use due to duct heat transfer. Fan energy is also studied to identify savings associated with non-ducted VRF terminal units. VRF systems incorporate a variable-speed compressor which may lead to operational differences compared to single-speed compression systems. To characterize operational differences, the computer model performance curves used ...
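
    The duct heat gain component of the methodology can be illustrated with a single-segment, steady-state estimate, Q = U * A * dT. The numbers below are illustrative assumptions, not values from the study.

        def duct_heat_gain(u_value, area_m2, t_ambient_c, t_supply_c):
            """Steady-state heat gain (W) across a duct wall: Q = U * A * dT."""
            return u_value * area_m2 * (t_ambient_c - t_supply_c)

        # a supply duct in a hot unconditioned attic carrying cool air
        q = duct_heat_gain(u_value=1.5, area_m2=20.0,
                           t_ambient_c=35.0, t_supply_c=13.0)
        print(f"duct gain: {q:.0f} W "
              f"({q / 10_000.0:.0%} of a nominal 10 kW cooling coil)")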

  3. TWRS baseline system description

    SciTech Connect (OSTI)

    Lee, A.K.

    1995-03-28

    This document provides a description of the baseline system conceptualized for remediating the tank waste stored within the Hanford Site. Remediation of the tank waste will be performed by the Tank Waste Remediation System (TWRS). This baseline system description (BSD) document has been prepared to describe the current planning basis for the TWRS for accomplishing the tank waste remediation functions. The BSD document is not intended to prescribe firm program management strategies for implementing the TWRS. The scope of the TWRS Program includes managing existing facilities, developing technology for new systems; building, testing and operating new facilities; and maintaining the system. The TWRS Program will manage the system used for receiving, safely storing, maintaining, treating, and disposing onsite, or packaging for offsite disposal, all tank waste. The scope of the TWRS Program encompasses existing facilities such as waste storage tanks, evaporators, pipelines, and low-level radioactive waste treatment and disposal facilities. It includes support facilities that comprise the total TWRS infrastructure, including upgrades to existing facilities or equipment and the addition of new facilities.

  4. New partnership uses advanced computer science modeling to address climate change

    National Nuclear Security Administration (NNSA)

    Several national laboratories and institutions have joined forces to develop and apply the most complete climate and Earth system model to address the most challenging and demanding climate change issues. Accelerated Climate Modeling for Energy, or ACME, is designed to accelerate the development and application of fully ...

  5. District-heating strategy model: computer programmer's manual

    SciTech Connect (OSTI)

    Kuzanek, J.F.

    1982-05-01

    The US Department of Housing and Urban Development (HUD) and the US Department of Energy (DOE) cosponsor a program aimed at increasing the number of district heating and cooling (DHC) systems. Such systems can reduce the amount and costs of fuels used to heat and cool buildings in a district. Twenty-eight communities have agreed to aid HUD in a national feasibility assessment of DHC systems. The HUD/DOE program entails technical assistance by Argonne National Laboratory and Oak Ridge National Laboratory. The assistance includes a computer program, called the district heating strategy model (DHSM), that performs preliminary calculations to analyze potential DHC systems. This report describes the general capabilities of the DHSM, provides historical background on its development, and explains the computer installation and operation of the model, including the data file structures and the options. Sample problems illustrate the structure of the various input data files and the interactive computer-output listings. The report is written primarily for computer programmers responsible for installing the model on their computer systems, entering data, running the model, and implementing local modifications to the code.

  6. Multi-epoch very long baseline interferometric observations of the nuclear starburst region of NGC 253: Improved modeling of the supernova and star formation rates

    SciTech Connect (OSTI)

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Lenc, E.

    2014-01-01

    The results of multi-epoch observations of the southern starburst galaxy, NGC 253, with the Australian Long Baseline Array at 2.3 GHz are presented. As with previous radio interferometric observations of this galaxy, no new sources were discovered. By combining the results of this survey with Very Large Array observations at higher frequencies from the literature, spectra were derived and a free-free absorption model was fitted to 20 known sources in NGC 253. The results were found to be consistent with previous studies. The supernova remnant 5.48-43.3 was imaged with the highest sensitivity and resolution to date, revealing a two-lobed morphology. Comparisons with previous observations of similar resolution give an upper limit of 10^4 km s^-1 for the expansion speed of this remnant. We derive a supernova rate of <0.2 yr^-1 for the inner 300 pc using a model that improves on previous methods by incorporating an improved radio supernova peak luminosity distribution and by making use of multi-wavelength radio data spanning 21 yr. A star formation rate of SFR(M >= 5 M_sun) < 4.9 M_sun yr^-1 was also estimated using the standard relation between supernova and star formation rates. Our improved estimates of supernova and star formation rates are consistent with studies at other wavelengths. The results of our study point to the possible existence of a small population of undetected supernova remnants, suggesting a low rate of radio supernova production in NGC 253.
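
    As a sanity check, the two limits quoted above imply the proportionality factor in the standard supernova-rate-to-star-formation-rate relation used here:

      # Derived directly from the abstract's two upper limits.
      nu_sn_max = 0.2    # supernova rate upper limit, yr^-1
      sfr_max = 4.9      # star formation rate upper limit, M_sun yr^-1
      q = sfr_max / nu_sn_max
      print(f"implied conversion factor: ~{q:.1f} M_sun of star formation per supernova")  # ~24.5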

  7. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: (1) Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. (2) Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. (3) Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation (PFS) model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use. Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary. Task 1--Baseline Design and Alternatives. Task 4--Process Flowsheet Simulation (PFS) Model, and Project Management and Staffing Report.

  8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1993

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM- 5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case, and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline and alternative economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; and Task 7, perform project management, technical coordination and other miscellaneous support functions. This report covers work done during the period and consists of four sections: Introduction and summary; Task 1, baseline design and alternatives; Task 4, process flowsheet simulation (PFS) model; and project management and staffing report.

  9. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, October--December 1994

    SciTech Connect (OSTI)

    1993-12-31

    The objectives of the study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that PETC will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks. Task 1: Establish the baseline design and alternatives. Task 2: Evaluate baseline and alternative economics. Task 3: Develop engineering design criteria. Task 4: Develop a process flowsheet simulation model. Task 5: Perform sensitivity studies using the PFS model. Task 6: Document the PFS model and develop a DOE training session on its use, and Task 7: Perform project management, technical coordination and other miscellaneous support functions. During the reporting period, work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of four sections: Introduction and Summary. Task 1--Baseline Design and Alternatives. Task 4--Process Flowsheet Simulation Model. Project Management and Staffing Report.

  10. Hazard baseline documentation

    SciTech Connect (OSTI)

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  11. Computer Modeling of Carbon Metabolism Enables Biofuel Engineering (Fact Sheet)

    SciTech Connect (OSTI)

    Not Available

    2011-09-01

    In an effort to reduce the cost of biofuels, the National Renewable Energy Laboratory (NREL) has merged biochemistry with modern computing and mathematics. The result is a model of carbon metabolism that will help researchers understand and engineer the process of photosynthesis for optimal biofuel production.

  12. Computer Modeling of Saltstone Landfills by Intera Environmental Consultants

    SciTech Connect (OSTI)

    Albenesius, E.L.

    2001-08-09

    This report summarizes the computer modeling studies and how the results of these studies were used to estimate contaminant releases to the groundwater. These modeling studies were used to improve saltstone landfill designs and are the basis for the current reference design. With the reference landfill design, EPA Drinking Water Standards can be met for all chemicals and radionuclides contained in Savannah River Plant waste salts.

  13. Scientists use world's fastest computer to model materials under extreme conditions

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Materials scientists are for the first time attempting to create atomic-scale models that describe how voids are created, grow, and merge. October 30, 2009. Los Alamos National Laboratory sits on top of a once-remote mesa in northern New Mexico with the Jemez mountains as a backdrop to research and innovation covering multi-disciplines from bioscience, sustainable

  14. Parallel Computation of the Regional Ocean Modeling System (ROMS)

    SciTech Connect (OSTI)

    Wang, P; Song, Y T; Chao, Y; Zhang, H

    2005-04-05

    The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.
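
    The sketch below illustrates the kind of building block an MPI port of a shared-memory grid code requires: a 1-D domain decomposition with ghost-row (halo) exchange between neighboring ranks. It is a generic mpi4py pattern, not the actual ROMS implementation; the array sizes and neighbor layout are assumptions for illustration.

      # Minimal halo-exchange sketch for a 1-D row decomposition (mpi4py).
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      nlocal = 8                                       # interior rows owned by this rank (assumed)
      field = np.full((nlocal + 2, 16), float(rank))   # +2 ghost rows, 16 columns (assumed)

      up = rank - 1 if rank > 0 else MPI.PROC_NULL
      down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

      # send first interior row up; receive lower ghost row from the rank below
      comm.Sendrecv(sendbuf=field[1, :].copy(), dest=up,
                    recvbuf=field[nlocal + 1, :], source=down)
      # send last interior row down; receive upper ghost row from the rank above
      comm.Sendrecv(sendbuf=field[nlocal, :].copy(), dest=down,
                    recvbuf=field[0, :], source=up)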

  15. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1992

    SciTech Connect (OSTI)

    1992-12-31

    The objectives of this study are to: develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flowsheet simulation model. The baseline design, the economic analysis and computer model will be major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. The study has been divided into seven major tasks: Task 1, establish the baseline design and alternatives; Task 2, evaluate baseline economics; Task 3, develop engineering design criteria; Task 4, develop a process flowsheet simulation (PFS) model; Task 5, perform sensitivity studies using the PFS model; Task 6, document the PFS model and develop a DOE training session on its use; Task 7, perform project management, technical coordination and other miscellaneous support functions. During the reporting period work progressed on Tasks 1, 4 and 7. This report covers work done during the period and consists of five sections: introduction and summary; preliminary design for syngas production (Task 1); preliminary F-T reaction loop design (Task 1); development of a process simulation model (Task 4); and key personnel staffing report (Task 7).

  16. Computational Science Research in Support of Petascale Electromagnetic Modeling

    SciTech Connect (OSTI)

    Lee, L.-Q.; Akcelik, V; Ge, L; Chen, S; Schussman, G; Candel, A; Li, Z; Xiao, L; Kabel, A; Uplenchwar, R; Ng, C; Ko, K; /SLAC

    2008-06-20

    Computational science research components were vital parts of the SciDAC-1 accelerator project and are continuing to play a critical role in newly-funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in the area of computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented, which include shape determination of superconducting RF cavities, mesh-based multilevel preconditioner in solving highly-indefinite linear systems, moving window using h- or p- refinement for time-domain short-range wakefield calculations, and improved scalable application I/O.

  17. Wind energy conversion system analysis model (WECSAM) computer program documentation

    SciTech Connect (OSTI)

    Downey, W T; Hendrick, P L

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)

  18. A New Perspective for the Calibration of Computational Predictor Models.

    SciTech Connect (OSTI)

    Crespo, Luis Guillermo

    2014-11-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it is a description of the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain (i.e., roll-up and extrapolation).
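
    A minimal sketch of the IPM idea for a linear response: choose parameter intervals of smallest average spread whose envelope contains every observation. For a linear model this reduces to a linear program. The formulation below (toy data, x >= 0 assumed so the envelope bounds are monotone in the parameters) illustrates the optimization strategy, not the paper's code.

      # Interval Predictor Model for y ~ a + b*x via linear programming.
      import numpy as np
      from scipy.optimize import linprog

      x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
      y = np.array([1.1, 1.9, 2.2, 3.4, 3.9])     # toy observations

      # variables z = [a_lo, a_hi, b_lo, b_hi]; minimize average envelope spread
      xbar = x.mean()
      c = np.array([-1.0, 1.0, -xbar, xbar])

      A_ub, b_ub = [], []
      for xi, yi in zip(x, y):
          A_ub.append([1.0, 0.0, xi, 0.0]);  b_ub.append(yi)    # lower envelope <= yi
          A_ub.append([0.0, -1.0, 0.0, -xi]); b_ub.append(-yi)  # yi <= upper envelope
      A_ub.append([1.0, -1.0, 0.0, 0.0]); b_ub.append(0.0)      # a_lo <= a_hi
      A_ub.append([0.0, 0.0, 1.0, -1.0]); b_ub.append(0.0)      # b_lo <= b_hi

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 4)
      a_lo, a_hi, b_lo, b_hi = res.x
      print(f"intercept in [{a_lo:.2f}, {a_hi:.2f}], slope in [{b_lo:.2f}, {b_hi:.2f}]")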

  19. Fast, narrow-band computer model for radiation calculations

    SciTech Connect (OSTI)

    Yan, Z.; Holmstedt, G.

    1997-01-01

    A fast, narrow-band computer model, FASTNB, which predicts the radiation intensity in a general nonisothermal and nonhomogeneous combustion environment, has been developed. The spectral absorption coefficients of the combustion products, including carbon dioxide, water vapor, and soot, are calculated based on the narrow-band model. FASTNB provides an accurate calculation at reasonably high speed. Compared with Grosshandler's narrow-band model, RADCAL, which has been verified quite extensively against experimental measurements, FASTNB is more than 20 times faster and gives almost exactly the same results.
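
    For flavor, the sketch below evaluates a Goody-type statistical narrow-band transmissivity, the kind of per-band building block that codes such as FASTNB and RADCAL assemble over many spectral intervals. The functional form follows the Goody model as commonly stated in radiation texts; since the abstract does not give FASTNB's exact formulation, both the form and the parameter values here are assumptions.

      # Goody-type statistical narrow-band transmissivity (illustrative only).
      import math

      def goody_transmissivity(k, u, beta):
          """Mean band transmissivity tau = exp(-k*u / sqrt(1 + k*u/beta)),
          with k the mean absorption coefficient per unit path, u the optical
          path length, and beta a line-overlap parameter."""
          ku = k * u
          return math.exp(-ku / math.sqrt(1.0 + ku / beta))

      # one made-up band for a CO2-like absorber
      print(goody_transmissivity(k=0.3, u=2.0, beta=0.5))   # ~0.67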

  20. The origins of computer weather prediction and climate modeling

    SciTech Connect (OSTI)

    Lynch, Peter [Meteorology and Climate Centre, School of Mathematical Sciences, University College Dublin, Belfield (Ireland)], E-mail: Peter.Lynch@ucd.ie

    2008-03-20

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  1. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    SciTech Connect (OSTI)

    Zhou, Shujia; Duffy, Daniel; Clune, Thomas; Suarez, Max; Williams, Samuel; Halem, Milton

    2009-01-10

    The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive to fulfill this requirement. However, the Cell's characteristics, 256 KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.

  2. Final Report: Center for Programming Models for Scalable Parallel Computing

    SciTech Connect (OSTI)

    Mellor-Crummey, John

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  3. Multiscale Modeling of Malaria | Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Authors: Karniadakis, G.E. Parasitic infectious diseases like malaria and certain hereditary hematologic disorders are often associated with major changes in the shape and viscoelastic properties of red blood cells. Such changes can disrupt blood flow and, possibly, brain perfusion, as in the case of cerebral malaria. In recent work on stochastic multiscale models, in conjunction with large-scale parallel computing, we were able to quantify, for the first time, the main biophysical

  4. Advanced Reactor Thermal Hydraulic Modeling | Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Temperature distribution illustrating thermal striping in a T-junction, computed on Intrepid with Nek5000 and visualized on Eureka with VisIt at the ALCF. Paul Fischer (ANL), Aleks Obabko (ANL), and Hank Childs (LBNL). PI Name: Paul Fischer. PI Email: fischer@mcs.anl.gov. Institution: Argonne National Laboratory. Allocation Program: INCITE. Allocation Hours at ALCF: 25 Million. Year: 2012. Research Domain: Energy Technologies. The DOE Nuclear

  5. Natural Abundance 17O Nuclear Magnetic Resonance and Computational Modeling Studies of Lithium Based Liquid Electrolytes

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Joint Center for Energy Storage Research, March 14, 2015, Research Highlights. (Top) Example of natural abundance 17O NMR spectra of LiTFSI in a mixture of EC, PC and EMC (4:1:5 by weight). (Bottom) The solvation structure of LiTFSI derived from the results obtained by both NMR and quantum chemistry calculations. Scientific

  6. Martin Karplus and Computer Modeling for Chemical Systems

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Resources with Additional Information: Karplus Equation. Martin Karplus (portrait by N. Pitt, 9/10/03), the Theodore William Richards Professor of Chemistry Emeritus at Harvard, is one of three winners of the 2013 Nobel Prize in chemistry... The 83-year-old Vienna-born theoretical chemist, who is also affiliated with the Université de Strasbourg, Strasbourg, France, is a 1951 graduate of Harvard College and earned his

  7. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  8. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  9. ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION G. BERMAN; ET...

    Office of Scientific and Technical Information (OSTI)

    ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION, G. Berman et al. Subject categories: 71 Classical and Quantum Mechanics, General Physics; 99 General and Miscellaneous / Mathematics, Computing, and...

  10. Computer Modeling of Violent Intent: A Content Analysis Approach

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.
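
    A generic illustration of the supervised-learning step (not the authors' actual content-analysis pipeline): train a bag-of-words classifier on labeled documents and score unseen text for violent-intent-related rhetoric. The toy documents and labels below are invented.

      # Toy text-classification sketch (scikit-learn), illustrative only.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      docs = ["we must strike back against the oppressors",
              "join us for a peaceful community rally",
              "prepare for armed struggle",
              "volunteers needed for the charity food drive"]
      labels = [1, 0, 1, 0]   # 1 = violent-intent rhetoric (toy labels)

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      model.fit(docs, labels)
      # probability that an unseen document carries violent-intent markers
      print(model.predict_proba(["take up arms with us"])[:, 1])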

  11. Hanford Site technical baseline database

    SciTech Connect (OSTI)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  12. HEV America Baseline Test Sequence

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    BASELINE TEST SEQUENCE, Revision 1, September 1, 2006. Prepared by Electric Transportation Applications (prepared by Roberta Brayer; approved by Donald B. Karner). ©2005 Electric Transportation Applications, All Rights Reserved. HEV PERFORMANCE TEST PROCEDURE SEQUENCE: The following test sequence shall be used for conduct of HEV America

  13. Computer-Aided Construction of Chemical Kinetic Models

    SciTech Connect (OSTI)

    Green, William H.

    2014-12-31

    The combustion chemistry of even simple fuels can be extremely complex, involving hundreds or thousands of kinetically significant species. The most reasonable way to deal with this complexity is to use a computer not only to numerically solve the kinetic model, but also to construct the kinetic model in the first place. Because these large models contain so many numerical parameters (e.g. rate coefficients, thermochemistry) one never has sufficient data to uniquely determine them all experimentally. Instead one must work in predictive mode, using theoretical rather than experimental values for many of the numbers in the model, and as appropriate refining the most sensitive numbers through experiments. Predictive chemical kinetics is exactly what is needed for computer-aided design of combustion systems based on proposed alternative fuels, particularly for early assessment of the value and viability of proposed new fuels before those fuels are commercially available. This project was aimed at making accurate predictive chemical kinetics practical; this is a challenging goal which requires a range of science advances. The project spanned a wide range from quantum chemical calculations on individual molecules and elementary-step reactions, through the development of improved rate/thermo calculation procedures, the creation of algorithms and software for constructing and solving kinetic simulations, the invention of methods for model-reduction while maintaining error control, and finally comparisons with experiment. Many of the parameters in the models were derived from quantum chemistry calculations, and the models were compared with experimental data measured in our lab or in collaboration with others.
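
    Once constructed, a kinetic model is a (typically stiff) system of ordinary differential equations. The sketch below integrates a toy A -> B -> C mechanism with a stiff-capable solver; automatically generated mechanisms are solved the same way, just with thousands of species. The rate constants are arbitrary placeholders.

      # Toy kinetic model A -> B -> C, solved numerically.
      from scipy.integrate import solve_ivp

      k1, k2 = 2.0, 1.0                       # s^-1, assumed rate coefficients

      def rhs(t, y):
          a, b, c = y
          return [-k1 * a, k1 * a - k2 * b, k2 * b]

      sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0, 0.0], method="LSODA")
      print(sol.y[:, -1])                     # species concentrations at t = 5 s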

  14. Computational modeling for hexcan failure under core disruptive accident conditions

    SciTech Connect (OSTI)

    Sawada, T.; Ninokata, H.; Shimizu, A.

    1995-09-01

    This paper describes the development of computational modeling for hexcan wall failures under core disruptive accident conditions of fast breeder reactors. A series of out-of-pile experiments named SIMBATH has been analyzed by using the SIMMER-II code. The SIMBATH experiments were performed at KfK in Germany. The experiments used a thermite mixture to simulate fuel. The test geometry of SIMBATH ranged from single pin to 37-pin bundles. In this study, phenomena of hexcan wall failure found in a SIMBATH test were analyzed by SIMMER-II. Although the original model of SIMMER-II did not calculate any hexcan failure, several simple modifications made it possible to reproduce the hexcan wall melt-through observed in the experiment. In this paper the modifications and their significance are discussed for further modeling improvements.

  15. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.

  16. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    SciTech Connect (OSTI)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do this. The underlying philosophy in the research effort is "Stop – Start – Continue", i.e., what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue set was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  17. Modeling the Fracture of Ice Sheets on Parallel Computers

    SciTech Connect (OSTI)

    Waisman, Haim; Tuminaro, Ray

    2013-10-10

    The objective of this project was to investigate the complex fracture of ice and understand its role within larger ice sheet simulations and global climate change. This objective was achieved by developing novel physics-based models for ice, developing novel numerical tools to enable the modeling of the physics, and collaborating with ice community experts. At the present time, ice fracture is not explicitly considered within ice sheet models, due in part to the large computational costs associated with accurate modeling of this complex phenomenon. However, fracture not only plays an extremely important role in regional behavior but also influences ice dynamics over much larger zones in ways that are currently not well understood. To this end, our research findings through this project offer a significant advancement to the field and close a large gap of knowledge in understanding and modeling the fracture of ice sheets in the polar regions. Thus, we believe that our objective has been achieved and our research accomplishments are significant. This is corroborated through a set of published papers, posters and presentations at technical conferences in the field. In particular, significant progress has been made in the mechanics of ice, the fracture of ice sheets and ice shelves in polar regions, and sophisticated numerical methods that enable the solution of the physics in an efficient way.

  18. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  19. Computational fluid dynamic modeling of fluidized-bed polymerization reactors

    SciTech Connect (OSTI)

    Rokkam, Ram

    2012-01-01

    Polyethylene is one of the most widely used plastics, and over 60 million tons are produced worldwide every year. Polyethylene is obtained by the catalytic polymerization of ethylene in gas- and liquid-phase reactors. The gas-phase processes are more advantageous and use fluidized-bed reactors for production of polyethylene. Since they operate so close to the melting point of the polymer, agglomeration is an operational concern in all slurry and gas polymerization processes. Electrostatics and hot spot formation are the main factors that contribute to agglomeration in gas-phase processes. Electrostatic charges in gas-phase polymerization fluidized-bed reactors are known to influence the bed hydrodynamics, particle elutriation, bubble size, bubble shape, etc. Accumulation of electrostatic charges in the fluidized bed can lead to operational issues. In this work a first-principles electrostatic model is developed and coupled with a multi-fluid computational fluid dynamic (CFD) model to understand the effect of electrostatics on the dynamics of a fluidized bed. The multi-fluid CFD model for gas-particle flow is based on closures from the kinetic theory of granular flows. The electrostatic model is based on a fixed, size-dependent charge for each type of particle phase (catalyst, polymer, polymer fines). The combined CFD model is first verified using simple test cases, validated with experiments, and applied to a pilot-scale polymerization fluidized-bed reactor. The CFD model reproduced qualitative trends in particle segregation and entrainment due to electrostatic charges observed in experiments. For scale-up of the fluidized-bed reactor, filtered models are developed and implemented on the pilot-scale reactor.
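
    A minimal sketch of the fixed, size-dependent charge closure described above, under the assumption that each particle phase carries a constant charge per particle and so contributes an electrostatic body force f = rho_q * E to its momentum equation. The charges, number densities, and field value are invented for illustration.

      # Electrostatic body force per phase (illustrative values only).
      import numpy as np

      E_field = np.array([0.0, -3.0e4, 0.0])    # V/m, assumed local electric field

      phases = {                                 # (charge per particle in C, particles per m^3)
          "catalyst": (-2.0e-15, 1.0e9),
          "polymer": (+5.0e-15, 5.0e8),
          "fines": (-1.0e-16, 2.0e10),
      }
      for name, (q_p, n_p) in phases.items():
          rho_q = q_p * n_p                      # phase charge density, C/m^3
          f = rho_q * E_field                    # electrostatic body force, N/m^3
          print(f"{name}: f = {f} N/m^3")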

  20. Computational fluid dynamics modeling of proton exchange membrane fuel cells

    SciTech Connect (OSTI)

    UM,SUKKEE; WANG,C.Y.; CHEN,KEN S.

    2000-02-11

    A transient, multi-dimensional model has been developed to simulate proton exchange membrane (PEM) fuel cells. The model accounts simultaneously for electrochemical kinetics, current distribution, hydrodynamics and multi-component transport. A single set of conservation equations valid for flow channels, gas-diffusion electrodes, catalyst layers and the membrane region are developed and numerically solved using a finite-volume-based computational fluid dynamics (CFD) technique. The numerical model is validated against published experimental data with good agreement. Subsequently, the model is applied to explore hydrogen dilution effects in the anode feed. The predicted polarization curves under hydrogen dilution conditions are found to be in qualitative agreement with recent experiments reported in the literature. The detailed two-dimensional electrochemical and flow/transport simulations further reveal that in the presence of hydrogen dilution in the fuel stream, hydrogen is depleted at the reaction surface, resulting in substantial kinetic polarization and hence a lower current density that is limited by hydrogen transport from the fuel stream to the reaction site.
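
    For context, fuel cell performance of this kind is usually summarized as a polarization curve. The sketch below evaluates a common empirical form with activation (Tafel), ohmic, and mass-transport loss terms; the coefficient values are illustrative assumptions, not parameters fitted by this model.

      # Empirical PEM polarization-curve shape (illustrative coefficients).
      import numpy as np

      def cell_voltage(i, E0=1.0, b=0.06, R=0.2, m=3e-5, n=8.0):
          """i: current density (A/cm^2); b: Tafel slope (V/decade);
          R: area-specific resistance (ohm cm^2); m, n: mass-transport terms."""
          i = np.maximum(i, 1e-6)                  # avoid log(0) at open circuit
          return E0 - b * np.log10(i / 1e-3) - R * i - m * np.exp(n * i)

      i = np.linspace(0.01, 1.2, 7)
      print(np.round(cell_voltage(i), 3))          # falling voltage with current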

  1. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing and Storage Requirements for FES. J. Candy, General Atomics, San Diego, CA. Presented at the DOE Technical Program Review, Hilton Washington DC/Rockville, Rockville, MD, 19-20 March 2013. Drift waves and tokamak plasma turbulence, and their role in the context of fusion research: Plasma performance: in tokamak plasmas, performance is limited by turbulent radial transport of both energy and particles. Gradient-driven: this turbulent

  2. Computational model for simulation small testing launcher, technical solution

    SciTech Connect (OSTI)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists of numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performances. The discussion focuses on the technical possibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, using solid-fuel motors and following an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project has two major objectives: first, a short-term objective, to obtain a suborbital launching system able to go into service in a predictable period of time; and second, a long-term objective, to develop and test unconventional subsystems that will be integrated later into a satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital vehicle
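
    At the core of such a launcher model is time integration of the equations of motion with variable mass. The sketch below reduces that to 1-D vertical flight with constant thrust and no drag, using invented vehicle numbers, just to show the basic stepping loop a full 6DOF model elaborates.

      # 1-D variable-mass ascent sketch (invented vehicle parameters).
      g = 9.81             # m/s^2
      thrust = 30_000.0    # N, assumed constant during burn
      mdot = 10.0          # kg/s propellant mass flow (assumed)
      m, v, h, t = 1_000.0, 0.0, 0.0, 0.0
      dt, t_burn = 0.01, 60.0

      while t < t_burn:
          a = thrust / m - g       # drag neglected in this sketch
          v += a * dt
          h += v * dt
          m -= mdot * dt           # vehicle loses propellant mass
          t += dt

      print(f"burnout: t={t:.0f} s, m={m:.0f} kg, v={v:.0f} m/s, h={h/1000:.1f} km")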

  3. computers

    National Nuclear Security Administration (NNSA)

    California.

    Retired computers used for cybersecurity research at Sandia National...

  4. Recent evolution of the offline computing model of the NOvA experiment

    SciTech Connect (OSTI)

    Habig, Alec; Norman, A.; Group, Craig

    2015-12-23

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. In addition, the current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.
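
    Back-of-envelope arithmetic from the figures quoted above gives the average managed file size and the time the demonstrated transfer rate would need to move the full dataset:

      # Derived from the abstract's numbers only.
      n_files = 6.5e6
      total_bytes = 1e15               # 1 PB
      rate_bytes_per_hr = 1e12         # 1 TB/hour sustained

      avg_mb = total_bytes / n_files / 1e6
      hours = total_bytes / rate_bytes_per_hr
      print(f"average file ~ {avg_mb:.0f} MB; full dataset ~ {hours:.0f} h (~{hours/24:.0f} days)")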

  5. Recent evolution of the offline computing model of the NOvA experiment

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Habig, Alec; Norman, A.; Group, Craig

    2015-12-23

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. In addition, the current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  6. Computer modeling of active experiments in space plasmas

    SciTech Connect (OSTI)

    Bollens, R.J.

    1993-01-01

    The understanding of space plasmas is expanding rapidly. This is, in large part, due to the ambitious efforts of scientists from around the world who are performing large scale active experiments in the space plasma surrounding the earth. One such effort was designated the Active Magnetospheric Particle Tracer Explorers (AMPTE) and consisted of a series of plasma releases that were completed during 1984 and 1985. What makes the AMPTE experiments particularly interesting was the occurrence of a dramatic anomaly that was completely unpredicted. During the AMPTE experiment, three satellites traced the solar-wind flow into the earth's magnetosphere. One satellite, built by West Germany, released a series of barium and lithium canisters that were detonated and subsequently photo-ionized via solar radiation, thereby creating an artificial comet. Another satellite, built by Great Britain and in the vicinity during detonation, carried, as did the first satellite, a comprehensive set of magnetic field, particle and wave instruments. Upon detonation, what was observed by the satellites, as well as by aircraft and ground-based observers, was quite unexpected. The initial deflection of the ion clouds was not in the ambient solar wind's flow direction (V) but rather in the direction transverse to the solar wind and the background magnetic field (V x B). This result was not predicted by any existing theories or simulation models; it is the main subject discussed in this dissertation. A large three-dimensional computer simulation was produced to demonstrate that this transverse motion can be explained in terms of a rocket effect. Due to the extreme computer resources utilized in producing this work, the computer methods used to complete the calculation and the visualization techniques used to view the results are also discussed.
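
    The geometry of the anomaly is easy to check numerically: the observed deflection lies along V x B, transverse to both the flow and the field. The vectors below are unit placeholders, not measured AMPTE values.

      # Direction of the transverse V x B deflection (unit placeholder vectors).
      import numpy as np

      V = np.array([1.0, 0.0, 0.0])    # solar-wind flow direction (assumed along x)
      B = np.array([0.0, 1.0, 0.0])    # background magnetic field (assumed along y)
      print(np.cross(V, B))            # -> [0. 0. 1.], perpendicular to both V and B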

  7. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultra-supercritical (USC) application to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al2O3 or Cr2O3. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al2O3 scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed and, among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating, with a minimum number of shallow defects; the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al2O3 scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  8. Direct coal liquefaction baseline design and system analysis. Quarterly report, April--June 1991

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  9. CAPE-OPEN compliant stochastic modeling and reduced-order model computation capability for APECS system

    SciTech Connect (OSTI)

    Diwekar, Urmila; Shastri, Yogendra (Vishwamitra Research Institute Clarendon Hills, IL); Subrmanyan, Karthik; Zitney, S.E.

    2007-11-04

    APECS (Advanced Process Engineering Co-Simulator) is an integrated software suite that combines the power of process simulation with high-fidelity computational fluid dynamics (CFD) for improved design, analysis, and optimization of process engineering systems. The APECS system uses commercial process simulation (e.g., Aspen Plus) and CFD (e.g., FLUENT) software integrated with the process-industry standard CAPE-OPEN (CO) interfaces. This breakthrough capability allows engineers to better understand and optimize the fluid mechanics that drive overall power plant performance and efficiency. The focus of this paper is the CAPE-OPEN compliant stochastic modeling and reduced-order model computation capability around the APECS system. The usefulness of these capabilities is illustrated with a coal-fired, gasification-based FutureGen power plant simulation. These capabilities are used to generate efficient reduced-order models and to optimize model complexity.
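
    One common way to realize the reduced-order-model idea mentioned above is to fit a low-order response surface to a handful of expensive CFD runs and evaluate the cheap surrogate in place of the full simulation. The sketch below does this with invented data; it illustrates the concept, not the APECS implementation.

      # Response-surface reduced-order model from sampled runs (invented data).
      import numpy as np

      # (input, output) pairs from hypothetical full-physics runs:
      x = np.array([0.6, 0.8, 1.0, 1.2, 1.4])        # e.g., a normalized operating parameter
      y = np.array([0.71, 0.78, 0.82, 0.83, 0.81])   # e.g., a plant efficiency metric

      coeffs = np.polyfit(x, y, deg=2)               # quadratic response surface
      rom = np.poly1d(coeffs)
      print(rom(1.1))                                # surrogate evaluation, no CFD run needed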

  10. Optimization and Performance Modeling of Stencil Computations on Modern Microprocessors

    SciTech Connect (OSTI)

    Datta, Kaushik; Kamil, Shoaib; Williams, Samuel; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2007-06-01

    Stencil-based kernels constitute the core of many important scientific applications on block-structured grids. Unfortunately, these codes achieve a low fraction of peak performance, due primarily to the disparity between processor and main memory speeds. In this paper, we explore the impact of trends in memory subsystems on a variety of stencil optimization techniques and develop performance models to analytically guide our optimizations. Our work targets cache reuse methodologies across single and multiple stencil sweeps, examining cache-aware algorithms as well as cache-oblivious techniques on the Intel Itanium2, AMD Opteron, and IBM Power5. Additionally, we consider stencil computations on the heterogeneous multicore design of the Cell processor, a machine with an explicitly managed memory hierarchy. Overall our work represents one of the most extensive analyses of stencil optimizations and performance modeling to date. Results demonstrate that recent trends in memory system organization have reduced the efficacy of traditional cache-blocking optimizations. We also show that a cache-aware implementation is significantly faster than a cache-oblivious approach, while the explicitly managed memory on Cell enables the highest overall efficiency: Cell attains 88% of algorithmic peak while the best competing cache-based processor achieves only 54% of algorithmic peak performance.
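
    The core cache-blocking idea evaluated in the paper can be shown compactly: sweep the stencil block by block so each block's working set fits in cache. The NumPy sketch below is illustrative only; the paper's kernels are compiled code, and picking the block size per machine is exactly the tuning problem studied.

      # Cache-blocked 2-D 5-point stencil sweep (illustrative block size).
      import numpy as np

      def blocked_stencil(a, bsize=64):
          out = a.copy()                            # boundary values carried over
          n, m = a.shape
          for ii in range(1, n - 1, bsize):
              for jj in range(1, m - 1, bsize):
                  i1, j1 = min(ii + bsize, n - 1), min(jj + bsize, m - 1)
                  # average of the four neighbors for every interior point in the block
                  out[ii:i1, jj:j1] = 0.25 * (a[ii-1:i1-1, jj:j1] + a[ii+1:i1+1, jj:j1] +
                                              a[ii:i1, jj-1:j1-1] + a[ii:i1, jj+1:j1+1])
          return out

      print(blocked_stencil(np.random.rand(256, 256)).shape)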

  11. Modeling and Analysis of a Lunar Space Reactor with the Computer...

    Office of Scientific and Technical Information (OSTI)

    Citation Details: Modeling and Analysis of a Lunar Space Reactor with the Computer Code RELAP5-3D/ATHENA...

  12. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants are most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burner systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce or eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project, jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096), have focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that about 40 wt.% Ni is required in Fe-based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long-term thermal exposure test results further showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such as TiN and/or AlN. The third task, 'Process Advanced MCrAl Nanocoating Systems', of the six-task project jointly sponsored by the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DE-FC26-07NT43096) has focused on processing of

  13. Baseline Wind Energy Facility | Open Energy Information

    Open Energy Info (EERE)

    Name: Baseline Wind Energy Facility. Facility: Baseline Wind Energy Facility. Sector: Wind energy. Facility Type: Commercial Scale Wind...

  14. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759. Author(s): Veenis, Steven J. Intended for: Public. Purpose: This poster was prepared for the June 2013 Individual Permit for Storm Water (IP) public meeting. The purpose of the meeting was to update the public on implementation of the permit as required under Part 1.I (7) of the IP (National Pollutant Discharge Elimination System Permit No.

  15. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization Battery Testing ... Heavy Duty Fuels DISI Combustion HCCI/SCCI Fundamentals Spray Combustion Modeling ...

  16. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy Conversion Efficiency Solar Energy Wind Energy Water Power Supercritical CO2 Geothermal Natural Gas Safety, Security & Resilience of the Energy Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization Battery Testing Nuclear Energy Defense Waste Management Programs Advanced Nuclear Energy

  17. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, Configuration, and Problems on Bassi. Richard Gerber, Lawrence Berkeley National Laboratory, NERSC User Services. ScicomP 13, Garching bei München, Germany, July 17, 2007. Overview: About Bassi; Memory on Bassi; Large Page Memory (It's Great!); System Configuration; Large Page

  18. Designing computing system architecture and models for the HL-LHC era

    SciTech Connect (OSTI)

    Bauerdick, L.; Bockelman, B.; Elmer, P.; Gowdy, S.; Tadel, M.; Wurthwein, F.

    2015-01-01

    This work describes a programme to study the computing model in CMS after the next long shutdown near the end of the decade.

  19. Modeling-Computer Simulations At U.S. West Region (Sabin, Et...

    Open Energy Info (EERE)

    Exploration Activity: Modeling-Computer Simulations At U.S. West Region (Sabin, Et Al., 2004). Exploration Activity Details...

  20. Modeling-Computer Simulations At Cove Fort Area (Toksoz, Et Al...

    Open Energy Info (EERE)

      Exploration Activity: Modeling-Computer Simulations At Cove Fort Area (Toksoz, Et Al, 2010). Exploration Activity Details...

    1. Modeling-Computer Simulations At U.S. West Region (Williams ...

      Open Energy Info (EERE)

      Exploration Activity: Modeling-Computer Simulations At U.S. West Region (Williams & Deangelo, 2008). Exploration Activity...

    2. final report for Center for Programming Models for Scalable Parallel Computing

      SciTech Connect (OSTI)

      Johnson, Ralph E

      2013-04-10

      This is the final report of the work on parallel programming patterns that was part of the Center for Programming Models for Scalable Parallel Computing.

    3. Review of the synergies between computational modeling and experimenta...

      Office of Scientific and Technical Information (OSTI)

      length scales, new research directions are emerging in materials science and computational mechanics. ... Report Number(s): SAND--2015-10307J Journal ID: ISSN 0022-2461; PII: ...

    4. Automotive Underhood Thermal Management Analysis Using 3-D Coupled Thermal-Hydrodynamic Computer Models: Thermal Radiation Modeling

      SciTech Connect (OSTI)

      Pannala, S; D'Azevedo, E; Zacharia, T

      2002-02-26

      The goal of the radiation modeling effort was to develop and implement a radiation algorithm that is fast and accurate for the underhood environment. As part of this CRADA, a net-radiation model was chosen to simulate radiative heat transfer in the underhood of a car. The assumptions (diffuse-gray and uniform radiative properties in each element) reduce the problem tremendously, and all the view factors for radiation thermal calculations can be calculated once and for all at the beginning of the simulation. The cost for online integration of heat exchanges due to radiation is found to be less than 15% of the baseline CHAD code and thus very manageable. The off-line view factor calculation is constructed to be very modular and has been completely integrated to read CHAD grid files, and the output from this code can be read into the latest version of CHAD. Further integration has to be performed to accomplish the same with STAR-CD. The main outcome of this effort is a highly scalable and portable simulation capability to model view factors for the underhood environment (e.g., a view factor calculation that took 14 hours on a single processor took only 14 minutes on 64 processors). The code has also been validated using a simple test case for which analytical solutions are available. This simulation capability gives underhood designers in the automotive companies the ability to account for thermal radiation, which is usually critical in the underhood environment and also turns out to be one of the most computationally expensive components of underhood simulations. This report starts with the original work plan as elucidated in the proposal in section B. This is followed by the technical work plan to accomplish the goals of the project in section C. In section D, background to the current work is provided with references to the previous efforts this project leverages. The results are discussed in section E. This report ends with conclusions and future scope of
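
      For a gray-diffuse enclosure, the net-radiation method described above reduces to a linear system in the surface radiosities built from the precomputed view factors. A minimal sketch of that solve follows, assuming n surfaces with a known view-factor matrix F, emissivities eps, and temperatures T; the Gauss-Seidel iteration and data layout are illustrative, not the CHAD implementation.

          #define SIGMA 5.670374419e-8   /* Stefan-Boltzmann constant, W/(m^2 K^4) */

          /* Solve J_i = eps_i*sigma*T_i^4 + (1 - eps_i) * sum_j F[i][j] * J_j
             by Gauss-Seidel sweeps; F is the precomputed view-factor matrix. */
          void radiosity(int n, const double *F, const double *eps,
                         const double *T, double *J, int sweeps) {
              for (int i = 0; i < n; i++)              /* blackbody initial guess */
                  J[i] = SIGMA * T[i]*T[i]*T[i]*T[i];
              for (int s = 0; s < sweeps; s++)
                  for (int i = 0; i < n; i++) {
                      double G = 0.0;                  /* irradiation on surface i */
                      for (int j = 0; j < n; j++)
                          G += F[i*n + j] * J[j];
                      J[i] = eps[i]*SIGMA*T[i]*T[i]*T[i]*T[i] + (1.0 - eps[i])*G;
                  }
          }

          /* Net radiative flux leaving surface i (W/m^2): radiosity minus irradiation. */
          double net_flux(int n, const double *F, const double *J, int i) {
              double G = 0.0;
              for (int j = 0; j < n; j++)
                  G += F[i*n + j] * J[j];
              return J[i] - G;
          }

      Because F depends only on geometry, it is computed once up front, which is why the off-line view factor calculation dominates the cost and was the part worth parallelizing.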

    5. Baseline LAW Glass Formulation Testing

      SciTech Connect (OSTI)

      Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

      2013-06-13

      The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of the ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

    6. Compensator models for fluence field modulated computed tomography

      SciTech Connect (OSTI)

      Bartolac, Steven; Jaffray, David; Radiation Medicine Program, Princess Margaret Hospital Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9

      2013-12-15

      Purpose: Fluence field modulated computed tomography (FFMCT) presents a novel approach for acquiring CT images, whereby a patient model guides dynamically changing fluence patterns in an attempt to achieve task-based, user-prescribed, regional variations in image quality, while also controlling dose to the patient. This work aims to compare the relative effectiveness of FFMCT applied to different thoracic imaging tasks (routine diagnostic CT, lung cancer screening, and cardiac CT) when the modulator is subject to limiting constraints, such as might be present in realistic implementations. Methods: An image quality plan was defined for a simulated anthropomorphic chest slice, including regions of high and low image quality, for each of the thoracic imaging tasks. Modulated fluence patterns were generated using a simulated annealing optimization script, which attempts to achieve the image quality plan under a global dosimetric constraint. Optimization was repeated under different types of modulation constraints (e.g., fixed or gantry-angle-dependent patterns, continuous or comprised of discrete apertures), with the most limiting case being a fixed conventional bowtie filter. For each thoracic imaging task, an image quality map (IQM_sd) representing the regionally varying standard deviation is predicted for each modulation method and compared to the prescribed image quality plan as well as against results from uniform fluence fields. Relative integral dose measures were also compared. Results: Each IQM_sd resulting from FFMCT showed improved agreement with planned objectives compared to those from uniform fluence fields for all cases. Dynamically changing modulation patterns yielded better uniformity, improved image quality, and lower dose compared to fixed filter patterns with optimized tube current. For the latter fixed-filter cases, the optimal choice of tube current modulation was found to depend heavily on the task. Average integral dose reduction compared
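
      The simulated annealing optimization mentioned in the Methods can be sketched generically. The cost function below (squared deviation from an image-quality plan plus a penalty on a crude dose surrogate), the neighbor move, and the cooling schedule are placeholder assumptions, not the authors' script.

          #include <stdlib.h>
          #include <math.h>

          #define NB 64                 /* fluence-pattern elements (illustrative) */

          static double target[NB];     /* stand-in for the prescribed image-quality plan */

          /* Placeholder cost: deviation from the plan plus a quadratic penalty
             once total fluence (a crude dose surrogate) exceeds a budget. */
          static double cost(const double *f) {
              double dev = 0.0, dose = 0.0;
              for (int i = 0; i < NB; i++) {
                  double d = f[i] - target[i];
                  dev  += d * d;
                  dose += f[i];
              }
              double over = dose - 100.0;              /* illustrative dose budget */
              return dev + (over > 0.0 ? 10.0 * over * over : 0.0);
          }

          /* Classic simulated-annealing loop with geometric cooling. */
          void anneal(double *f, int steps, double t0, double alpha) {
              double c = cost(f), t = t0;
              for (int k = 0; k < steps; k++) {
                  double trial[NB];
                  for (int i = 0; i < NB; i++) trial[i] = f[i];
                  trial[rand() % NB] += 0.1 * ((double)rand() / RAND_MAX - 0.5);
                  double ct = cost(trial);
                  /* accept improvements always; uphill moves with prob e^(-d/t) */
                  if (ct < c || exp((c - ct) / t) > (double)rand() / RAND_MAX) {
                      for (int i = 0; i < NB; i++) f[i] = trial[i];
                      c = ct;
                  }
                  t *= alpha;
              }
          }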

    7. FED baseline engineering studies report

      SciTech Connect (OSTI)

      Sager, P.H.

      1983-04-01

      Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

    8. Computer modeling of properties of complex molecular systems

      SciTech Connect (OSTI)

      Kulkova, E.Yu.; Khrenova, M.G.; Polyakov, I.V.

      2015-03-10

      Large molecular aggregates present important examples of strongly nonhomogeneous systems. We apply combined quantum mechanics / molecular mechanics approaches that assume treatment of a part of the system by quantum-based methods and the rest of the system with conventional force fields. Herein we illustrate these computational approaches with two different examples: (1) large-scale molecular systems mimicking natural photosynthetic centers, and (2) components of prospective solar cells containing titanium dioxide and organic dye molecules. We demonstrate that modern computational tools are capable of predicting structures and spectra of such complex molecular aggregates.
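
      In the common additive QM/MM scheme (a generic formulation given for context, not necessarily the authors' exact one), the total energy is partitioned as

          E_{\mathrm{total}} = E_{\mathrm{QM}}(\text{QM region}) + E_{\mathrm{MM}}(\text{MM region}) + E_{\mathrm{QM/MM}}(\text{coupling}),

      where the coupling term collects the electrostatic, van der Waals, and bonded interactions across the boundary between the quantum region (e.g., the chromophore or dye) and the force-field environment.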

    9. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

      SciTech Connect (OSTI)

      Judi, David R; Mcpherson, Timothy N; Burian, Steven J

      2009-01-01

      It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the most losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g., dikes/levees, roads, walls). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments. Because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper shows that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. It further shows that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by performing computations only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as near-real-time flood forecasting tools, engineering design tools, or planning tools. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
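
      The two ideas highlighted above, desktop loop-level parallelism and domain tracking, can be sketched compactly. The paper used Java multithreading; the illustration below uses C with OpenMP instead, and the depth update is a stand-in for the full shallow-water solver.

          #include <omp.h>

          /* Domain tracking: 'active' lists only the currently inundated cells,
             so the update skips the dry portion of the grid entirely. */
          void update_depths(int nactive, const int *active,
                             const double *flux, double *depth, double dt) {
              #pragma omp parallel for schedule(static)
              for (int k = 0; k < nactive; k++) {
                  int c = active[k];
                  depth[c] += dt * flux[c];   /* placeholder for the 2D update */
              }
          }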

    10. Baseline Test Specimen Machining Report

      SciTech Connect (OSTI)

      Mark Carroll

      2009-08-01

      The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as nuclear grade, an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et al., 2007). One of the key components in the evaluation of these graphite types will be mechanical testing on

    11. COMPUTATIONAL FLUID DYNAMICS MODELING OF SCALED HANFORD DOUBLE SHELL TANK MIXING - CFD MODELING SENSITIVITY STUDY RESULTS

      SciTech Connect (OSTI)

      JACKSON VL

      2011-08-31

      The primary purpose of the tank mixing and sampling demonstration program is to mitigate the technical risks associated with the ability of the Hanford tank farm delivery and certification systems to measure and deliver a uniformly mixed high-level waste (HLW) feed to the Waste Treatment and Immobilization Plant (WTP). Uniform feed to the WTP is a requirement of 24590-WTP-ICD-MG-01-019, ICD-19 (Interface Control Document for Waste Feed), although the exact definition of uniform is evolving in this context. Computational Fluid Dynamics (CFD) modeling has been used to assist in evaluating scale-up issues, study operational parameters, and predict mixing performance at full scale.

    12. Pinellas Plant Environmental Baseline Report

      SciTech Connect (OSTI)

      Not Available

      1997-06-01

      The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

    13. TWRS privatization process technical baseline

      SciTech Connect (OSTI)

      Orme, R.M.

      1996-09-13

      The U.S. Department of Energy (DOE) is planning a two-phased program for the remediation of Hanford tank waste. Phase 1 is a pilot program to demonstrate the procurement of treatment services. The volume of waste treated during Phase 1 is a small percentage of the tank waste. During Phase 2, DOE intends to procure treatment services for the balance of the waste. The TWRS Privatization Process Technical Baseline (PPTB) provides a summary-level flowsheet/mass balance of tank waste treatment operations which is consistent with the tank inventory information, waste feed staging studies, and privatization guidelines currently available. The PPTB will be revised periodically as privatized processing concepts crystallize.

    14. Hydropower Baseline Cost Modeling, Version 2

      SciTech Connect (OSTI)

      O'Connor, Patrick W.

      2015-09-01

      Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in the powering of existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and the construction of new pumped storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost estimating tools that can support the national-scale evaluation of hydropower resources.

    15. A system analysis computer model for the High Flux Isotope Reactor (HFIRSYS Version 1)

      SciTech Connect (OSTI)

      Sozer, M.C.

      1992-04-01

      A system transient analysis computer model (HFIRSYS) has been developed for analysis of small-break loss of coolant accidents (LOCA) and operational transients. The computer model is based on the Advanced Continuous Simulation Language (ACSL), which produces the FORTRAN code automatically and provides integration routines such as Gear's stiff algorithm, as well as numerous practical tools for generating eigenvalues, producing debug output, and plotting results. The HFIRSYS computer code is structured in the form of the Modular Modeling System (MMS) code. Component modules from MMS and in-house developed modules were both used to configure HFIRSYS. A description of the High Flux Isotope Reactor, the theoretical bases for the modeled components of the system, and the verification and validation efforts are reported. The computer model performs satisfactorily, including cases in which the effects of structural elasticity on the system pressure are significant; however, its capabilities are limited to single-phase flow. Because of the modular structure, new component models from the Modular Modeling System can easily be added to HFIRSYS for analyzing their effects on the system's behavior. The computer model is a versatile tool for studying various system transients. The intent of this report is not to be a user's manual, but to provide the theoretical bases and basic information about the computer model and the reactor.
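
      Integrators like Gear's stiff algorithm exist because system transients of this kind are stiff. As a minimal illustration of why implicit stepping matters (a generic example, not HFIRSYS code), compare forward and backward Euler on y' = lam*y with lam = -10^4:

          #include <stdio.h>

          /* For h = 1e-3, the forward-Euler factor 1 + h*lam equals -9, so the
             explicit solution oscillates and diverges, while backward Euler
             damps the solution toward zero as the exact solution does. */
          int main(void) {
              double lam = -1.0e4, h = 1.0e-3, yf = 1.0, yb = 1.0;
              for (int k = 0; k < 10; k++) {
                  yf = yf * (1.0 + h * lam);     /* explicit step */
                  yb = yb / (1.0 - h * lam);     /* implicit step: solve for y_new */
                  printf("%2d  forward = % .3e   backward = % .3e\n", k + 1, yf, yb);
              }
              return 0;
          }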

    16. Final Report for Integrated Multiscale Modeling of Molecular Computing Devices

      SciTech Connect (OSTI)

      Glotzer, Sharon C.

      2013-08-28

      In collaboration with researchers at Vanderbilt University, North Carolina State University, Princeton and Oakridge National Laboratory we developed multiscale modeling and simulation methods capable of modeling the synthesis, assembly, and operation of molecular electronics devices. Our role in this project included the development of coarse-grained molecular and mesoscale models and simulation methods capable of simulating the assembly of millions of organic conducting molecules and other molecular components into nanowires, crossbars, and other organized patterns.

    17. Mathematical modeling and computer simulation of processes in energy systems

      SciTech Connect (OSTI)

      Hanjalic, K.C.

      1990-01-01

      This book is divided into the following chapters: 1. Modeling techniques and tools (fundamental concepts of modeling); 2. Fluid flow, heat and mass transfer, chemical reactions, and combustion; 3. Processes in energy equipment and plant components (boilers, steam and gas turbines, IC engines, heat exchangers, pumps and compressors, nuclear reactors, steam generators and separators, energy transport equipment, energy converters, etc.); 4. New thermal energy conversion technologies (MHD, coal gasification and liquefaction, fluidized-bed combustion, pulse combustors, multistage combustion, etc.); 5. Combined cycles and plants, cogeneration; 6. Dynamics of energy systems and their components; 7. Integrated approach to energy systems modeling; and 8. Application of modeling in energy expert systems.

    18. Modeling-Computer Simulations At Kilauea East Rift Geothermal...

      Open Energy Info (EERE)

      importance of water convection for distributing heat in the East Rift Zone. References Albert J. Rudman, David Epp (1983) Conduction Models Of The Temperature Distribution In The...

    19. Modeling-Computer Simulations (Walker, Et Al., 2005) | Open Energy...

      Open Energy Info (EERE)

      occurrence model for geothermal systems based on fundamental geologic data. References J. D. Walker, A. E. Sabin, J. R. Unruh, J. Combs, F. C. Monastero (2005) Development Of...

    20. Baseline Graphite Characterization: First Billet

      SciTech Connect (OSTI)

      Mark C. Carroll; Joe Lords; David Rohrbaugh

      2010-09-01

      The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, only a limited amount of data can be generated, given the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. A thorough understanding of this variability will provide added detail to the irradiated property data and a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the

    1. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

      SciTech Connect (OSTI)

      Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

      2011-01-01

      In this paper, we describe an approach to integrating a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model has been developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS module handles large datasets directly via a parallel file system. Although this is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.

    2. Modeling and Analysis of a Lunar Space Reactor with the Computer Code

      Office of Scientific and Technical Information (OSTI)

      Conference: Modeling and Analysis of a Lunar Space Reactor with the Computer Code RELAP5-3D/ATHENA. The transient analysis 3-dimensional (3-D) computer code RELAP5-3D/ATHENA has been employed to model and analyze a space reactor of 180 kW (thermal), 40 kW (net, electrical) with eight Stirling engines (SEs). Each SE

    3. Demonstrating the improvement of predictive maturity of a computational model

      SciTech Connect (OSTI)

      Hemez, Francois M; Unal, Cetin; Atamturktur, Huriye S

      2010-01-01

      We demonstrate an improvement of predictive capability brought to a non-linear material model using a combination of test data, sensitivity analysis, uncertainty quantification, and calibration. A model that captures increasingly complicated phenomena, such as plasticity, temperature, and strain rate effects, is analyzed. Predictive maturity is defined, here, as the accuracy of the model in predicting multiple Hopkinson bar experiments. A statistical discrepancy quantifies the systematic disagreement (bias) between measurements and predictions. Our hypothesis is that improving the predictive capability of a model should translate into better agreement between measurements and predictions. This agreement, in turn, should lead to a smaller discrepancy. We have recently proposed to use discrepancy and coverage, that is, the extent to which the physical experiments used for calibration populate the regime of applicability of the model, as the basis for defining a Predictive Maturity Index (PMI). It was shown that predictive maturity could be improved when additional physical tests are made available to increase coverage of the regime of applicability. This contribution illustrates how the PMI changes as 'better' physics are implemented in the model. The application is the non-linear Preston-Tonks-Wallace (PTW) strength model applied to beryllium metal. We demonstrate that our framework tracks the evolution of maturity of the PTW model. Robustness of the PMI with respect to the selection of coefficients needed in its definition is also studied.

    4. GEO3D - Three-Dimensional Computer Model of a Ground Source Heat Pump System

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      James Menart

      2013-06-07

      This file is the setup file for the computer program GEO3D. GEO3D is a computer program written by Jim Menart to simulate vertical wells in conjunction with a heat pump for ground source heat pump (GSHP) systems. This is a very detailed three-dimensional computer model. This program produces detailed heat transfer and temperature field information for a vertical GSHP system.

    5. Modeling-Computer Simulations (Gritto & Majer) | Open Energy...

      Open Energy Info (EERE)

      are shown in Figure 1. The parameters of the fault were modeled after Coates and Schoenberg (1995), where the orientation of the fault relative to the finite-difference grid...

    6. Computer support to run models of the atmosphere. Final report

      SciTech Connect (OSTI)

      Fung, I.

      1996-08-30

      This research is focused on a better quantification of the variations in CO2 exchanges between the atmosphere and biosphere and the factors responsible for these exchanges. The principal approach is to infer the variations in the exchanges from variations in the atmospheric CO2 distribution. The principal tool involves using a global three-dimensional tracer transport model to advect and convect CO2 in the atmosphere. The tracer model the authors used was developed at the Goddard Institute for Space Studies (GISS) and is derived from the GISS atmospheric general circulation model. A special run of the GCM is made to save high-frequency winds and mixing statistics for the tracer model.

    7. Cielo Computational Environment Usage Model With Mappings to...

      Office of Scientific and Technical Information (OSTI)

      This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure ...

    8. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

      2016-08-19

      Inverse modeling seeks model parameters given a set of observations. However, for practical problems the number of measurements is often large and the model parameters are numerous, so conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate- to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace, such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved by these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2D and a random hydraulic conductivity field in 3D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10^1 to ~10^2 in a multi-core computational environment. Furthermore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate- to large-scale problems.
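
      In standard notation (Jacobian J, residual vector r, damping parameter \lambda), each Levenberg-Marquardt iteration solves

          (J^{\top} J + \lambda I)\,\delta = -J^{\top} r, \qquad x_{k+1} = x_k + \delta.

      Projecting this n-by-n system onto a Krylov subspace of dimension m \ll n, and recycling that subspace across successive values of \lambda, is what removes the dominant cost; the equations above are the textbook form of the method, not a transcription of the MADS code.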

    9. FINITE ELEMENT MODELS FOR COMPUTING SEISMIC INDUCED SOIL PRESSURES ON DEEPLY EMBEDDED NUCLEAR POWER PLANT STRUCTURES.

      SciTech Connect (OSTI)

      XU, J.; COSTANTINO, C.; HOFMAYER, C.

      2006-06-26

      The paper discusses computations of seismic-induced soil pressures using finite element models for deeply embedded and/or buried stiff structures, such as those appearing in the conceptual designs of structures for advanced reactors.

    10. Towards a Computational Model of a Methane Producing Archaeum

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Peterson, Joseph R.; Labhsetwar, Piyush; Ellermeier, Jeremy R.; Kohler, Petra R. A.; Jain, Ankur; Ha, Taekjip; Metcalf, William W.; Luthey-Schulten, Zaida

      2014-01-01

      Progress towards a complete model of the methanogenic archaeum Methanosarcina acetivorans is reported. We characterized the size distribution of the cells using differential interference contrast microscopy, finding them to be ellipsoidal with mean length and width of 2.9 μm and 2.3 μm, respectively, when grown on methanol, and 30% smaller when grown on acetate. We used the single molecule pull-down (SiMPull) technique to measure the average copy number of the Mcr complex and ribosomes. A kinetic model for the methanogenesis pathways based on biochemical studies and recent metabolic reconstructions for several related methanogens is presented. In this model, 26 reactions in the methanogenesis pathways are coupled to a cell mass production reaction that updates enzyme concentrations. RNA expression data (RNA-seq) measured for cell cultures grown on acetate and methanol is used to estimate relative protein production per mole of ATP consumed. The model captures the experimentally observed methane production rates for cells growing on methanol and is most sensitive to the number of methyl-coenzyme-M reductase (Mcr) and methyl-tetrahydromethanopterin:coenzyme-M methyltransferase (Mtr) proteins. A draft transcriptional regulation network based on known interactions is proposed which we intend to integrate with the kinetic model to allow dynamic regulation.
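
      Kinetic models of this kind typically build the coupled reaction rates from enzyme-limited rate laws; a representative Michaelis-Menten form (given as general background, not the paper's exact rate expressions) is

          v = \frac{k_{\mathrm{cat}}\,[E]\,[S]}{K_M + [S]},

      which makes the pathway flux directly proportional to the enzyme copy number [E] and is consistent with the model's sensitivity to the measured Mcr and Mtr counts.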

    11. Theoretical and computer models of detonation in solid explosives

      SciTech Connect (OSTI)

      Tarver, C.M.; Urtiew, P.A.

      1997-10-01

      Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Doering (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model, based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state, has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two-, and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.
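
      The JWL equation of state referenced above has the standard form

          p = A\left(1 - \frac{\omega}{R_1 V}\right)e^{-R_1 V} + B\left(1 - \frac{\omega}{R_2 V}\right)e^{-R_2 V} + \frac{\omega E}{V},

      where V is the relative volume, E the internal energy per unit initial volume, and A, B, R_1, R_2, and \omega are constants fitted to experiment for each explosive.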

    12. Computer model for characterizing, screening, and optimizing electrolyte systems

      SciTech Connect (OSTI)

      Gering, Kevin L.

      2015-06-15

      Electrolyte systems in contemporary batteries are tasked with operating under increasing performance requirements. All battery operation is in some way tied to the electrolyte and how it interacts with various regions within the cell environment. Seeing that the electrolyte plays a crucial role in battery performance and longevity, it is imperative that accurate, physics-based models be developed that will characterize key electrolyte properties while keeping pace with the increasing complexity of these liquid systems. Advanced models are needed since laboratory measurements require significant resources to carry out for even a modest experimental matrix. The Advanced Electrolyte Model (AEM) developed at the INL is a proven capability designed to explore molecular-to-macroscale level aspects of electrolyte behavior, and can be used to drastically reduce the time required to characterize and optimize electrolytes. Although it is applied most frequently to lithium-ion battery systems, it is general in its theory and can be used toward numerous other targets and intended applications. This capability is unique, powerful, relevant to present and future electrolyte development, and without peer. It redefines electrolyte modeling for highly-complex contemporary systems, wherein significant steps have been taken to capture the reality of electrolyte behavior in the electrochemical cell environment. This capability can have a very positive impact on accelerating domestic battery development to support aggressive vehicle and energy goals in the 21st century.

    13. ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION (Conference) | SciTech

      Office of Scientific and Technical Information (OSTI)

      Conference: ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION. Recently, the question of the relevance of the so-called quantum chaos has been raised in applications to quantum computation [2,3]. Indeed, according to the general approach to closed systems of a finite number of interacting Fermi particles (see, e.g., [4,5]), with an increase of the interaction between qubits a kind of chaos is expected

    14. Computational modeling of drug-resistant bacteria. Final report

      SciTech Connect (OSTI)

      MacDougall, Preston

      2015-03-12

      Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.

    15. Computational models for the berry phase in semiconductor quantum dots

      SciTech Connect (OSTI)

      Prabhakar, S.; Melnik, R. V. N.; Sebetci, A.

      2014-10-06

      By developing a new model and its finite element implementation, we analyze the Berry phase in low-dimensional semiconductor nanostructures, focusing on quantum dots (QDs). In particular, we solve the Schrödinger equation and investigate the evolution of the spin dynamics during the adiabatic transport of the QDs in the 2D plane along a circular trajectory. Based on this study, we reveal that the Berry phase is highly sensitive to the Rashba and Dresselhaus spin-orbit lengths.
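
      The geometric phase accumulated during such adiabatic transport around a closed loop C in parameter space is the standard Berry phase,

          \gamma = i \oint_{C} \langle \psi(\mathbf{R}) \,|\, \nabla_{\mathbf{R}} \psi(\mathbf{R}) \rangle \cdot d\mathbf{R},

      here evaluated for spinor states whose structure is set by the Rashba and Dresselhaus spin-orbit terms.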

    16. Accelerated Climate Modeling for Energy | Argonne Leadership Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Facility a Category 5 hurricane simulated by the CESM at 13 km resolution An example of a Category 5 hurricane simulated by the CESM at 13 km resolution. Precipitable water (gray scale) shows the detailed dynamical structure in the flow. Strong precipitation is overlaid in red. High resolution is necessary to simulate reasonable numbers of tropical cyclones including Category 4 and 5 storms. Credit: Alan Scott and Mark Taylor, Sandia National Laboratories Accelerated Climate Modeling for

    17. Accelerated Climate Modeling for Energy | Argonne Leadership Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Facility An example of a Category 5 hurricane simulated by the CESM at 13 km resolution An example of a Category 5 hurricane simulated by the CESM at 13 km resolution. Precipitable water (gray scale) shows the detailed dynamical structure in the flow. Strong precipitation is overlaid in red. High resolution is necessary to simulate reasonable numbers of tropical cyclones including Category 4 and 5 storms. Alan Scott and Mark Taylor, Sandia National Laboratories Accelerated Climate Modeling

    18. CASTING DEFECT MODELING IN AN INTEGRATED COMPUTATIONAL MATERIALS ENGINEERING APPROACH

      SciTech Connect (OSTI)

      Sabau, Adrian S

      2015-01-01

      To accelerate the introduction of new cast alloys, the simultaneous modeling and simulation of multiphysical phenomena needs to be considered in the design and optimization of mechanical properties of cast components. The required models related to casting defects, such as microporosity and hot tears, are reviewed. Three aluminum alloys are considered: A356, 356, and 319. The data on calculated solidification shrinkage is presented and its effects on microporosity levels are discussed. Examples are given for predicting microporosity defects and microstructure distribution for a plate casting. Models to predict fatigue life and yield stress are briefly highlighted here for the sake of completeness and to illustrate how the length scales of the microstructure features, as well as porosity defects, are taken into account in modeling the mechanical properties. Thus, data on casting defects, including microstructure features, is crucial for evaluating the final performance-related properties of the component. ACKNOWLEDGEMENTS: This work was performed under a Cooperative Research and Development Agreement (CRADA) with Nemak Inc. and Chrysler Co. for the project 'High Performance Cast Aluminum Alloys for Next Generation Passenger Vehicle Engines.' The author would also like to thank Amit Shyam for reviewing the paper and Andres Rodriguez of Nemak Inc. Research sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office, as part of the Propulsion Materials Program under contract DE-AC05-00OR22725 with UT-Battelle, LLC. Part of this research was conducted through the Oak Ridge National Laboratory's High Temperature Materials Laboratory User Program, which is sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program.

    19. Computer modeling of a CFB (circulating fluidized bed) gasifier

      SciTech Connect (OSTI)

      Gidaspow, D.; Ding, J.

      1990-06-01

      The overall objective of this investigation is to develop experimentally verified models for circulating fluidized bed (CFB) combustors. This report presents an extension of our cold flow modeling of a CFB given in our first quarterly report of this project and published in 'Numerical Methods for Multiphase Flows', edited by I. Celik, D. Hughes, C. T. Crowe, and D. Lankford, FED-Vol. 91, American Society of Mechanical Engineers, pp. 47-56 (1990). The title of the paper is 'Multiphase Navier-Stokes Equation Solver' by D. Gidaspow, J. Ding, and U.K. Jayaswal. To the two-dimensional code described in the above paper we added the energy equations and the conservation of species equations to describe a producer of synthesis gas from char. Under the simulation conditions the injected oxygen reacted near the inlet. The solid-gas mixing was sufficiently rapid that no undesirable hot spots were produced. This simulation illustrates the code's capability to model CFB reactors. 15 refs., 20 figs.
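
      In the two-fluid formulation underlying such multiphase Navier-Stokes solvers, each phase carries its own conservation laws; the gas-phase continuity equation, for example, reads

          \frac{\partial}{\partial t}(\varepsilon_g \rho_g) + \nabla \cdot (\varepsilon_g \rho_g \mathbf{v}_g) = \dot{m}_g,

      where \varepsilon_g is the gas volume fraction, \rho_g the gas density, \mathbf{v}_g the gas velocity, and the source term \dot{m}_g represents mass transferred to the gas, e.g. by char gasification. This is the generic form of the model class, not a transcription of the authors' equations.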

    20. Computer model for characterizing, screening, and optimizing electrolyte systems

      Energy Science and Technology Software Center (OSTI)

      2015-06-15

      Electrolyte systems in contemporary batteries are tasked with operating under increasing performance requirements. All battery operation is in some way tied to the electrolyte and how it interacts with various regions within the cell environment. Seeing that the electrolyte plays a crucial role in battery performance and longevity, it is imperative that accurate, physics-based models be developed that will characterize key electrolyte properties while keeping pace with the increasing complexity of these liquid systems. Advanced models are needed since laboratory measurements require significant resources to carry out for even a modest experimental matrix. The Advanced Electrolyte Model (AEM) developed at the INL is a proven capability designed to explore molecular-to-macroscale level aspects of electrolyte behavior, and can be used to drastically reduce the time required to characterize and optimize electrolytes. Although it is applied most frequently to lithium-ion battery systems, it is general in its theory and can be used toward numerous other targets and intended applications. This capability is unique, powerful, relevant to present and future electrolyte development, and without peer. It redefines electrolyte modeling for highly-complex contemporary systems, wherein significant steps have been taken to capture the reality of electrolyte behavior in the electrochemical cell environment. This capability can have a very positive impact on accelerating domestic battery development to support aggressive vehicle and energy goals in the 21st century.

    1. Baselines for Greenhouse Gas Reductions: Problems, Precedents...

      Open Energy Info (EERE)

      Baseline projection, GHG inventory, Pathways analysis. Resource Type: Publications, Lessons learned/best practices. Website: www.p2pays.orgref2221739.pdf References:...

    2. Tank waste remediation systems technical baseline database

      SciTech Connect (OSTI)

      Porter, P.E.

      1996-10-16

      This document includes a cassette tape that contains Hanford generated data for the Tank Waste Remediation Systems Technical Baseline Database as of October 09, 1996.

    3. Computer-Aided Construction of Combustion Chemistry Models

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Constructing Accurate Combustion Chemistry Models: Butanols William H. Green & Michael Harper MIT Dept. of Chem. Eng. CEFRC Annual Meeting, Sept. 2010 The people who did this work: Dr. C. Franklin Goldsmith Greg Magoon Shamel Merchant Dr. Sumathy Raman Dr. Sandeep Sharma Prof. Kevin Van Geem Steven Pyl We are also grateful to: Joshua Allen Prof. Paul Barton Dr. Stephen Klippenstein Prof. Guy Marin Jeffrey Mo Dr. S-A Seyed-Reihani Dr. Richard West & MANY CEFRC MEMBERS One of Our Project's

    4. Computational Human Performance Modeling For Alarm System Design

      SciTech Connect (OSTI)

      Jacques Hugo

      2012-07-01

      The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques, and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine the effect of operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and the human workload predicted by the system.
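
      The discrete-event approach referred to above can be reduced to a very small skeleton. The single operator, fixed handling time, and arrival times below are illustrative assumptions, not the INL model:

          #include <stdio.h>

          #define NALARM 5

          /* Alarms arrive at given times; one operator handles them in order.
             The loop advances simulated time event by event and reports the
             waiting time per alarm and the operator's utilization (workload). */
          int main(void) {
              double arrival[NALARM] = {0.0, 1.0, 1.5, 7.0, 7.2};  /* seconds */
              double handle = 2.0;          /* handling time per alarm, seconds */
              double op_free = 0.0, busy = 0.0;
              for (int i = 0; i < NALARM; i++) {
                  double start = arrival[i] > op_free ? arrival[i] : op_free;
                  busy += handle;
                  op_free = start + handle;
                  printf("alarm %d: arrives %.1f, handled %.1f-%.1f, wait %.1f\n",
                         i, arrival[i], start, op_free, start - arrival[i]);
              }
              printf("operator utilization: %.0f%%\n", 100.0 * busy / op_free);
              return 0;
          }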

    5. ARM: Baseline Solar Radiation Network (BSRN): solar irradiances...

      Office of Scientific and Technical Information (OSTI)

      Title: ARM: Baseline Solar Radiation Network (BSRN): solar irradiances ...

    6. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE) ...

    7. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices

      DOE Patents [OSTI]

      Gering, Kevin L

      2013-08-27

      A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell and analyzes the mechanistic level model to estimate performance fade characteristics over aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing the second exchange current density.
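
      For background, the exchange current density i_0 that the pulse currents bracket is the prefactor in the Butler-Volmer relation between current density and overpotential \eta,

          i = i_0\left[\exp\!\left(\frac{\alpha_a F \eta}{RT}\right) - \exp\!\left(-\frac{\alpha_c F \eta}{RT}\right)\right],

      so tracking how i_0 changes between the first and second aging periods is one way a mechanistic model can attribute performance fade to slowing interfacial kinetics. This is the textbook relation, not necessarily the patent's exact formulation.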

    8. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

      SciTech Connect (OSTI)

      Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

      2008-09-01

      Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high

    9. A Variable Refrigerant Flow Heat Pump Computer Model in EnergyPlus

      SciTech Connect (OSTI)

      Raustad, Richard A.

      2013-01-01

      This paper provides an overview of the variable refrigerant flow heat pump computer model included with the Department of Energy's EnergyPlus™ whole-building energy simulation software. The mathematical model for a variable refrigerant flow heat pump operating in cooling or heating mode, and a detailed model for the variable refrigerant flow direct-expansion (DX) cooling coil, are described in detail.

    10. Techno-Economic Modeling - Building New Battery Systems on the Computer -

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Joint Center for Energy Storage Research October 22, 2015, Accomplishments Techno-Economic Modeling - Building New Battery Systems on the Computer JCESR is applying techno-economic models to project the performance and cost of a wide array of promising new battery systems before they are prototyped. The results from techno-economic modeling establish performance "floors" for discovery science teams looking for new anodes, cathodes, and electrolytes for a beyond lithium-ion battery,

    11. Verification of a VRF Heat Pump Computer Model in EnergyPlus

      SciTech Connect (OSTI)

      Nigusse, Bereket; Raustad, Richard

      2013-06-01

      This paper provides verification results for the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides quantitative comparison of full and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data and found that the dual range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to the equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
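
      The bi-quadratic curve form itself is simple; a sketch follows (coefficient values are placeholders to be fitted from manufacturer data):

          /* Bi-quadratic performance curve used for capacity and EIR ratios:
             ratio = a + b*Ti + c*Ti^2 + d*To + e*To^2 + f*Ti*To,
             with Ti the indoor (wet-bulb) and To the outdoor (dry-bulb)
             temperature; the model switches between two coefficient sets
             ("dual range") depending on the operating region. */
          double biquadratic(const double k[6], double Ti, double To) {
              return k[0] + k[1]*Ti + k[2]*Ti*Ti
                   + k[3]*To + k[4]*To*To + k[5]*Ti*To;
          }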

    12. A Hybrid MPI/OpenMP Approach for Parallel Groundwater Model Calibration on Multicore Computers

      SciTech Connect (OSTI)

      Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan; Parker, Jack C.; Watson, David B; Jardine, Philip M

      2010-01-01

      Groundwater model calibration is becoming increasingly computationally time intensive. We describe a hybrid MPI/OpenMP approach to exploit two levels of parallelism in software and hardware to reduce calibration time on multicore computers with minimal parallelization effort. First, HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for a uranium transport model with over a hundred species involving nearly a hundred reactions, and for a field-scale coupled flow and transport model. In the first application, a single parallelizable loop is identified to consume over 97% of the total computational time. With a few lines of OpenMP compiler directives inserted into the code, the computational time is reduced about tenfold on a compute node with 16 cores. The performance is further improved by selectively parallelizing a few more loops. For the field-scale application, parallelizable loops in 15 of the 174 subroutines in HGC5 are identified to take more than 99% of the execution time. By adding the preconditioned conjugate gradient solver and BICGSTAB, and using a coloring scheme to separate the elements, nodes, and boundary sides, the subroutines for finite element assembly, soil property update, and boundary condition application are parallelized, resulting in a speedup of about 10 on a 16-core compute node. The Levenberg-Marquardt (LM) algorithm is added into HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, a number of compute nodes equal to the number of adjustable parameters (when the forward difference is used for Jacobian approximation), or twice that number (if the central difference is used), is used to reduce the calibration time from days and weeks to a few hours for the two applications. This approach can be extended to global optimization schemes and Monte Carlo analysis, where thousands of compute nodes can be efficiently utilized.
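
      The parallel Jacobian strategy described above (one compute node per adjustable parameter for forward differences) can be illustrated on a single machine. The sketch below is a Python analogy using a process pool in place of MPI; the exponential forward model and parameter values are hypothetical stand-ins for an expensive groundwater simulation, not the HGC5 code.

        from multiprocessing import Pool
        import numpy as np

        def forward_model(params):
            # Stand-in for an expensive groundwater simulation run.
            x = np.linspace(0.0, 1.0, 50)
            return params[0] * np.exp(-params[1] * x) + params[2] * x

        def _column(args):
            # One forward-difference Jacobian column: perturb parameter j by h.
            params, j, h = args
            p = params.copy()
            p[j] += h
            return (forward_model(p) - forward_model(params)) / h

        def jacobian(params, h=1e-6):
            """Forward-difference Jacobian; one worker per parameter column."""
            args = [(params, j, h) for j in range(len(params))]
            with Pool(processes=len(params)) as pool:
                cols = pool.map(_column, args)
            return np.column_stack(cols)

        if __name__ == "__main__":
            J = jacobian(np.array([1.0, 2.0, 0.5]))
            print(J.shape)  # (50, 3): one column per adjustable parameter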

    13. TWRS technical baseline database manager definition document

      SciTech Connect (OSTI)

      Acree, C.D.

      1997-08-13

      This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

    14. Computing Videos

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Videos

    15. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

      SciTech Connect (OSTI)

      Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

      2009-10-12

      In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

    16. Hanford Site technical baseline database. Revision 1

      SciTech Connect (OSTI)

      Porter, P.E.

      1995-01-27

      This report lists the Hanford-specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

    17. Systems, methods and computer-readable media to model kinetic performance of rechargeable electrochemical devices

      DOE Patents [OSTI]

      Gering, Kevin L.

      2013-01-01

      A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics. The computing system also analyzes the cell information of the electrochemical cell with a Butler-Volmer (BV) expression modified to determine exchange current density of the electrochemical cell by including kinetic performance information related to pulse-time dependence, electrode surface availability, or a combination thereof. A set of sigmoid-based expressions may be included with the modified-BV expression to determine kinetic performance as a function of pulse time. The determined exchange current density may be used with the modified-BV expression, with or without the sigmoid expressions, to analyze other characteristics of the electrochemical cell. Model parameters can be defined in terms of cell aging, making the overall kinetics model amenable to predictive estimates of cell kinetic performance along the aging timeline.
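
      For orientation, the classical Butler-Volmer relation that such a modified expression starts from can be written as below. The sigmoid factor S(t) shown alongside is a generic illustration of pulse-time dependence, not the patented expression; here i_0 is the exchange current density, eta the overpotential, alpha_a and alpha_c the anodic and cathodic transfer coefficients, and F, R, and T the Faraday constant, gas constant, and temperature.

        i = i_0 \left[ \exp\!\left( \frac{\alpha_a F \eta}{R T} \right)
              - \exp\!\left( -\frac{\alpha_c F \eta}{R T} \right) \right],
        \qquad
        i_0^{\mathrm{eff}}(t) = i_0 \, S(t), \qquad
        S(t) = \frac{1}{1 + e^{-k\,(t - t_0)}}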

    18. Modeling of BWR core meltdown accidents - for application in the MELRPI. MOD2 computer code

      SciTech Connect (OSTI)

      Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

      1985-04-01

      This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously, concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

    19. Once-through CANDU reactor models for the ORIGEN2 computer code

      SciTech Connect (OSTI)

      Croff, A.G.; Bjerke, M.A.

      1980-11-01

      Reactor physics calculations have led to the development of two CANDU reactor models for the ORIGEN2 computer code. The model CANDUs are based on (1) the existing once-through fuel cycle with feed comprised of natural uranium and (2) a projected slightly enriched (1.2 wt % {sup 235}U) fuel cycle. The reactor models are based on cross sections taken directly from the reactor physics codes. Descriptions of the reactor models, as well as values for the ORIGEN2 flux parameters THERM, RES, and FAST, are given.

    20. DOE Issues Funding Opportunity for Advanced Computational and Modeling Research for the Electric Power System

      Broader source: Energy.gov [DOE]

      The objective of this Funding Opportunity Announcement (FOA) is to leverage scientific advancements in mathematics and computation for application to power system models and software tools, with the long-term goal of enabling real-time protection and control based on wide-area sensor measurements.

    1. Application of a computational glass model to the shock response of soda-lime glass

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Gorfain, Joshua E.; Key, Christopher T.; Alexander, C. Scott

      2016-04-20

      This article details the implementation and application of the glass-specific computational constitutive model by Holmquist and Johnson [1] to simulate the dynamic response of soda-lime glass under high rate and high pressure shock conditions. The predictive capabilities of this model are assessed through comparison of experimental data with numerical results from computations using the CTH shock physics code. The formulation of this glass model is reviewed in the context of its implementation within CTH. Using a variety of experimental data compiled from the open literature, a complete parameterization of the model describing the observed behavior of soda-lime glass is developed. Simulation results using the calibrated soda-lime glass model are compared to flyer plate and Taylor rod impact experimental data covering a range of impact and failure conditions spanning an order of magnitude in velocity and pressure. In conclusion, the complex behavior observed in the experimental testing is captured well in the computations, demonstrating the capability of the glass model within CTH.

    2. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

      SciTech Connect (OSTI)

      Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

      2010-12-01

      The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model, which runs in the STAR-CD solver software. PNNL has extensive experience developing and applying 3D CFD models in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated, and is ready for use in simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

    3. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

      DOE Patents [OSTI]

      Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

      2010-05-04

      A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.
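
      As a rough, hedged illustration of the kind of thermodynamic screening such a system performs on candidate sequences, the fragment below estimates an oligonucleotide melting temperature with the Wallace rule, Tm = 2(A+T) + 4(G+C). This is a crude textbook heuristic for short oligos, not the kinetic and thermodynamic models of the patented method.

        def wallace_tm(seq):
            """Rough melting temperature (C) of a short oligo via the Wallace rule:
            Tm = 2*(A+T) + 4*(G+C). Much cruder than nearest-neighbor models."""
            s = seq.upper()
            return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

        print(wallace_tm("ATGCGCATTA"))  # 28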

    4. Vehicle Technologies Office Merit Review 2014: Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering

      Broader source: Energy.gov [DOE]

      Presentation given by NREL at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about significant enhancement of computational...

    5. Computer modeling of electromagnetic edge containment in twin-roll casting

      SciTech Connect (OSTI)

      Chang, F.C.; Turner, L.R.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

      1998-07-01

      This paper presents modeling studies of magnetohydrodynamics (MHD) analysis in twin-roll casting. Argonne National Laboratory (ANL) and Inland Steel Company have worked together to develop a 3-D computer model that can predict eddy currents, fluid flows, and liquid metal containment for an electromagnetic (EM) edge containment device. This mathematical model can greatly shorten casting research on the use of EM fields for liquid metal containment and control. It can also optimize the existing casting processes and minimize expensive, time-consuming full-scale testing. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in EM edge dams designed at Inland Steel for twin-roll casting. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve Maxwell's equations, Ohm's law, Navier-Stokes equations, and transport equations of turbulence flow in a casting process that uses EM fields. ELEKTRA is able to predict the eddy-current distribution and electromagnetic forces in complex geometry. CaPS-EM is capable of modeling fluid flows with free surfaces and dynamic rollers. The computed 3-D magnetic fields and induced eddy currents in ELEKTRA are used as input to flow-field computations in CaPS-EM. Results of the numerical simulation compared well with measurements obtained from both static and dynamic tests.

    6. Models the Electromagnetic Response of a 3D Distribution using MP COMPUTERS

      Energy Science and Technology Software Center (OSTI)

      1999-05-01

      EM3D models the electromagnetic response of a 3D distribution of conductivity, dielectric permittivity, and magnetic permeability within the earth for geophysical applications using massively parallel computers. The simulations are carried out in the frequency domain for either electric or magnetic sources, for either scattered or total field formulations of Maxwell's equations. The solution is based on the method of finite differences and includes absorbing boundary conditions so that responses can be modeled up into the radar range where wave propagation is dominant. Recent upgrades in the software include the incorporation of finite-size sources in addition to dipolar source fields, and a low induction number preconditioner that can significantly reduce computational run times. A graphical user interface (GUI) is bundled with the software so that complicated 3D models can be easily constructed and simulated. The GUI also allows for plotting of the output.

    7. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-11-01

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia, and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

    8. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      SciTech Connect (OSTI)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-11-01

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia, and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

    9. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

      SciTech Connect (OSTI)

      Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

      2011-06-01

      This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

    10. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and...

    11. DualTrust: A Trust Management Model for Swarm-Based Autonomic Computing Systems

      SciTech Connect (OSTI)

      Maiden, Wendy M.

      2010-05-01

      Trust management techniques must be adapted to the unique needs of the application architectures and problem domains to which they are applied. For autonomic computing systems that utilize mobile agents and ant colony algorithms for their sensor layer, certain characteristics of the mobile agent ant swarm -- their lightweight, ephemeral nature and indirect communication -- make this adaptation especially challenging. This thesis looks at the trust issues and opportunities in swarm-based autonomic computing systems and finds that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarm. After analyzing the applicability of trust management research as it has been applied to architectures with similar characteristics, this thesis specifies the required characteristics for trust management mechanisms used to monitor the trustworthiness of entities in a swarm-based autonomic computing system and describes a trust model that meets these requirements.

    12. On the Impact of Execution Models: A Case Study in Computational Chemistry

      SciTech Connect (OSTI)

      Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram; Manzano Franco, Joseph B.; Vishnu, Abhinav; Hoisie, Adolfy

      2015-05-25

      Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
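
      Work stealing, the technique credited above with the 50 percent improvement, can be sketched compactly: each worker pops tasks from the tail of its own deque and, when idle, steals from the head of a random victim's deque. The toy below uses Python threads and synthetic task payloads purely for illustration; it is not the authors' runtime, and a production system would use lock-free deques and real work units.

        import collections, random, threading, time

        NUM_WORKERS = 4
        deques = [collections.deque(range(w * 25, (w + 1) * 25)) for w in range(NUM_WORKERS)]
        locks = [threading.Lock() for _ in range(NUM_WORKERS)]
        done = []  # list.append is atomic in CPython, so this is safe here

        def worker(me):
            while True:
                task = None
                with locks[me]:
                    if deques[me]:
                        task = deques[me].pop()              # take from own tail
                if task is None:
                    victim = random.randrange(NUM_WORKERS)
                    with locks[victim]:
                        if deques[victim]:
                            task = deques[victim].popleft()  # steal from victim's head
                if task is None:
                    if not any(deques):                      # toy quiescence check
                        return
                    time.sleep(0.0005)
                    continue
                time.sleep(0.001 * (task % 3))               # uneven, stand-in work
                done.append(task)

        threads = [threading.Thread(target=worker, args=(w,)) for w in range(NUM_WORKERS)]
        for t in threads: t.start()
        for t in threads: t.join()
        print(len(done))  # 100: all tasks completed despite uneven load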

    13. Superior model for fault tolerance computation in designing nano-sized circuit systems

      SciTech Connect (OSTI)

      Singh, N. S. S.; Muthuvalu, M. S.; Asirvadam, V. S.

      2014-10-24

      As CMOS technology scales into the nanometer regime, reliability becomes a decisive concern in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. Computing reliability becomes troublesome and time consuming as the computational complexity builds up with the desired circuit size, so the ability to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper first describes the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Second, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of same-functionality circuits. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than the reliability measure by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results indicate that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
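
      As a hedged sketch of the probabilistic-gate-model idea discussed above, the fragment below propagates the probability that each signal carries logical 1 through a two-gate circuit, assuming independent signals and a per-gate error that inverts the ideal output with probability eps. The circuit and eps value are illustrative, not taken from the paper, and the independence assumption is exactly the kind of approximation that makes PGM differ from exact methods like BDEC.

        def nand_pgm(p_a, p_b, eps):
            """Probability a NAND output is logical 1 under a probabilistic gate
            model: the ideal output is inverted with gate error probability eps.
            Inputs are assumed independent (the PGM approximation)."""
            p_ideal_one = 1.0 - p_a * p_b
            return (1.0 - eps) * p_ideal_one + eps * (1.0 - p_ideal_one)

        eps = 0.05                            # illustrative per-gate error probability
        # Two-gate circuit: y = NAND(NAND(a, b), b) with inputs a = b = 1.
        p_mid = nand_pgm(1.0, 1.0, eps)       # fault-free intermediate value is 0
        p_out = nand_pgm(p_mid, 1.0, eps)     # fault-free output value is 1
        print(f"P(output = 1) = {p_out:.4f}") # 0.9050 = reliability w.r.t. correct value 1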

    14. Compare Energy Use in Variable Refrigerant Flow Heat Pumps Field Demonstration and Computer Model

      SciTech Connect (OSTI)

      Sharma, Chandan; Raustad, Richard

      2013-06-01

      Variable Refrigerant Flow (VRF) heat pumps are often regarded as energy-efficient air-conditioning systems which offer electricity savings as well as reduction in peak electric demand while providing improved individual zone setpoint control. One of the key advantages of VRF systems is minimal duct losses, which provide significant reduction in energy use and duct space. However, there is limited data available to show their actual performance in the field. Since VRF systems are increasingly gaining market share in the US, it is highly desirable to have more actual field performance data for these systems. An effort was made in this direction to monitor VRF system performance over an extended period of time in a US national lab test facility. Due to increasing demand by the energy modeling community, an empirical model to simulate VRF systems was implemented in the building simulation program EnergyPlus. This paper presents the comparison of energy consumption as measured in the national lab and as predicted by the program. For increased accuracy in the comparison, a customized weather file was created using measured outdoor temperature and relative humidity at the test facility. Other inputs to the model included building construction, a VRF system model based on lab-measured performance, occupancy of the building, lighting/plug loads, and thermostat set-points. Infiltration model inputs were adjusted in the beginning to tune the computer model, and subsequent field measurements were compared to the simulation results. Differences between the computer model results and actual field measurements are discussed. The computer-generated VRF performance closely resembled the field measurements.
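
      Model-versus-measurement agreement of the kind reported above is commonly summarized with the normalized mean bias error (NMBE) and the coefficient of variation of the RMSE (CV(RMSE)), the standard ASHRAE Guideline 14 metrics. The sketch below computes both for hypothetical energy series; these are generic calibration metrics, not necessarily the comparison method used in the paper.

        import numpy as np

        def nmbe(measured, simulated):
            """Normalized mean bias error (%), Guideline 14 style (n - 1 denominator)."""
            m = np.asarray(measured, float)
            s = np.asarray(simulated, float)
            return 100.0 * (m - s).sum() / ((len(m) - 1) * m.mean())

        def cv_rmse(measured, simulated):
            """Coefficient of variation of the RMSE (%)."""
            m = np.asarray(measured, float)
            s = np.asarray(simulated, float)
            rmse = np.sqrt(((m - s) ** 2).sum() / (len(m) - 1))
            return 100.0 * rmse / m.mean()

        measured  = [12.1, 11.8, 13.0, 12.5]   # hypothetical daily kWh
        simulated = [11.9, 12.0, 12.7, 12.8]
        print(f"NMBE = {nmbe(measured, simulated):.2f}%, "
              f"CV(RMSE) = {cv_rmse(measured, simulated):.2f}%")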

    15. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

      SciTech Connect (OSTI)

      J. Francfort; D. Karner

      2006-04-01

      The U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEVs). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed track testing to document the HEV’s fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events and fuel use are recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.

    16. GASFLOW: A Computational Fluid Dynamics Code for Gases, Aerosols, and Combustion, Volume 1: Theory and Computational Model

      SciTech Connect (OSTI)

      Nichols, B.D.; Mueller, C.; Necker, G.A.; Travis, J.R.; Spore, J.W.; Lam, K.L.; Royl, P.; Redlinger, R.; Wilson, T.L.

      1998-10-01

      Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containments and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior (1) in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion dominated flows; and (2) during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included

    17. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      SciTech Connect (OSTI)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-07-28

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

    18. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

      2014-07-28

      The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

    19. NREL Computer Models Integrate Wind Turbines with Floating Platforms (Fact Sheet)

      SciTech Connect (OSTI)

      Not Available

      2011-07-01

      Far off the shores of energy-hungry coastal cities, powerful winds blow over the open ocean, where the water is too deep for today's seabed-mounted offshore wind turbines. For the United States to tap into these vast offshore wind energy resources, wind turbines must be mounted on floating platforms to be cost effective. Researchers at the National Renewable Energy Laboratory (NREL) are supporting that development with computer models that allow detailed analyses of such floating wind turbines.

    20. Spectroscopy, modeling and computation of metal chelate solubility in supercritical CO{sub 2}

      SciTech Connect (OSTI)

      J. F. Brennecke; M. A. Stadtherr

      1999-12-10

      The overall objectives of this project were to gain a fundamental understanding of the solubility and phase behavior of metal chelates in supercritical CO{sub 2}. Extraction with CO{sub 2} is an excellent way to remove organic compounds from soils, sludges and aqueous solutions, and recent research has demonstrated that, together with chelating agents, it is a viable way to remove metals, as well. In this project the authors sought to gain fundamental knowledge that is vital to computing phase behavior, and modeling and designing processes using CO{sub 2} to separate organics and metal compounds from DOE mixed wastes. The overall program was a comprehensive one to measure, model and compute the solubility of metal chelate complexes in supercritical CO{sub 2} and CO{sub 2}/cosolvent mixtures. Through a combination of phase behavior measurements, spectroscopy and the development of a new computational technique, the authors have achieved a completely reliable way to model metal chelate solubility in supercritical CO{sub 2} and CO{sub 2}/co-contaminant mixtures. Thus, they can now design and optimize processes to extract metals from solid matrices using supercritical CO{sub 2}, as an alternative to hazardous organic solvents that create their own environmental problems, even while helping in metals decontamination.

    1. Review of the synergies between computational modeling and experimental characterization of materials across length scales

      SciTech Connect (OSTI)

      Dingreville, Rémi; Karnesky, Richard A.; Puel, Guillaume; Schmitt, Jean-Hubert

      2015-11-16

      With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain a greater insight into structure–property relationships and study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion on the perspective from various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than to “simply” support experimental work. This is illustrated by examples from several application areas on structural materials. In conclusion this manuscript ends with a discussion on some problems and open scientific questions that are being explored in order to advance this relatively new field of research.

    2. Review of the synergies between computational modeling and experimental characterization of materials across length scales

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Dingreville, Rémi; Karnesky, Richard A.; Puel, Guillaume; Schmitt, Jean-Hubert

      2015-11-16

      With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain a greater insight into structure–property relationships and study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion on the perspective from various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than to “simply” support experimental work. This is illustrated by examples from several application areas on structural materials. In conclusion this manuscript ends with a discussion on some problems and open scientific questions that are being explored in order to advance this relatively new field of research.

    3. Solid Waste Program technical baseline description

      SciTech Connect (OSTI)

      Carlson, A.B.

      1994-07-01

      The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

    4. Waste management project technical baseline description

      SciTech Connect (OSTI)

      Sederburg, J.P.

      1997-08-13

      A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

    5. CURRENT - A Computer Code for Modeling Two-Dimensional, Chemically Reacting, Low Mach Number Flows

      SciTech Connect (OSTI)

      Winters, W.S.; Evans, G.H.; Moen, C.D.

      1996-10-01

      This report documents CURRENT, a computer code for modeling two-dimensional, chemically reacting, low Mach number flows including the effects of surface chemistry. CURRENT is a finite volume code based on the SIMPLER algorithm. Additional convergence acceleration for low Peclet number flows is provided using improved boundary condition coupling and preconditioned gradient methods. Gas-phase and surface chemistry is modeled using the CHEMKIN software libraries. The CURRENT user-interface has been designed to be compatible with the Sandia-developed mesh generator and post processor ANTIPASTO and the post processor TECPLOT. This report describes the theory behind the code and also serves as a user's manual.

    6. Efficient Computation of Info-Gap Robustness for Finite Element Models

      SciTech Connect (OSTI)

      Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.

      2012-07-05

      A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
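
      In info-gap notation, the robustness function that the adjoint method approximates can be stated compactly as below; R denotes a performance (cost) measure to be kept below a critical requirement R_c, q the design decision, and U(alpha, u~) the family of models within horizon of uncertainty alpha of the nominal model u~. The symbols follow common info-gap usage and are not necessarily the report's own notation.

        \hat{\alpha}(q, R_c) \;=\; \max \Bigl\{ \alpha \ge 0 \;:\;
            \max_{u \,\in\, U(\alpha, \tilde{u})} R(q, u) \,\le\, R_c \Bigr\}

      Read directly: robustness is the greatest horizon of uncertainty such that even the worst-case model in the family still satisfies the performance requirement, which is why evaluating it naively requires solving a nested sequence of optimization problems.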

    7. India-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: India-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    8. Baseline and Target Values for PV Forecasts: Toward Improved...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting. Jie ...

    9. EA-1943: Construction and Operation of the Long Baseline Neutrino...

      Energy Savers [EERE]

      EA-1943: Construction and Operation of the Long Baseline Neutrino Facility and Deep Underground ...

    10. South Africa-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: South Africa-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish Ministry for...

    11. Brazil-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Brazil-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    12. China-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: China-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    13. NREL: Energy Analysis - Annual Technology Baseline and Standard...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Annual Technology Baseline and Standard Scenarios - Legacy Versions This section contains earlier versions of NREL's Annual Technology Baseline and Standard Scenarios products. ...

    14. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

      Open Energy Info (EERE)

      Tool Summary. Name: UNFCCC-Consolidated baseline and monitoring methodology for landfill gas project activities ...

    15. Indonesia-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Name: Indonesia-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government...

    16. Mexico-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Name: Mexico-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    17. Computer modeling of electromagnetic fields and fluid flows for edge containment in continuous casting

      SciTech Connect (OSTI)

      Chang, F.C.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

      1996-02-01

      A computer model was developed to predict eddy currents and fluid flows in molten steel. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in electromagnetic (EM) edge dams (EMDs) designed at Inland Steel for twin-roll casting. The model can be used to optimize the EMD design for application and to minimize expensive, time-consuming full-scale testing. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve heat transfer, fluid flow, and turbulence transport in a casting process that involves EM fields. ELEKTRA is able to predict the eddy-current distribution and the electromagnetic forces in complex geometries. CaPS-EM is capable of modeling fluid flows with free surfaces. Results of the numerical simulation compared well with measurements obtained from a static test.

    18. An integrated computer modeling environment for regional land use, air quality, and transportation planning

      SciTech Connect (OSTI)

      Hanley, C.J.; Marshall, N.L.

      1997-04-01

      The Land Use, Air Quality, and Transportation Integrated Modeling Environment (LATIME) represents an integrated approach to computer modeling and simulation of land use allocation, travel demand, and mobile source emissions for the Albuquerque, New Mexico, area. This environment provides predictive capability combined with a graphical and geographical interface. The graphical interface shows the causal relationships between data and policy scenarios and supports alternative model formulations. Scenarios are launched from within a Geographic Information System (GIS), and data produced by each model component at each time step within a simulation is stored in the GIS. A menu-driven query system is utilized to review link-based results and regional and area-wide results. These results can also be compared across time or between alternative land use scenarios. Using this environment, policies can be developed and implemented based on comparative analysis, rather than on single-step future projections. 16 refs., 3 figs., 2 tabs.

    19. Computational fluid dynamics modeling of coal gasification in a pressurized spout-fluid bed

      SciTech Connect (OSTI)

      Zhongyi Deng; Rui Xiao; Baosheng Jin; He Huang; Laihong Shen; Qilei Song; Qianjun Li

      2008-05-15

      Computational fluid dynamics (CFD) modeling, which has recently proven to be an effective means of analysis and optimization of energy-conversion processes, has been extended to coal gasification in this paper. A 3D mathematical model has been developed to simulate the coal gasification process in a pressurized spout-fluid bed. This CFD model is composed of gas-solid hydrodynamics, coal pyrolysis, char gasification, and gas phase reaction submodels. The rates of heterogeneous reactions are determined by combining the Arrhenius rate and the diffusion rate. The homogeneous reactions of the gas phase can be treated as secondary reactions. A comparison of the calculated and experimental data shows that most gasification performance parameters can be predicted accurately. This good agreement indicates that CFD modeling can be used for complex fluidized-bed coal gasification processes. 37 refs., 7 figs., 5 tabs.
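
      Combining an Arrhenius kinetic rate with a diffusion rate, as described above, is conventionally done by treating the two as resistances in series. A common generic form is shown below; the symbols (P_g for the partial pressure of the gasifying agent, T_p for particle temperature, A and E_a for the pre-exponential factor and activation energy) are standard notation, not necessarily the paper's.

        r \;=\; \frac{P_g}{\dfrac{1}{k_{\mathrm{diff}}} + \dfrac{1}{k_{\mathrm{kin}}}},
        \qquad
        k_{\mathrm{kin}} \;=\; A \exp\!\left( -\frac{E_a}{R\,T_p} \right)

      The harmonic combination ensures the slower of the two processes controls the overall rate: kinetics at low temperature, film diffusion at high temperature.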

    20. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; et al

      2015-12-21

      Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate “sub-ecosystem-scale” parameterizations.

    1. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

      SciTech Connect (OSTI)

      Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

      2015-12-21

      Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate “sub-ecosystem-scale” parameterizations.

    2. A Computational Model for the Identification of Biochemical Pathways in the Krebs Cycle

      SciTech Connect (OSTI)

      Oliveira, Joseph S.; Bailey, Colin G.; Jones-Oliveira, Janet B.; Dixon, David A.; Gull, Dean W.; Chandler, Mary L.

      2003-03-01

      We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits, and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems, which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2, is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.
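
      A directed hypergraph (Petri net) of the kind used above can be captured by a stoichiometric incidence mapping from transitions (reactions) to places (species). The fragment below encodes two illustrative Krebs-cycle steps and fires one transition; it is a toy sketch of the representation only, not the authors' algorithm or their decomposition into principal subcircuits.

        # Each transition (reaction) maps places (species) to stoichiometric changes.
        transitions = {
            "citrate_synthase": {"acetyl_CoA": -1, "oxaloacetate": -1,
                                 "citrate": +1, "CoA": +1},
            "aconitase":        {"citrate": -1, "isocitrate": +1},
        }

        # Marking: token count (available molecules) at each place.
        marking = {"acetyl_CoA": 1, "oxaloacetate": 1, "citrate": 0,
                   "CoA": 0, "isocitrate": 0}

        def enabled(t):
            # A transition may fire only if no place would go negative.
            return all(marking.get(p, 0) + d >= 0 for p, d in transitions[t].items())

        def fire(t):
            assert enabled(t), f"{t} is not enabled"
            for p, d in transitions[t].items():
                marking[p] = marking.get(p, 0) + d

        fire("citrate_synthase")
        print(marking)  # citrate and CoA produced, substrates consumed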

    3. Modelling of pathologies of the nervous system by the example of computational and electronic models of elementary nervous systems

      SciTech Connect (OSTI)

      Shumilov, V. N.; Syryamkin, V. I.; Syryamkin, M. V.

      2015-11-17

      The paper puts forward principles of operation for devices that function similarly to the nervous system and brain of biological organisms. We propose an alternative method of studying diseases of the nervous system, which may significantly influence prevention, treatment, or at least retardation of the development of these diseases. This alternative is to use computational and electronic models of the nervous system. Within this approach, we represent the brain as a huge electrical circuit composed of active units, namely, neuron-like units and the connections between them. As a result, we created computational and electronic models of elementary nervous systems, based on the principles of functioning of biological nervous systems that we have put forward. Our models demonstrate reactions to external stimuli, and changes in those reactions, similar to the behavior of the simplest biological organisms. The models are capable of self-training and retraining in real time, without human intervention and without switching between operation and training modes. In our models, training and memorization take place continuously under the influence of stimuli on the organism, without interruption or mode switching. New reflexes form through the creation of new connections between excited neurons wherever such connections are physically possible. Connections form without external influence, driven by local causes: a connection forms between the output and input of two excited neurons when the difference between their output and input potentials exceeds the value sufficient to form a new connection. On these grounds, we suggest that the proposed principles truly reflect the mechanisms of functioning of biological nervous systems and the brain. In order to confirm the correspondence of the proposed principles to biological nature, we carry out experiments for the study of processes of

    4. Towards an Abstraction-Friendly Programming Model for High Productivity and High Performance Computing

      SciTech Connect (OSTI)

      Liao, C; Quinlan, D; Panas, T

      2009-10-06

      General purpose languages, such as C++, permit the construction of various high-level abstractions that hide redundant, low-level details and accelerate programming productivity. Example abstractions include functions, data structures, classes, templates, and so on. However, the use of abstractions significantly impedes static code analyses and optimizations, including parallelization, applied to the abstractions' complex implementations. As a result, there is a common perception that performance is inversely proportional to the level of abstraction. On the other hand, programming large-scale, possibly heterogeneous high-performance computing systems is notoriously difficult, and programmers are unlikely to abandon the help of high-level abstractions when solving real-world, complex problems. Therefore, the need for programming models balancing both programming productivity and execution performance has reached a new level of criticality. We are exploring a novel abstraction-friendly programming model in order to support high-productivity, high-performance computing. We believe that standard or domain-specific semantics associated with high-level abstractions can be exploited to aid compiler analyses and optimizations, thus helping to achieve high performance without losing high productivity. We encode representative abstractions and their useful semantics into an abstraction specification file. In the meantime, an accessible source-to-source compiler infrastructure (the ROSE compiler) is used to facilitate recognizing high-level abstractions and utilizing their semantics for more optimization opportunities. Our initial work has shown that recognizing abstractions and knowing their semantics within a compiler can dramatically extend the applicability of existing optimizations, including automatic parallelization. Moreover, a new set of optimizations has become possible within an abstraction-friendly and semantics-aware programming model. In the future, we will...
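
      The abstract does not show ROSE's actual specification format; the sketch below merely illustrates the idea of attaching optimization-relevant semantics to named abstractions, using a hypothetical Python structure that a compiler pass could consult instead of proving the facts itself.

        # Sketch of the idea behind an abstraction specification file:
        # optimization-relevant semantics attached to named high-level
        # abstractions.  Entries and fields are hypothetical; ROSE's real
        # format differs.
        abstraction_spec = {
            "MyVector::operator[]": {
                "side_effects": False,          # safe to reorder calls
                "aliases": "none",              # elements never alias each other
            },
            "forall_elements": {
                "iteration_independent": True,  # no cross-iteration dependences
                "parallelizable": True,         # compiler may emit parallel code
            },
        }

        def may_parallelize(name):
            """A compiler pass could consult the spec instead of proving facts."""
            return abstraction_spec.get(name, {}).get("parallelizable", False)

        print(may_parallelize("forall_elements"))   # True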

    5. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

      SciTech Connect (OSTI)

      Ermer, J. J.; Mosher, J. C.; Baillet, S.; Leahy, R. M.

      2001-01-01

      Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace-based inverse methods like MUSIC [6], the total number of forward model evaluations can often approach the order of 10³ or 10⁴. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form F = GQ + N (1), where the observed forward field F (M sensors x N time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3P dipoles x N time samples), and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models [7] (or the fast approximations described in [1], [7]) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres adds complication to the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the...
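
      Equation (1) can be exercised directly in a few lines; the sketch below builds a random gain matrix in place of a real head model, with sizes chosen only for illustration.

        # Sketch of the linear forward model F = G Q + N from the abstract:
        # M sensors, T time samples, P dipoles with 3 moment components each.
        # The gain matrix here is random; a real G comes from a head model.
        import numpy as np

        rng = np.random.default_rng(0)
        M, T, P = 32, 100, 2          # illustrative sizes

        G = rng.standard_normal((M, 3 * P))        # forward (gain) matrix
        Q = rng.standard_normal((3 * P, T))        # dipole moment time series
        N = 0.01 * rng.standard_normal((M, T))     # additive sensor noise

        F = G @ Q + N                              # observed forward field
        print(F.shape)                             # (32, 100)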

    6. Engineering task plan TWRS technical baseline completion

      SciTech Connect (OSTI)

      Moore, T.L

      1996-03-08

      The Tank Waste Remediation System (TWRS) includes many activities required to remediate the radioactive waste stored in underground waste storage tanks. These activities include routine monitoring of the waste, facilities maintenance, upgrades to existing equipment, and installation of new equipment necessary to manage, retrieve, process, and dispose of the waste. In order to ensure that these multiple activities are integrated, cost effective, and necessary, a sound technical baseline is required from which all activities can be traced and measured. The process by which this technical baseline is developed will consist of the identification of functions, requirements, architecture, and test (FRAT) methodology. This process must be completed for TWRS to a level that provides the technical basis for all facility/system/component maintenance, upgrades, or new equipment installation.

    7. Baseline Microstructural Characterization of Outer 3013 Containers

      SciTech Connect (OSTI)

      Zapp, Phillip E.; Dunn, Kerry A

      2005-07-31

      Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

    8. State-of-the-art review of computational fluid dynamics modeling for fluid-solids systems

      SciTech Connect (OSTI)

      Lyczkowski, R.W.; Bouillard, J.X.; Ding, J.; Chang, S.L.; Burge, S.W.

      1994-05-12

      As the result of 15 years of research (50 staff-years of effort), Argonne National Laboratory (ANL), through its involvement in fluidized-bed combustion, magnetohydrodynamics, and a variety of environmental programs, has produced extensive computational fluid dynamics (CFD) software and models to predict the multiphase hydrodynamic and reactive behavior of fluid-solids motions and interactions in complex fluidized-bed reactors (FBRs) and slurry systems. This has resulted in the FLUFIX, IRF, and SLUFIX computer programs. These programs are based on fluid-solids hydrodynamic models and can predict information important to the designer of atmospheric or pressurized bubbling and circulating FBR, fluid catalytic cracking (FCC), and slurry units to guarantee optimum efficiency with minimum release of pollutants into the environment. This latter issue will become of paramount importance with the enactment of the Clean Air Act Amendment (CAAA) of 1995. Solids motion is also the key to understanding erosion processes. Erosion rates in FBRs and pneumatic and slurry components are computed by ANL's EROSION code to predict the potential metal wastage of FBR walls, internals, feed distributors, and cyclones. Only the FLUFIX and IRF codes are reviewed in this paper, together with highlights of the validations, because of length limitations. It is envisioned that one day these codes, with user-friendly pre- and post-processor software and tailored for massively parallel multiprocessor shared-memory computational platforms, will be used by industry and researchers to assist in reducing and/or eliminating the environmental and economic barriers which limit full consideration of coal, shale, and biomass as energy sources, to retain energy security, and to remediate waste and ecological problems.

    9. The Fermilab long-baseline neutrino program

      SciTech Connect (OSTI)

      Goodman, M.; MINOS Collaboration

      1997-10-01

      Fermilab is embarking upon a neutrino oscillation program which includes a long-baseline neutrino experiment, MINOS. MINOS will be a 10 kiloton detector located 730 km northwest of Fermilab in the Soudan underground laboratory. It will be sensitive to neutrino oscillations with parameters above Δm² ≈ 3 × 10⁻³ eV² and sin²(2θ) ≈ 0.02.

    10. Systematic errors in long baseline oscillation experiments

      SciTech Connect (OSTI)

      Harris, Deborah A.; /Fermilab

      2006-02-01

      This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

    11. Computational Fluid Dynamics (CFD) Modeling for High Rate Pulverized Coal Injection (PCI) into the Blast Furnace

      SciTech Connect (OSTI)

      Dr. Chenn Zhou

      2008-10-15

      Pulverized coal injection (PCI) into the blast furnace (BF) has been recognized as an effective way to decrease the coke and total energy consumption along with minimization of environmental impacts. However, increasing the amount of coal injected into the BF is currently limited by the lack of knowledge of some issues related to the process. It is therefore important to understand the complex physical and chemical phenomena in the PCI process. Due to the difficulty in attaining true BF measurements, computational fluid dynamics (CFD) modeling has been identified as a useful technology to provide such knowledge. CFD simulation is powerful for providing detailed information on flow properties and performing parametric studies for process design and optimization. In this project, comprehensive 3-D CFD models have been developed to simulate the PCI process under actual furnace conditions. These models provide raceway size and flow property distributions. The results have provided guidance for optimizing the PCI process.

    12. Module 7 - Integrated Baseline Review and Change Control | Department of

      Office of Environmental Management (EM)

      This module focuses on integrated baseline reviews (IBR) and change control. It outlines the objective and responsibility of an integrated baseline review and discusses the change control process required for implementing earned value.

    13. Enabling a Highly-Scalable Global Address Space Model for Petascale Computing

      SciTech Connect (OSTI)

      Apra, Edoardo; Vetter, Jeffrey S; Yu, Weikuan

      2010-01-01

      Over the past decade, the trajectory to the petascale has been built on increased complexity and scale of the underlying parallel architectures. Meanwhile, software developers have struggled to provide tools that maintain the productivity of computational science teams using these new systems. In this regard, Global Address Space (GAS) programming models provide a straightforward and easy-to-use addressing model, which can lead to improved productivity. However, the scalability of GAS depends directly on the design and implementation of the runtime system on the target petascale distributed-memory architecture. In this paper, we describe the design, implementation, and optimization of the Aggregate Remote Memory Copy Interface (ARMCI) runtime library on the Cray XT5 2.3-PetaFLOPs computer at Oak Ridge National Laboratory. We optimized our implementation with the flow intimation technique that we introduce in this paper. Our optimized ARMCI implementation improves the scalability of both the Global Arrays (GA) programming model and a real-world chemistry application, NWChem, from small jobs up through 180,000 cores.

    14. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

      SciTech Connect (OSTI)

      Oprisan, Sorinel Adrian; Oprisan, Ana

      2005-03-31

      Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well-defined, smooth benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrosis. The second sub-system consists of cytotoxic active (effector) cells -- EC, with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutical, and radiotherapeutical treatments.
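
      A toy sketch of the two-subsystem picture follows: tumor cells on a lattice with a three-stage cycle (active, dormant, necrotic) and effector cells that kill active neighbors. All transition probabilities are invented and far simpler than the published model.

        # Toy lattice sketch of the two-subsystem idea: tumor cells with a
        # three-stage cycle (active -> dormant -> necrotic) and immune
        # effector cells that kill active neighbours.  Rates are invented.
        import random
        random.seed(1)

        EMPTY, ACTIVE, DORMANT, NECROTIC, EFFECTOR = range(5)
        P_DIVIDE, P_DORMANT, P_DIE, P_KILL = 0.3, 0.1, 0.05, 0.5

        def step(grid):
            n = len(grid)
            new = [row[:] for row in grid]
            for i in range(n):
                for j in range(n):
                    s = grid[i][j]
                    nbrs = [(i + di, j + dj)
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                            if 0 <= i + di < n and 0 <= j + dj < n]
                    if s == ACTIVE:
                        if random.random() < P_DORMANT:
                            new[i][j] = DORMANT              # starved interior cell
                        elif random.random() < P_DIVIDE:
                            x, y = random.choice(nbrs)
                            if grid[x][y] == EMPTY:
                                new[x][y] = ACTIVE           # proliferation
                    elif s == DORMANT and random.random() < P_DIE:
                        new[i][j] = NECROTIC                 # dormant cell dies
                    elif s == EFFECTOR:
                        for x, y in nbrs:
                            if grid[x][y] == ACTIVE and random.random() < P_KILL:
                                new[x][y] = NECROTIC         # immune kill
            return new

        grid = [[EMPTY] * 9 for _ in range(9)]
        grid[4][4], grid[4][5] = ACTIVE, EFFECTOR
        for _ in range(20):
            grid = step(grid)
        print(sum(row.count(ACTIVE) for row in grid), "active cells remain")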

    15. High-Performance Computer Modeling of the Cosmos-Iridium Collision

      SciTech Connect (OSTI)

      Olivier, S; Cook, K; Fasenfest, B; Jefferson, D; Jiang, M; Leek, J; Levatin, J; Nikolaev, S; Pertica, A; Phillion, D; Springer, K; De Vries, W

      2009-08-28

      This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

    16. Swelling in light water reactor internal components: Insights from computational modeling

      SciTech Connect (OSTI)

      Stoller, Roger E.; Barashev, Alexander V.; Golubov, Stanislav I.

      2015-08-01

      A modern cluster dynamics model has been used to investigate the materials and irradiation parameters that control microstructural evolution under the relatively low-temperature exposure conditions that are representative of the operating environment for in-core light water reactor components. The focus is on components fabricated from austenitic stainless steel. The model accounts for the synergistic interaction between radiation-produced vacancies and the helium that is produced by nuclear transmutation reactions. Cavity nucleation rates are shown to be relatively high in this temperature regime (275 to 325°C), but are sensitive to assumptions about the fine-scale microstructure produced under low-temperature irradiation. The cavity nucleation rates observed run counter to the expectation that void swelling would not occur under these conditions. This expectation was based on previous research on void swelling in austenitic steels in fast reactors; the misleading impression arose primarily from an absence of relevant data. The results of the computational modeling are generally consistent with recent data obtained by examining ex-service components. However, it has been shown that the sensitivity of the model's predictions of low-temperature swelling behavior to assumptions about the primary damage source term and the specification of the mean-field sink strengths is somewhat greater than that observed at higher temperatures. Further assessment of the mathematical model is underway to meet the long-term objective of this research, which is to provide a predictive model of void swelling at relevant lifetime exposures to support extended reactor operations.

    17. Physical and Computational Modeling for Chemical and Biological Weapons Airflow Applications

      SciTech Connect (OSTI)

      McEligot, Donald Marinus; Mc Creery, Glenn Ernest; Pink, Robert John; Barringer, C.; Knight, K. J.

      2002-11-01

      There is a need for information on dispersion and infiltration of chemical and biological agents in complex building environments. A recent collaborative study conducted at the Idaho National Engineering and Environmental Laboratory (INEEL) and Bechtel Corporation Research and Development had the objective of assessing computational fluid dynamics (CFD) models for simulation of flow around complicated buildings through a comparison of experimental and numerical results. The test facility used in the experiments was INEEL’s unique large Matched-Index-of-Refraction (MIR) flow system. The CFD code used for modeling was Fluent, a widely available commercial flow simulation package. For the experiment, a building plan was selected to approximately represent an existing facility. It was found that predicted velocity profiles from above the building and in front of the building were in good agreement with the measurements.

    18. Predicting adenocarcinoma recurrence using computational texture models of nodule components in lung CT

      SciTech Connect (OSTI)

      Depeursinge, Adrien; Yanagawa, Masahiro; Leung, Ann N.; Rubin, Daniel L.

      2015-04-15

      Purpose: To investigate the importance of presurgical computed tomography (CT) intensity and texture information from ground-glass opacities (GGO) and solid nodule components for the prediction of adenocarcinoma recurrence. Methods: For this study, 101 patients with surgically resected stage I adenocarcinoma were selected. During the follow-up period, 17 patients had disease recurrence, with six associated cancer-related deaths. GGO and solid tumor components were delineated on presurgical CT scans by a radiologist. Computational texture models of GGO and solid regions were built using linear combinations of steerable Riesz wavelets learned with linear support vector machines (SVMs). Unlike other traditional texture attributes, the proposed texture models are designed to encode local image scales and directions that are specific to GGO and solid tissue. The responses of the locally steered models were used as texture attributes and compared to the responses of unaligned Riesz wavelets. The texture attributes were combined with CT intensities to predict tumor recurrence and patient hazard according to disease-free survival (DFS) time. Two families of predictive models were compared: LASSO and SVMs, and their survival counterparts, Cox-LASSO and survival SVMs. Results: The best-performing predictive model of patient hazard was associated with a concordance index (C-index) of 0.81 ± 0.02 and was based on the combination of the steered models and CT intensities with survival SVMs. The same feature group and the LASSO model yielded the highest area under the receiver operating characteristic curve (AUC) of 0.8 ± 0.01 for predicting tumor recurrence, although no statistically significant difference was found when compared to using intensity features solely. For all models, the performance was found to be significantly higher when image attributes were based on the solid components solely versus using the entire tumors (p < 3.08 × 10⁻⁵). Conclusions: This study...
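
      The concordance index used to score the hazard models can be computed from pairwise comparisons of predicted risk and observed disease-free survival; a minimal sketch with made-up, right-censored data follows, counting ties as half-concordant.

        # Minimal sketch of the concordance index (C-index): over all
        # comparable patient pairs, the fraction where higher predicted risk
        # corresponds to the earlier observed event.  Data are invented.
        def c_index(times, events, risks):
            concordant, comparable = 0.0, 0
            n = len(times)
            for i in range(n):
                for j in range(n):
                    # pair comparable if i's event is observed before time j
                    if events[i] and times[i] < times[j]:
                        comparable += 1
                        if risks[i] > risks[j]:
                            concordant += 1.0
                        elif risks[i] == risks[j]:
                            concordant += 0.5
            return concordant / comparable

        times  = [5, 8, 12, 20]        # disease-free survival times (months)
        events = [1, 1, 0, 0]          # 1 = recurrence observed, 0 = censored
        risks  = [0.9, 0.6, 0.4, 0.2]  # model-predicted hazard scores
        print(c_index(times, events, risks))   # 1.0 for perfectly ranked data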

    19. Validation of the thermospheric vector spherical harmonic (VSH) computer model. Master's thesis

      SciTech Connect (OSTI)

      Davis, J.L.

      1991-01-01

      A semi-empirical computer model of the lower thermosphere has been developed that provides a description of the composition and dynamics of the thermosphere (Killeen et al., 1992). Input variables needed to run the VSH model include time, space, and geophysical conditions. One of the output variables the model provides, neutral density, is of particular interest to the U.S. Air Force. Neutral densities vary both as a result of changes in solar flux (e.g., the solar cycle) and as a result of changes in the magnetosphere (e.g., large changes occur in neutral density during geomagnetic storms). Satellites in earth orbit experience aerodynamic drag due to the atmospheric density of the thermosphere. The variability in neutral density described above affects the drag a satellite experiences and, as a result, can change the orbital characteristics of the satellite. These changes make it difficult to track the satellite's position. Therefore, it is particularly important to ensure that the accuracy of the model's neutral density is optimized for all input parameters. To accomplish this, a validation program was developed to evaluate the strengths and weaknesses of the model's density output by comparing it to SETA-2 (satellite electrostatic accelerometer) total mass density measurements.

    20. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

      SciTech Connect (OSTI)

      G. R. Odette; G. E. Lucas

      2005-11-15

      This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3-f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

    1. Computational model of collisional-radiative nonequilibrium plasma in an air-driven type laser propulsion

      SciTech Connect (OSTI)

      Ogino, Yousuke; Ohnishi, Naofumi

      2010-05-06

      A thrust power of a gas-driven laser-propulsion system is obtained through interaction with a propellant gas heated by laser energy. Therefore, understanding the nonequilibrium nature of laser-produced plasma is essential for increasing the available thrust force and for improving the energy conversion efficiency from the laser to the propellant gas. In this work, a time-dependent collisional-radiative model for air plasma has been developed to study the effects of nonequilibrium atomic and molecular processes on population densities for air-driven laser propulsion. Many elementary processes are considered over the number density range 10¹² cm⁻³ ≤ N ≤ 10¹⁹ cm⁻³ and the temperature range 300 K ≤ T ≤ 40,000 K. We then compute the unsteady behavior of pulse-heated air plasma. When the ionization relaxation time is of the same order as the time scale of the heating pulse, the effects of unsteady ionization are important for estimating air plasma states. From parametric computations, we determine the appropriate conditions for the collisional-radiative steady-state, local thermodynamic equilibrium, and corona equilibrium models in that density and temperature range.
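
      A collisional-radiative model reduces to a system of coupled rate equations for level populations; the toy two-level sketch below, with invented rate coefficients, shows the form of those equations and their relaxation to a steady state.

        # Toy sketch of a collisional-radiative rate equation: a two-level
        # atom with electron-impact excitation/de-excitation and spontaneous
        # decay,
        #   dn1/dt = -ne*k_exc*n1 + ne*k_dex*n2 + A21*n2,  dn2/dt = -dn1/dt.
        # Coefficients are invented; a real CR model couples many levels.
        import numpy as np
        from scipy.integrate import solve_ivp

        ne, k_exc, k_dex, A21 = 1e14, 1e-9, 2e-9, 1e6   # cm^-3, cm^3/s, cm^3/s, 1/s

        def rates(t, n):
            n1, n2 = n
            up   = ne * k_exc * n1
            down = ne * k_dex * n2 + A21 * n2
            return [down - up, up - down]

        sol = solve_ivp(rates, (0.0, 1e-5), [1e12, 0.0], rtol=1e-8)
        print(sol.y[:, -1])   # approach to collisional-radiative steady state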

    2. Verification of warhead dismantlement and the importance of baseline validation

      SciTech Connect (OSTI)

      Buonpane, L.M.; Strait, R.S.

      1991-01-01

      This paper presents an approach for evaluating verification regimes for nuclear warhead dismantlement. The approach is an adaptation of the traditional nuclear materials management model. As such the approach integrates the difficulties of verifying both stockpile estimates and numbers of warheads dismantled. Both random uncertainties and systematic uncertainties are considered in this approach. By making some basic assumptions about the relative uncertainties surrounding the stockpile estimates and the numbers of warheads dismantled, the authors illustrate their relative impacts on overall verification ability. The results highlight the need for increased attention on the problem of validating baseline declarations of stockpile size.
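
      The materials-accounting flavor of the approach can be illustrated with simple variance propagation: random and systematic components add in quadrature, and the combined sigma sets the smallest discrepancy that can be flagged. All numbers below are illustrative, not from the paper.

        # Sketch of the materials-accounting view of dismantlement
        # verification: a diversion must be detected against the combined
        # random and systematic uncertainty in the stockpile baseline and
        # the dismantlement count.  Sigmas and the 2-sigma alarm threshold
        # are illustrative values only.
        import math

        sigma_baseline_random  = 50.0   # warheads, random component
        sigma_baseline_system  = 120.0  # warheads, systematic (records bias)
        sigma_dismantle_random = 20.0

        sigma_total = math.sqrt(sigma_baseline_random**2
                                + sigma_baseline_system**2
                                + sigma_dismantle_random**2)
        detectable = 2.0 * sigma_total  # smallest discrepancy flagged at ~2 sigma
        print(f"combined sigma = {sigma_total:.0f}, "
              f"~2-sigma threshold = {detectable:.0f}")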

    3. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1993

      SciTech Connect (OSTI)

      1993-12-31

      The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. During the period of this report, a Topical Report summarizing the Baseline Case design was drafted and issued to DOE/PETC for review and release approval. Major effort was spent on the Alternate Upgrading and Refining Case. Its design specifications were finalized, and material and utility balances completed. Initial capital cost estimates were developed. A Topical Report, summarizing the Alternative (ZSM-5) Upgrading and Refining Case design, is being drafted. Under Task 4, some of the individual plant models were expanded and enhanced. An overall ASPEN/SP process simulation model was developed for the Baseline Design Case by combining the individual models of Areas 100, 200, and 300. In addition, a separate model for the simplified product refining area, Area 300, of the Alternate Upgrading and Refining Case was developed. Under Task 7, cost and schedule control was the primary activity. A technical paper entitled "Baseline Design/Economics for Advanced Fischer-Tropsch Technology" was presented at the DOE/PETC Annual Contractors Review Conference, held at Pittsburgh, Pennsylvania, on September 27-29, 1993. A contract amendment was submitted to include the Kerr-McGee ROSE unit in the Baseline design case and to convert the PFS models from the ASPEN/SP to the ASPEN/Plus software code.

    4. Why applicants should use computer simulation models to comply with the FERC's new merger policy

      SciTech Connect (OSTI)

      Frankena, M.W.; Morris, J.R.

      1997-02-01

      Computer models for electric utility use in complying with the US Federal Energy Regulatory Commission policy on mergers are described. Four types of simulation models that are widely used in the electric power industry are considered as tools for analyzing market power issues: dispatch/transportation models, dispatch/unit-commitment models, load-flow models, and load-flow/dispatch models. Basic model capabilities and limitations are described. Uses of the models for other purposes are also noted, including regulatory filings, antitrust litigation, and evaluation of pricing strategies.

    5. Subsurface Multiphase Flow and Multicomponent Reactive Transport Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

      2007-07-16

      Numerical modeling has become a critical tool to the U.S. Department of Energy for evaluating the environmental impact of alternative energy sources and remediation strategies for legacy waste sites. Unfortunately, the physical and chemical complexity of many sites overwhelms the capabilities of even most state of the art groundwater models. Of particular concern are the representation of highly-heterogeneous stratified rock/soil layers in the subsurface and the biological and geochemical interactions of chemical species within multiple fluid phases. Clearly, there is a need for higher-resolution modeling (i.e. more spatial, temporal, and chemical degrees of freedom) and increasingly mechanistic descriptions of subsurface physicochemical processes. We present SciDAC-funded research being performed in the development of PFLOTRAN, a parallel multiphase flow and multicomponent reactive transport model. Written in Fortran90, PFLOTRAN is founded upon PETSc data structures and solvers. We are employing PFLOTRAN in the simulation of uranium transport at the Hanford 300 Area, a contaminated site of major concern to the Department of Energy, the State of Washington, and other government agencies. By leveraging the billions of degrees of freedom available through high-performance computation using tens of thousands of processors, we can better characterize the release of uranium into groundwater and its subsequent transport to the Columbia River, and thereby better understand and evaluate the effectiveness of various proposed remediation strategies.

    6. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

      SciTech Connect (OSTI)

      David P. Colton

      2007-02-28

      The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

    7. COMPUTATIONAL THERMODYNAMIC MODELING OF HOT CORROSION OF ALLOYS HAYNES 242 AND HASTELLOY N FOR MOLTEN SALT SERVICE

      SciTech Connect (OSTI)

      Michael V. Glazoff; Piyush Sabharwall; Akira Tokuhiro

      2014-09-01

      An evaluation of the thermodynamic aspects of hot corrosion of the superalloys Haynes 242 and Hastelloy N in eutectic mixtures of KF and ZrF4 is carried out for the development of the Advanced High Temperature Reactor (AHTR). This work models the behavior of several superalloys, potential candidates for the AHTR, using a computational thermodynamics tool (ThermoCalc), leading to the development of a thermodynamic description of the molten salt eutectic mixtures and, on that basis, a mechanistic prediction of hot corrosion. The results from these studies indicated that the principal mechanism of hot corrosion was associated with chromium leaching for all of the superalloys described above. However, Hastelloy N displayed the best hot corrosion performance. This was not surprising, given that it was developed originally to withstand the harsh conditions of the molten salt environment. The results obtained in this study provided confidence in the employed methods of computational thermodynamics and could be used in future alloy design efforts. Finally, several potential solutions to mitigate hot corrosion were proposed for further exploration, including coating development and controlled scaling of intermediate compounds in the KF-ZrF4 system.

    8. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC RESEARCH: Computational Fluid Dynamics, Computational Structural Mechanics, Transportation Systems Modeling. The TRACC Computational Clusters: With the addition of a new cluster called Zephyr that was made operational in September 2012, TRACC now offers two clusters to choose from: Zephyr and the original cluster, now named Phoenix. Zephyr was acquired from Atipa Technologies, and it is a 92-node system with each node having two AMD...

    9. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

      SciTech Connect (OSTI)

      Kaper, Tasso J. Kramer, Mark A.; Rotstein, Horacio G.

      2013-12-15

      Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association and analysis of rhythms with diseases are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives on rhythms and the dynamic transitions between them that are associated with various diseases.

    10. Extraction of actinides by multi-dentate diamides and their evaluation with computational molecular modeling

      SciTech Connect (OSTI)

      Sasaki, Y.; Kitatsuji, Y.; Hirata, M.; Kimura, T.; Yoshizuka, K.

      2008-07-01

      Multi-dentate diamides have been synthesized and examined for actinide (An) extractions. Bi- and tridentate extractants are the focus of this work. The extraction of actinides was performed from 0.1-6 M HNO₃ into organic solvents. It was obvious that N,N,N',N'-tetra-alkyl-diglycolamide (DGA) derivatives, 2,2'-(methylimino)bis(N,N-dioctyl-acetamide) (MIDOA), and N,N'-dimethyl-N,N'-dioctyl-2-(3-oxa-pentadecane)-malonamide (DMDOOPDMA) have relatively high D values (D(Pu) > 70). The following notable results using DGA extractants were obtained: (1) DGAs with short alkyl chains give higher D values than those with long alkyl chains; (2) DGAs with long alkyl chains have high solubility in n-dodecane. Computational molecular modeling was also used to elucidate the effects of the structural and electronic properties of the reagents on their different extractabilities. (authors)

    11. DualTrust: A Distributed Trust Model for Swarm-Based Autonomic Computing Systems

      SciTech Connect (OSTI)

      Maiden, Wendy M.; Dionysiou, Ioanna; Frincke, Deborah A.; Fink, Glenn A.; Bakken, David E.

      2011-02-01

      For autonomic computing systems that utilize mobile agents and ant colony algorithms for their sensor layer, trust management is important for the acceptance of the mobile agent sensors and to protect the system from malicious behavior by insiders and entities that have penetrated network defenses. This paper examines the trust relationships, evidence, and decisions in a representative system and finds that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarm. We then propose the DualTrust conceptual trust model. By addressing the autonomic manager’s bi-directional primary relationships in the ACS architecture, DualTrust is able to monitor the trustworthiness of the autonomic managers, protect the sensor swarm in a scalable manner, and provide global trust awareness for the orchestrating autonomic manager.

    12. A computational model for thermal fluid design analysis of nuclear thermal rockets

      SciTech Connect (OSTI)

      Given, J.A.; Anghaie, S.

      1997-01-01

      A computational model for simulation and design analysis of nuclear thermal propulsion systems has been developed. The model simulates a full-topping expander cycle engine system and the thermofluid dynamics of the core coolant flow, accounting for the real gas properties of the hydrogen propellant/coolant throughout the system. Core thermofluid studies reveal that near-wall heat transfer models currently available may not be applicable to conditions encountered within some nuclear rocket cores. Additionally, the possibility of a core thermal fluid instability at low mass fluxes and the effects of the core power distribution are investigated. Results indicate that for tubular core coolant channels, thermal fluid instability is not an issue within the possible range of operating conditions in these systems. Findings also show the advantages of having a nonflat centrally peaking axial core power profile from a fluid dynamic standpoint. The effects of rocket operating conditions on system performance are also investigated. Results show that high temperature and low pressure operation is limited by core structural considerations, while low temperature and high pressure operation is limited by system performance constraints. The utility of these programs for finding these operational limits, optimum operating conditions, and thermal fluid effects is demonstrated.

    13. Subsurface Multiphase Flow and Multicomponent Reactive Transport Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

      2007-08-01

      Numerical modeling has become a critical tool to the Department of Energy for evaluating the environmental impact of alternative energy sources and remediation strategies for legacy waste sites. Unfortunately, the physical and chemical complexity of many sites overwhelms the capabilities of even most state of the art groundwater models. Of particular concern are the representation of highly-heterogeneous stratified rock/soil layers in the subsurface and the biological and geochemical interactions of chemical species within multiple fluid phases. Clearly, there is a need for higher-resolution modeling (i.e. more spatial, temporal, and chemical degrees of freedom) and increasingly mechanistic descriptions of subsurface physicochemical processes. We present research being performed in the development of PFLOTRAN, a parallel multiphase flow and multicomponent reactive transport model. Written in Fortran90, PFLOTRAN is founded upon PETSc data structures and solvers and has exhibited impressive strong scalability on up to 4000 processors on the ORNL Cray XT3. We are employing PFLOTRAN in the simulation of uranium transport at the Hanford 300 Area, a contaminated site of major concern to the Department of Energy, the State of Washington, and other government agencies where overly-simplistic historical modeling erroneously predicted decade removal times for uranium by ambient groundwater flow. By leveraging the billions of degrees of freedom available through high-performance computation using tens of thousands of processors, we can better characterize the release of uranium into groundwater and its subsequent transport to the Columbia River, and thereby better understand and evaluate the effectiveness of various proposed remediation strategies.
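
      PFLOTRAN itself is a parallel Fortran90/PETSc code; as a stand-in, the serial sketch below solves a 1-D advection-dispersion equation with linear sorption retardation, the simplest member of the family of transport equations involved. All parameter values are invented.

        # Serial 1-D sketch of the kind of transport PFLOTRAN solves in
        # parallel: advection-dispersion of a sorbing solute with
        # retardation factor R,
        #   R dc/dt = -v dc/dx + D d2c/dx2,
        # using explicit upwind finite differences.  Parameters are invented.
        import numpy as np

        nx, dx, dt = 200, 0.5, 0.05      # grid cells, m, days
        v, D, R = 1.0, 0.05, 2.0         # m/day, m^2/day, retardation

        c = np.zeros(nx)
        c[0] = 1.0                       # fixed-concentration inflow boundary
        for _ in range(2000):            # 100 days
            adv  = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
            disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # dispersion
            c[1:-1] += dt * (adv + disp) / R
            c[0], c[-1] = 1.0, c[-2]     # boundary conditions
        print(f"plume front near x = {np.argmax(c < 0.5) * dx:.1f} m after 100 days")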

    14. Global Nuclear Energy Partnership Waste Treatment Baseline

      SciTech Connect (OSTI)

      Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

      2008-05-01

      The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

    15. GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      James Menart

      2013-06-07

      This file contains a zipped archive of the many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. The program also models the heat pump in conjunction with the heat transfer occurring; GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures, and heat pump performance. On top of this, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system or a comparable gas, oil, or propane heating system with a vapor-compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user-friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple, and the program produces many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If MATLAB is not available on your computer, you can download the program MCRInstaller.exe, the 64-bit version, from the MATLAB website or from this geothermal depository. This is a free download which will enable you to run GEO2D.
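
      The finite-difference core of such a model can be illustrated with an explicit 2-D conduction update around a borehole held at the loop fluid temperature; the grid, time step, and properties below are illustrative, not GEO2D's actual numerics.

        # Minimal sketch of the explicit finite-difference conduction update
        # at the heart of a GSHP ground model: a 2-D grid around a borehole
        # held at the loop fluid temperature.  Values are illustrative.
        import numpy as np

        n, dx, dt = 51, 0.1, 60.0        # grid size, m, s
        alpha = 1e-6                     # ground thermal diffusivity, m^2/s

        T = np.full((n, n), 12.0)        # undisturbed ground temperature, C
        for _ in range(10000):           # roughly one week of operation
            T[n // 2, n // 2] = 4.0      # borehole wall at loop temperature
            lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
                   - 4 * T[1:-1, 1:-1]) / dx**2
            T[1:-1, 1:-1] += alpha * dt * lap
        print(f"ground 1 m from borehole: {T[n // 2, n // 2 + 10]:.2f} C")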

    16. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1994

      SciTech Connect (OSTI)

      1994-12-31

      The objectives of the study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Develop a process flowsheet simulation (PFS) model. During the reporting period, work progressed on Tasks 1, 2, 4, 6, and 7. This report covers work done during the period and consists of the following sections: Introduction and Summary; Task 1: Baseline Design and Alternatives; Task 2: Evaluate Baseline and Alternative Economics; Task 4: Process Flowsheet Simulation (PFS) Model; Task 6: Document the PFS Model and Develop a DOE Training Session on Its Use; and the Project Management and Staffing Report.

    17. Energy Intensity Baselining and Tracking Guidance | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      The Energy Intensity Baselining and Tracking Guidance for the Better Buildings, Better Plants Program helps companies meet the program's reporting requirements by describing the steps necessary to develop an energy consumption and energy intensity baseline and to calculate consumption and intensity changes over time. Most of the calculation steps described...
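
      The core arithmetic of intensity baselining is short: intensity is energy use normalized by production, and improvement is measured against the baseline-year intensity. The plant data below are invented.

        # Sketch of the core arithmetic in intensity baselining: energy
        # intensity is energy use normalized by production, and improvement
        # is measured against the baseline-year intensity.  Data are invented.
        baseline = {"energy_mmbtu": 120_000, "production_tons": 40_000}  # base year
        current  = {"energy_mmbtu": 115_000, "production_tons": 42_000}  # report year

        def intensity(year):
            return year["energy_mmbtu"] / year["production_tons"]

        improvement = 100.0 * (1.0 - intensity(current) / intensity(baseline))
        print(f"intensity change vs. baseline: {improvement:.1f}% improvement")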

    18. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

      SciTech Connect (OSTI)

      Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

      2015-09-01

      The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

    19. Computational Nanophotonics: Model Optical Interactions and Transport in Tailored Nanosystem Architectures

      SciTech Connect (OSTI)

      Stockman, Mark; Gray, Steven

      2014-02-21

      The program is directed toward development of new computational approaches to photoprocesses in nanostructures whose geometry and composition are tailored to obtain desirable optical responses. The emphasis of this specific program is on the development of computational methods and prediction and computational theory of new phenomena of optical energy transfer and transformation on the extreme nanoscale (down to a few nanometers).

    20. Computational Tools for Predictive Modeling of Properties in Complex Actinide Systems

      SciTech Connect (OSTI)

      Autschbach, Jochen; Govind, Niranjan; Atta Fynn, Raymond; Bylaska, Eric J.; Weare, John H.; de Jong, Wibe A.

      2015-03-30

      In this chapter we focus on methodological and computational aspects that are key to accurately modeling the spectroscopic and thermodynamic properties of molecular systems containing actinides within the density functional theory (DFT) framework. Our focus is on properties that require either an accurate relativistic all-electron description or an accurate description of the dynamical behavior of actinide species in an environment at finite temperature, or both. The implementation of the methods and the calculations discussed in this chapter were done with the NWChem software suite (Valiev et al. 2010). In the first two sections we discuss two methods that account for relativistic effects, the ZORA and the X2C Hamiltonian. Section 1.2.1 discusses the implementation of the approximate relativistic ZORA Hamiltonian and its extension to magnetic properties. Section 1.3 focuses on the exact X2C Hamiltonian and the application of this methodology to obtain accurate molecular properties. In Section 1.4 we examine the role of a dynamical environment at finite temperature as well as the presence of other ions on the thermodynamics of hydrolysis and exchange reaction mechanisms. Finally, Section 1.5 discusses the modeling of XAS (EXAFS, XANES) properties in realistic environments accounting for both the dynamics of the system and (for XANES) the relativistic effects.

    1. Wind Turbine Modeling for Computational Fluid Dynamics: December 2010 - December 2012

      SciTech Connect (OSTI)

      Tossas, L. A. M.; Leonardi, S.

      2013-07-01

      With the shortage of fossil fuel and the increasing environmental awareness, wind energy is becoming more and more important. As the market for wind energy grows, wind turbines and wind farms are becoming larger. Current utility-scale turbines extend a significant distance into the atmospheric boundary layer. Therefore, the interaction between the atmospheric boundary layer and the turbines and their wakes needs to be better understood. The turbulent wakes of upstream turbines affect the flow field of the turbines behind them, decreasing power production and increasing mechanical loading. With a better understanding of this type of flow, wind farm developers could plan better-performing, less maintenance-intensive wind farms. Simulating this flow using computational fluid dynamics is one important way to gain a better understanding of wind farm flows. In this study, we compare the performance of actuator disc and actuator line models in producing wind turbine wakes and the wake-turbine interaction between multiple turbines. We also examine parameters that affect the performance of these models, such as grid resolution, the use of a tip-loss correction, and the way in which the turbine force is projected onto the flow field.
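
      The actuator disc concept replaces the rotor with a body force sized from one-dimensional momentum theory; the sketch below computes that force for an assumed induction factor and distributes it uniformly over the disc cells, a simplification of the weighted projections used in practice.

        # Sketch of the actuator disc idea: the rotor is replaced by a body
        # force on the flow, sized from 1-D momentum theory.  For an
        # induction factor a, CT = 4a(1-a) and thrust T = 0.5*rho*A*U0^2*CT;
        # the force is then spread (here uniformly) over cells on the disc.
        import math

        rho, U0, radius = 1.225, 8.0, 40.0   # air density, wind speed, rotor radius
        a = 0.25                             # axial induction factor (assumed)

        A = math.pi * radius**2
        CT = 4.0 * a * (1.0 - a)                    # thrust coefficient, = 0.75
        thrust = 0.5 * rho * A * U0**2 * CT

        n_cells = 500                               # CFD cells covered by the disc
        force_per_cell = thrust / n_cells           # uniform actuator-disc loading
        print(f"CT = {CT:.2f}, total thrust = {thrust / 1e3:.1f} kN")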

    2. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

      SciTech Connect (OSTI)

      Maranas, Costas D

      2012-05-21

      An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism not just in skeletal pathways and a handful of microorganisms but in truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing, and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
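
      Genome-scale flux analysis is commonly posed as a linear program: maximize an objective flux subject to steady state (Sv = 0) and capacity bounds. The toy three-reaction network below stands in for a real genome-scale reconstruction.

        # Sketch of the linear-programming core of genome-scale flux
        # analysis: maximize an objective flux subject to steady state
        # (S v = 0) and capacity bounds.  The toy network
        # (uptake -> conversion -> biomass) is illustrative only.
        import numpy as np
        from scipy.optimize import linprog

        # columns: v_uptake, v_convert, v_biomass; rows: metabolites A, B
        S = np.array([
            [1, -1,  0],    # A produced by uptake, consumed by conversion
            [0,  1, -1],    # B produced by conversion, consumed by biomass
        ])
        bounds = [(0, 10), (0, 8), (0, None)]   # uptake capped at 10, enzyme at 8

        res = linprog(c=[0, 0, -1],             # maximize v_biomass
                      A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print(res.x)                            # [8. 8. 8.]; conversion limits growth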

    3. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

      SciTech Connect (OSTI)

      Tucker, M.D.; Khan, M.A.

      1996-04-01

      The Uranium Mill Tailings Remedial Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226, and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump-and-treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended.

    4. Computer modelling of the reduction of rare earth dopants in barium aluminate

      SciTech Connect (OSTI)

      Rezende, Marcos V. dos S; Valerio, Mario E.G.; Jackson, Robert A.

      2011-08-15

      Long-lasting phosphorescence in barium aluminates can be achieved by doping with rare earth ions in divalent charge states. The rare earth ions are initially in a trivalent charge state but are reduced to a divalent charge state before being doped into the material. In this paper, the reduction of trivalent rare earth ions in the BaAl₂O₄ lattice is studied by computer simulation, with the energetics of the whole reduction and doping process being modelled by two methods, one based on single-ion doping and one which allows dopant concentrations to be taken into account. A range of different reduction schemes are considered and the most energetically favourable schemes identified. Graphical abstract: the doping and subsequent reduction of a rare earth ion into the barium aluminate lattice. Highlights: the doping of barium aluminate with rare earth ions reduced in a range of atmospheres has been modelled; the overall solution energy for the doping process for each ion in each reducing atmosphere is calculated using two methods; the lowest-energy reduction process is predicted and compared with experimental results.

    5. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

      SciTech Connect (OSTI)

      Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

      2006-10-01

      Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
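      The sampling strategy described above can be conveyed with a deliberately small sketch. This is the general belief/plausibility bookkeeping of evidence theory, not the authors' code; the focal elements, masses, and model are invented.

        # Propagate an evidence-theory (Dempster-Shafer) input description through a
        # model by sampling each focal element: an element counts toward belief if it
        # lies entirely inside the output event, toward plausibility if it intersects it.
        import numpy as np

        focal_elements = [((0.0, 2.0), 0.5),   # (input interval, mass) -- invented
                          ((1.0, 3.0), 0.3),
                          ((2.0, 4.0), 0.2)]

        def model(x):
            return x ** 2                      # stand-in for an expensive simulation

        threshold = 4.0                        # event of interest: output <= 4
        rng = np.random.default_rng(0)
        belief = plausibility = 0.0
        for (lo, hi), mass in focal_elements:
            y = model(rng.uniform(lo, hi, 1000))
            if y.max() <= threshold:           # whole element inside the event
                belief += mass
            if y.min() <= threshold:           # element touches the event
                plausibility += mass
        print(f"Bel = {belief:.2f}, Pl = {plausibility:.2f}")   # Bel = 0.50, Pl = 0.80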

    6. Noise analysis of genome-scale protein synthesis using a discrete computational model of translation

      SciTech Connect (OSTI)

      Racle, Julien; Hatzimanikatis, Vassily; Stefaniuk, Adam Jan

      2015-07-28

      Noise in genetic networks has been the subject of extensive experimental and computational studies. However, very few of these studies have considered noise properties using mechanistic models that account for the discrete movement of ribosomes and RNA polymerases along their corresponding templates (messenger RNA (mRNA) and DNA). The large size of these systems, which scales with the number of genes, mRNA copies, codons per mRNA, and ribosomes, is responsible for some of the challenges. Additionally, one should be able to describe the dynamics of ribosome exchange between the free ribosome pool and those bound to mRNAs, as well as how mRNA species compete for ribosomes. We developed an efficient algorithm for stochastic simulations that addresses these issues and used it to study the contribution and trade-offs of noise to translation properties (rates, time delays, and rate-limiting steps). The algorithm scales linearly with the number of mRNA copies, which allowed us to study the importance of genome-scale competition between mRNAs for the same ribosomes. We determined that noise is minimized under conditions maximizing the specific synthesis rate. Moreover, sensitivity analysis of the stochastic system revealed the importance of the elongation rate in the resultant noise, whereas the translation initiation rate constant was more closely related to the average protein synthesis rate. We observed significant differences between our results and the noise properties of the most commonly used translation models. Overall, our studies demonstrate that the use of full mechanistic models is essential for the study of noise in translation and transcription.
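      For readers unfamiliar with this class of simulation, a far coarser sketch conveys the idea: the classic two-stage birth-death model of gene expression simulated with Gillespie's algorithm. The rate constants are invented, and unlike the paper's model this does not track individual ribosome movement along the mRNA.

        # Gillespie simulation of a two-stage (mRNA -> protein) birth-death model;
        # a Fano factor above 1 shows super-Poissonian noise from translation bursts.
        import numpy as np

        rng = np.random.default_rng(1)
        k_m, g_m, k_p, g_p = 1.0, 0.1, 5.0, 0.05   # synthesis/decay rate constants
        m = p = 0
        t, t_end = 0.0, 500.0
        samples = []
        while t < t_end:
            rates = np.array([k_m, g_m * m, k_p * m, g_p * p])
            total = rates.sum()
            t += rng.exponential(1.0 / total)      # time to the next reaction
            r = rng.uniform(0.0, total)
            if r < rates[0]:              m += 1   # mRNA transcribed
            elif r < rates[:2].sum():     m -= 1   # mRNA degraded
            elif r < rates[:3].sum():     p += 1   # protein translated
            else:                         p -= 1   # protein degraded
            samples.append(p)
        tail = np.array(samples[len(samples) // 2:])   # discard the transient
        print("Fano factor:", tail.var() / tail.mean())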

    7. Improving computer simulations of heat transfer for projecting fenestration products: Using radiation view-factor models

      SciTech Connect (OSTI)

      Griffith, B.; Tuerler, D.; Arasteh, D.K.; Curcija, D.

      1998-10-01

      The window well formed by the concave surface on the warm side of skylights and garden windows can cause surface heat-flow rates to be different for these projecting types of fenestration products than for normal planar windows. Current methods of simulating fenestration thermal conductance (U-factor) use constant boundary condition values for overall surface heat transfer. Simulations that account for local variations in surface heat transfer rates (radiation and convection) may be more accurate for rating and labeling window products whose surfaces project outside a building envelope. This paper, which presents simulation and experimental results for one projecting geometry, is the first step in documenting the importance of these local effects. A generic specimen, called the foam garden window, was used in simulations and experiments to investigate heat transfer of projecting surfaces. Experiments focused on a vertical cross section (measurement plane) located at the middle of the window well on the warm side of the specimen. The specimen was placed between laboratory thermal chambers that were operated at American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) winter heating design conditions. Infrared thermography was used to map surface temperatures. Air temperature and velocity were mapped throughout the measurement plane using a mechanical traversing system. Finite-element computer simulations that directly modeled element-to-element radiation were better able to match experimental data than simulations that used fixed coefficients for total surface heat transfer. Air conditions observed in the window well suggest that localized convective effects were the reason for the difference between actual and modeled surface temperatures. U-value simulation results were 5% to 10% lower when radiation was modeled directly.
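      The gap between a fixed film coefficient and directly modeled radiation comes down to the temperature dependence of the linearized radiative coefficient, which a few lines make explicit. The emissivity and temperatures below are illustrative assumptions, not values from the experiment.

        # Linearized radiative coefficient h_r between a surface and its surroundings;
        # h_r varies with local surface temperature, so one fixed value cannot
        # represent the cold spots inside a projecting window well.
        SIGMA, EPS = 5.670e-8, 0.9           # Stefan-Boltzmann (W/m^2K^4), emissivity
        T_env = 294.15                        # warm-side surroundings, K

        def h_radiative(T_s):
            """h_r = eps * sigma * (T_s^2 + T_env^2) * (T_s + T_env)."""
            return EPS * SIGMA * (T_s**2 + T_env**2) * (T_s + T_env)

        for T_s in (278.15, 285.15, 292.15):  # hypothetical local temperatures
            print(f"T_s = {T_s - 273.15:5.1f} C -> h_r = {h_radiative(T_s):.2f} W/m^2K")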

    8. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

      SciTech Connect (OSTI)

      Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang; Hu, Ying; Xiong, Jing; Zhang, Jianwei

      2015-01-15

      Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm{sup 3}) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm{sup 3}, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm{sup 3}, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0

    9. LTC vacuum blasting machine (concrete): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for both exposures is recommended because the demonstration took place outdoors, which may make the results unrepresentative; dust and noise levels are likely to be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.
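      For context, noise findings like these are typically reduced to a dose and an 8-hour time-weighted average (TWA) under the OSHA criterion (90 dBA PEL, 5-dB exchange rate). The exposure profile below is hypothetical, not data from this report.

        # OSHA noise dose: dose = 100 * sum(C_i / T_i), where T_i = 8 / 2^((L-90)/5)
        # is the permissible duration at level L; dose > 100% exceeds the PEL.
        import math

        def allowed_hours(level_dba):
            """Permissible exposure duration at a given A-weighted level."""
            return 8.0 / 2 ** ((level_dba - 90.0) / 5.0)

        exposures = [(95.0, 3.0), (88.0, 4.0)]   # (dBA, hours) -- hypothetical shifts
        dose = 100.0 * sum(hours / allowed_hours(level) for level, hours in exposures)
        twa = 16.61 * math.log10(dose / 100.0) + 90.0
        print(f"dose = {dose:.0f}%  TWA = {twa:.1f} dBA")   # dose > 100%: PEL exceeded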

    10. LTC vacuum blasting machine (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is likely that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    11. Pentek metal coating removal system: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU`s evaluation of efficiency and cost, this report covers evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER{reg_sign}, and VAC-PAC{reg_sign}. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER{reg_sign} uses solid needles for descaling activities. These hand tools are used with the VAC-PAC{reg_sign} vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is likely that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    12. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, X.

      1996-12-17

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

    13. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, Xucheng

      1996-01-01

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

    14. Arc melter demonstration baseline test results

      SciTech Connect (OSTI)

      Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O`Connor, W.K.; Turner, P.C.

      1994-07-01

      This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration program (BWID). Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites, are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

    15. Baseline air quality study at Fermilab

      SciTech Connect (OSTI)

      Dave, M.J.; Charboneau, R.

      1980-10-01

      Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously-monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies are also presented.

    16. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      SciTech Connect (OSTI)

      Musial, W.; Lawson, M.; Rooney, S.

      2013-02-01

      The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9–10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community, and to collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop comprised plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways from the workshop and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts, supply discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest what the most pressing MHK technology needs are and how the U.S. Department of Energy (DOE) and national laboratory resources can be utilized to assist the marine energy industry in the most effective manner.

    17. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

      SciTech Connect (OSTI)

      Musial, W.; Lawson, M.; Rooney, S.

      2013-02-01

      The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9-10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community and collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop comprised plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts and discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest how the U.S. Department of Energy and national laboratory resources can be utilized to most effectively assist the marine energy industry.

    18. Natural Abundance 17O Nuclear Magnetic Resonance and Computational Modeling Studies of Lithium Based Liquid Electrolytes

      SciTech Connect (OSTI)

      Deng, Xuchu; Hu, Mary Y.; Wei, Xiaoliang; Wang, Wei; Chen, Zhong; Liu, Jun; Hu, Jian Z.

      2015-07-01

      Natural abundance 17O NMR measurements were conducted on electrolyte solutions consisting of Li[CF3SO2NSO2CF3] (LiTFSI) dissolved in the solvents of ethylene carbonate (EC), propylene carbonate (PC), ethyl methyl carbonate (EMC), and their mixtures at various concentrations. It was observed that 17O chemical shifts of solvent molecules change with the concentration of LiTFSI. The chemical shift displacements of carbonyl oxygen are evidently greater than those of ethereal oxygen, strongly indicating that the Li+ ion is coordinated with carbonyl oxygen rather than ethereal oxygen. To understand the detailed molecular interaction, computational modeling of 17O chemical shifts was carried out on proposed solvation structures. By comparing the predicted chemical shifts with the experimental values, it is found that a Li+ ion is coordinated with four double-bonded oxygen atoms from EC, PC, EMC and the TFSI- anion. When EC, PC, and EMC are present in excess, the Li+-coordinated solvent molecules undergo rapid exchange with bulk solvent molecules, resulting in averaged 17O chemical shifts. Several kinds of solvation structures are identified, where the proportion of each structure in the liquid electrolytes investigated depends on the concentration of LiTFSI.
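      The averaging invoked in the exchange argument is simple fast-exchange population weighting; a two-site sketch shows the arithmetic. The shift values and bound fraction are invented placeholders, not the study's measurements.

        # Fast-exchange averaging: when bound and free solvent exchange quickly on
        # the NMR timescale, the observed shift is the population-weighted mean.
        delta_bound, delta_free = 175.0, 170.0   # ppm, hypothetical carbonyl 17O sites
        fraction_bound = 0.25                     # assumed fraction of solvent on Li+

        delta_obs = fraction_bound * delta_bound + (1 - fraction_bound) * delta_free
        print(f"observed shift = {delta_obs:.2f} ppm")   # 171.25 ppm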

    19. Transfer matrix computation of critical polynomials for two-dimensional Potts models

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Jacobsen, Jesper Lykke; Scullard, Christian R.

      2013-02-04

      In our previous work, we showed that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e{sup K} - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8{sup 2}), kagome, and (3, 12{sup 2}) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v{sub c} obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v{sub c}(4, 8{sup 2}) = 3.742 489 (4), v{sub c}(kagome) = 1.876 459 7 (2), and v{sub c}(3, 12{sup 2}) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
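      Once a critical polynomial is in hand, extracting the critical point is a root-finding exercise. A minimal sketch, assuming the well-known result that the smallest square-lattice basis yields PB(q, v) = v{sup 2} - q, whose positive root is the exact critical point; larger bases on other lattices give higher-degree polynomials handled the same way.

        # Find the physical root of a critical polynomial numerically.
        import numpy as np

        q = 3.0
        coeffs = [1.0, 0.0, -q]                    # v^2 - q for the square lattice
        roots = np.roots(coeffs)
        v_c = max(r.real for r in roots if abs(r.imag) < 1e-12)
        print(f"q = {q}: v_c = {v_c:.6f}")         # sqrt(3) ~ 1.732051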

    20. Computer modeling of electrical and thermal performance during bipolar pulsed radiofrequency for pain relief

      SciTech Connect (OSTI)

      Pérez, Juan J.; Pérez-Cajaraville, Juan J.; Muñoz, Víctor; Berjano, Enrique

      2014-07-15

      Purpose: Pulsed RF (PRF) is a nonablative technique for treating neuropathic pain. Bipolar PRF application is currently aimed at creating a “strip lesion” to connect the electrode tips; however, the electrical and thermal performance during bipolar PRF is currently unknown. The objective of this paper was to study the temperature and electric field distributions during bipolar PRF. Methods: The authors developed computer models to study temperature and electric field distributions during bipolar PRF and to assess the possible ablative thermal effect caused by the accumulated temperature spikes, along with any possible electroporation effects caused by the electrical field. The authors also modeled the bipolar ablative mode, known as bipolar Continuous Radiofrequency (CRF), in order to compare both techniques. Results: There were important differences between CRF and PRF in terms of electrical and thermal performance. In bipolar CRF: (1) the initial temperature of the tissue impacts on temperature progress and hence on the thermal lesion dimension; and (2) at 37 °C, 6-min of bipolar CRF creates a strip thermal lesion between the electrodes when these are separated by a distance of up to 20 mm. In bipolar PRF: (1) an interelectrode distance shorter than 5 mm produces thermal damage (i.e., ablative effect) in the intervening tissue after 6 min of bipolar RF; and (2) the possible electroporation effect (electric fields higher than 150 kV m{sup −1}) would be exclusively circumscribed to a very small zone of tissue around the electrode tip. Conclusions: The results suggest that (1) the clinical parameters considered to be suitable for bipolar CRF should not necessarily be considered valid for bipolar PRF, and vice versa; and (2) the ablative effect of the CRF mode is mainly due to its much greater level of delivered energy than is the case in PRF, and therefore at same applied energy levels, CRF, and PRF are expected to result in same outcomes in terms of
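      The thermal side of such models reduces, in its simplest form, to a heat equation with a Joule source. Below is a deliberately reduced 1-D finite-difference sketch (no Pennes perfusion term; tissue properties and source strength are rough assumptions, not the paper's model).

        # Explicit FTCS solution of dT/dt = alpha * d2T/dx2 + q / (rho * c) with a
        # localized resistive heating zone and fixed body-temperature boundaries.
        import numpy as np

        k, rho, c = 0.5, 1060.0, 3600.0        # W/mK, kg/m^3, J/kgK (approx. tissue)
        alpha = k / (rho * c)
        nx, dx, dt = 101, 1e-4, 0.005          # 10 mm domain; dt under stability limit
        assert alpha * dt / dx**2 < 0.5        # FTCS stability check

        T = np.full(nx, 37.0)                  # deg C
        q = np.zeros(nx); q[45:56] = 5e6       # W/m^3, heating near the electrode
        for _ in range(int(60.0 / dt)):        # 60 s of heating
            lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
            T += dt * (alpha * lap + q / (rho * c))
            T[0] = T[-1] = 37.0                # Dirichlet boundaries
        print(f"peak tissue temperature after 60 s: {T.max():.1f} C")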

    1. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, July--September 1994

      SciTech Connect (OSTI)

      1994-12-31

      This report is Bechtel`s twelfth quarterly technical progress report and covers the period of July through September, 1994. All major tasks associated with the contract study have essentially been completed. Effort is under way in preparing various topical reports for publication. The objectives of this study are to: Develop a baseline design and two alternative designs for indirect liquefaction using advanced F-T technology. The baseline design uses Illinois No. 6 Eastern Coal and conventional refining. There is an alternative refining case using ZSM-5 treatment of the vapor stream from the slurry F-T reactor and an alternative coal case using Western coal from the Powder River Basin. Prepare the capital and operating costs for the baseline design and the alternatives. Individual plant costs for the alternative cases will be prorated on capacity, wherever possible, from the baseline case. Establish the baseline design and alternatives; evaluate baseline and alternative economics; develop engineering design criteria; develop a process flowsheet simulation (PFS) model; perform sensitivity studies using the PFS model; document the PFS model and develop a DOE training session on its use; and perform project management, technical coordination and other miscellaneous support functions. Tasks 1, 2, 3 and 5 have essentially been completed. Effort is under way in preparing topical reports for publication. During the current reporting period, work progressed on Tasks 4, 6 and 7. This report covers work done during this period and consists of four sections: Introduction and Summary; Task 4 - Process Flowsheet Simulation (PFS) Model and Conversion to ASPEN PLUS; Task 6 - Document the PFS model and develop a DOE training session on its use; and Project Management and Staffing Report.

    2. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Moving forward into the exascale era, NERSC users will place increased demands on NERSC computational facilities. Users will be facing increased complexity in the memory subsystem and node architecture. System designs and programming models will have to evolve to face these new challenges. NERSC staff are active in current initiatives addressing these challenges.

    3. Modeling Pancreatic Tumor Motion Using 4-Dimensional Computed Tomography and Surrogate Markers

      SciTech Connect (OSTI)

      Huguet, Florence; Yorke, Ellen D.; Davidson, Margaret; Zhang, Zhigang; Jackson, Andrew; Mageras, Gig S.; Wu, Abraham J.; Goodman, Karyn A.

      2015-03-01

      Purpose: To assess intrafractional positional variations of pancreatic tumors using 4-dimensional computed tomography (4D-CT), their impact on gross tumor volume (GTV) coverage, the reliability of biliary stent, fiducial seeds, and the real-time position management (RPM) external marker as tumor surrogates for setup of respiratory gated treatment, and to build a correlative model of tumor motion. Methods and Materials: We analyzed the respiration-correlated 4D-CT images acquired during simulation of 36 patients with either a biliary stent (n=16) or implanted fiducials (n=20) who were treated with RPM respiratory gated intensity modulated radiation therapy for locally advanced pancreatic cancer. Respiratory displacement relative to end-exhalation was measured for the GTV, the biliary stent, or fiducial seeds, and the RPM marker. The results were compared between the full respiratory cycle and the gating interval. A linear mixed model was used to assess the correlation of GTV motion with the potential surrogate markers. Results: The average ± SD GTV excursions were 0.3 ± 0.2 cm in the left-right direction, 0.6 ± 0.3 cm in the anterior-posterior direction, and 1.3 ± 0.7 cm in the superior-inferior direction. Gating around end-exhalation reduced GTV motion by 46% to 60%. D95% was at least the prescribed 56 Gy in 76% of patients. GTV displacement was associated with the RPM marker, the biliary stent, and the fiducial seeds. The correlation was stronger with the fiducial seeds and the biliary stent than with the RPM marker. Conclusions: Respiratory gating reduced the margin necessary for radiation therapy for pancreatic tumors. GTV motion was well correlated with biliary stent or fiducial seed displacements, validating their use as surrogates for daily assessment of GTV position during treatment. A patient-specific internal target volume based on 4D-CT is recommended for both gated and non-gated treatment; otherwise, our model can be used to predict the degree of GTV motion.
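      The correlative model referred to above is a linear mixed model; the sketch below shows that analysis pattern with the statsmodels API on synthetic data (the displacements and patient effects are simulated, not the study's measurements).

        # Linear mixed model: GTV displacement ~ surrogate-marker displacement, with
        # a random intercept per patient to absorb patient-level offsets.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        rows = []
        for patient in range(10):
            bias = rng.normal(0, 0.1)              # patient-level random intercept
            marker = rng.uniform(0.2, 1.5, 8)      # marker displacement, cm
            gtv = 0.9 * marker + bias + rng.normal(0, 0.05, 8)
            rows += [(f"p{patient}", m, g) for m, g in zip(marker, gtv)]
        df = pd.DataFrame(rows, columns=["patient", "marker", "gtv"])

        fit = smf.mixedlm("gtv ~ marker", df, groups=df["patient"]).fit()
        print(fit.summary())                       # the slope quantifies the coupling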

    4. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

      SciTech Connect (OSTI)

      Jablonowski, Christiane

      2015-07-14

      The research investigates and advances strategies to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

    5. NREL: Climate Neutral Research Campuses - Determine Baseline Energy

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Consumption. To create a climate action plan for your research campus, begin by determining current energy consumption and the resulting greenhouse gas emissions. You can then break down emissions by sector. It is important to understand the following at the beginning. The Importance of a Baseline: "The baseline inventory also provides a common data set for establishing benchmarks and priorities during the strategic planning stage and a means for

    6. Mid-Atlantic Baseline Studies Project | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Funded by the Department of Energy, along with a number of partners, the collaborative Mid-Atlantic Baseline Studies Project, led by the Biodiversity Research Institute (BRI), helps improve understanding of species composition and use of the Mid-Atlantic marine environment in order to promote more sustainable offshore wind development. This first-of-its-kind study along the Eastern Seaboard of the United States delivers

    7. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      (LBNF-DUNE) | Department of Energy. Long-Baseline Neutrino Facility / Deep Underground Neutrino Project (LBNF-DUNE). Chris Mossey, Deputy Lab Director (Fermi) and Project Director for LBNF-DUNE. March 23, 2016. Presentation (5.94 MB)

    8. 2016 Annual Technology Baseline (ATB) (Conference) | SciTech Connect

      Office of Scientific and Technical Information (OSTI)

      Conference: 2016 Annual Technology Baseline (ATB). Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information

    9. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

      SciTech Connect (OSTI)

      Womack, J.C.; Cramond, R.; Paedon, R.J.

      1995-03-13

      This document is a revision to WHC-SD-SNF-SD-002, and is issued to support the individual projects that make up the Spent Nuclear Fuel Project in the lower-tier functions, requirements, interfaces, and technical baseline items. It presents results of engineering analyses since Sept. 1994. The mission of the SNFP on the Hanford site is to provide safe, economic, environmentally sound management of Hanford SNF in a manner that stages it to final disposition. This particularly involves K Basin fuel, although other SNF is also involved.

    10. ENERGY STAR PortfolioManager Baseline Year Instructions

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Select the "Baseline Year" time frame. Select "Multiple Properties". Using filters, choose properties to include in the report. Check the box to select all filtered properties. Select these reporting ...

    11. CD-2, Approve Performance Baseline | Department of Energy

      Office of Environmental Management (EM)

      ... External Independent Review (EIR), Independent Project Review (IPR), Independent Cost Estimate (ICE) Perform a Performance Baseline External Independent Review (EIR) or an ...

    12. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

      Broader source: Energy.gov (indexed) [DOE]

      May 27, 2015 EA-1943: Draft Environmental Assessment Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois and the...

    13. Cost and Performance Comparison Baseline for Fossil Energy Power...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective: To establish baseline performance and cost estimates for today's...

    14. Updates to the International Linear Collider Damping Rings Baseline...

      Office of Scientific and Technical Information (OSTI)

      Updates to the International Linear Collider Damping Rings Baseline Design...

    15. Sandia Energy - Scaled Wind Farm Technology Facility Baselining...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scaled Wind Farm Technology Facility Baselining Project Accelerates Work

    16. South Africa - Greenhouse Gas Emission Baselines and Reduction...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    17. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    18. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2008-09-12

      The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

    19. Chile-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Kenya, Mexico, South Africa, Thailand and Vietnam), to share practices on setting national greenhouse gas emissions baseline scenarios. The aim of the workstream is to...

    20. Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios...

      Open Energy Info (EERE)

      National Greenhouse Gas Emissions Baseline Scenarios: Learning from Experiences in Developing Countries. Name: Ethiopia-National Greenhouse Gas Emissions...

    1. NREL: Energy Analysis - Annual Technology Baseline and Standard...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Annual Technology Baseline and Standard Scenarios Discussion Draft of NREL 2016 Annual ... and a diverse set of potential futures (standard scenarios) to inform electric sector ...

    2. NETL - Bituminous Baseline Performance and Cost Interactive Tool...

      Open Energy Info (EERE)

      from the Cost and Performance Baseline for Fossil Energy Plants - Bituminous Coal and Natural Gas to Electricity report. The tool provides an interactive summary of the full...

    3. River Corridor Baseline Risk Assessment (RCBRA) Human Health...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      12, 2011 River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2) * RCBRA Human Health Risk Assessment is final - Response provided to HAB ...

    4. Baselines For Land-Use Change In The Tropics: Application ToAvoided Deforestation Projects

      SciTech Connect (OSTI)

      Brown, Sandra; Hall, Myrna; Andrasko, Ken; Ruiz, Fernando; Marzoli, Walter; Guerrero, Gabriela; Masera, Omar; Dushku, Aaron; Dejong, Ben; Cornell, Joseph

      2007-06-01

      Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in the applicable pools such as vegetation, detritus, products and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models of the latter category used in the analysis at regional scale are the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Parana State in Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacan, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six) and the LUCS model the least amount of loss (four out of five). Based on simulations of GEOMOD, we found that readily observable physical and biological factors as well as distance to areas of past disturbance were each about twice as important as either sociological/demographic or economic
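      The simplest member of this hierarchy, a population-driven extrapolation of forest-area change, can be sketched in a few lines; the functional form and every number below are illustrative assumptions rather than the published FAC formulation.

        # Toy deforestation baseline: annual clearing scales with population, which
        # itself grows at a fixed rate. All values are invented for illustration.
        forest_km2 = 10000.0          # current forest area
        population = 250000.0         # current population
        pop_growth = 0.02             # assumed annual population growth rate
        per_capita_clearing = 0.002   # km^2 cleared per person per year (assumed)

        baseline = []
        for year in range(2025, 2036):
            cleared = min(forest_km2, per_capita_clearing * population)
            forest_km2 -= cleared
            population *= 1 + pop_growth
            baseline.append((year, round(forest_km2, 1)))
        print(baseline)               # projected forest area under the baseline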

    5. LIAR -- A computer program for the modeling and simulation of high performance linacs

      SciTech Connect (OSTI)

      Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

      1997-04-01

      The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straight-forward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm.

    6. MODELING STRATEGIES TO COMPUTE NATURAL CIRCULATION USING CFD IN A VHTR AFTER A LOFA

      SciTech Connect (OSTI)

      Yu-Hsin Tung; Richard W. Johnson; Ching-Chang Chieng; Yuh-Ming Ferng

      2012-11-01

      A prismatic gas-cooled very high temperature reactor (VHTR) is being developed under the next generation nuclear plant program (NGNP) of the U.S. Department of Energy, Office of Nuclear Energy. In the design of the prismatic VHTR, hexagonal shaped graphite blocks are drilled to allow insertion of fuel pins, made of compacted TRISO fuel particles, and coolant channels for the helium coolant. One of the concerns for the reactor design is the effect of a loss of flow accident (LOFA), in which the coolant circulators are lost for some reason, causing a loss of forced coolant flow through the core. In such an event, it is desired to know what happens to the (reduced) heat still being generated in the core and whether it represents a problem for the fuel compacts, the graphite core or the reactor vessel (RV) walls. One of the mechanisms for the transport of heat out of the core is the natural circulation of the coolant, which is still present. That is, how much heat may be transported by natural circulation through the core and upwards to the top of the upper plenum? It is beyond current capability for a computational fluid dynamics (CFD) analysis to perform a calculation on the whole RV with a sufficiently refined mesh to examine the full potential of natural circulation in the vessel. The present paper reports the investigation of several strategies to model the flow and heat transfer in the RV. It is found that it is necessary to employ representative geometries of the core to estimate the heat transfer. However, by taking advantage of global and local symmetries, a detailed estimate of the strength of the resulting natural circulation and the level of heat transfer to the top of the upper plenum is obtained.

    7. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

      SciTech Connect (OSTI)

      Lingerfelt, Eric J; Endeve, Eirik; Ovchinnikov, Oleg S; Borreguero Calvo, Jose M; Park, Byung H; Archibald, Richard K; Symons, Christopher T; Kalinin, Sergei V; Messer, Bronson; Shankar, Mallikarjun; Jesse, Stephen

      2016-01-01

      Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes; with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small sub-set of user cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

    8. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

      SciTech Connect (OSTI)

      M.F. Simpson; K.-R. Kim

      2010-12-01

      In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL). And Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in trouble-shooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally-light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.
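      At the core of any electrorefiner model sits Faraday's law for the metal collected at the cathode. A back-of-the-envelope sketch follows; the current, duration, and efficiency are invented, and the project's models add transport and thermodynamic detail on top of this.

        # Faraday's law: deposited mass m = CE * I * t * M / (z * F).
        F = 96485.0        # C/mol, Faraday constant
        M_U = 238.03       # g/mol, uranium
        z = 3              # U(III) -> U(0) in molten-salt electrorefining
        I, hours, eff = 100.0, 24.0, 0.90   # A, h, current efficiency (assumed)

        mass_g = eff * I * hours * 3600.0 * M_U / (z * F)
        print(f"deposited uranium: {mass_g / 1000:.2f} kg/day")   # ~6.4 kg/day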

    9. Model Baseline Fire Department/Fire Protection Engineering Assessment

      Broader source: Energy.gov [DOE]

      The purpose of the document is to comprehensively delineate and rationalize the roles and responsibilities of the Fire Department and Fire Protection (Engineering).

    10. User's guide for SAMMY: a computer model for multilevel r-matrix fits to neutron data using Bayes' equations

      SciTech Connect (OSTI)

      Larson, N. M.; Perey, F. G.

      1980-11-01

      A method is described for determining the parameters of a model from experimental data based upon the utilization of Bayes' theorem. This method has several advantages over the least-squares method as it is commonly used; one important advantage is that the assumptions under which the parameter values have been determined are more clearly evident than in many results based upon least squares. Bayes' method has been used to develop a computer code which can be utilized to analyze neutron cross-section data by means of the R-matrix theory. The required formulae from the R-matrix theory are presented, and the computer implementation of both Bayes' equations and R-matrix theory is described. Details about the computer code and complete input/output information are given.
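      For a locally linear model with Gaussian prior and data covariances, Bayes' equations of the kind described above have a closed form, sketched below on toy matrices (the sensitivities, covariances, and data are invented for illustration).

        # Bayesian update for y ~ G p: posterior covariance and parameters from a
        # prior (p_prior, M), sensitivity matrix G, and data (y, V).
        import numpy as np

        p_prior = np.array([1.0, 0.5])                        # prior parameters
        M = np.diag([0.04, 0.04])                             # prior covariance
        G = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])    # sensitivities dy/dp
        V = np.diag([0.01, 0.01, 0.01])                       # data covariance
        y = np.array([1.1, 1.7, 0.7])                         # measured values

        M_post = np.linalg.inv(np.linalg.inv(M) + G.T @ np.linalg.inv(V) @ G)
        p_post = p_prior + M_post @ G.T @ np.linalg.inv(V) @ (y - G @ p_prior)
        print("posterior parameters:", p_post)
        print("posterior covariance:\n", M_post)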

    11. Tank Waste Remediation System (TWRS) Technical Baseline Summary Description

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This revision notes the supersedure of the subject document by concurrent issuance of HNF-1901 ''Technical Baseline Summary Description for the Tank Farm Contractor'', Revision 2. Safe storage mission technical baseline information was absorbed by the new revision of HNF-1901.

    12. Tank waste remediation system technical baseline summary description

      SciTech Connect (OSTI)

      Raymond, R.E.

      1998-01-08

      This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations.

    13. Overview of Computer-Aided Engineering of Batteries and Introduction to Multi-Scale, Multi-Dimensional Modeling of Li-Ion Batteries (Presentation)

      SciTech Connect (OSTI)

      Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.; Lee, K. J.

      2012-05-01

      This 2012 Annual Merit Review presentation gives an overview of the Computer-Aided Engineering of Batteries (CAEBAT) project and introduces the Multi-Scale, Multi-Dimensional model for modeling lithium-ion batteries for electric vehicles.
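      To see what a multi-scale, multi-dimensional framework must resolve in detail, consider the crudest possible counterpart: a single lumped cell whose equivalent-circuit voltage and ohmic heating feed a thermal balance. All parameters below are invented placeholders, not CAEBAT values.

        # Lumped equivalent-circuit battery model: V = OCV - I*R, with I^2*R ohmic
        # heat driving a single-node thermal balance against the ambient.
        ocv, r_int = 3.7, 0.01          # open-circuit voltage (V), resistance (ohm)
        m, cp, hA = 1.0, 1000.0, 0.5    # mass (kg), heat capacity (J/kgK), W/K
        T, T_amb, I = 25.0, 25.0, 20.0  # deg C, discharge current (A)

        dt = 1.0
        for _ in range(1800):           # 30 minutes of constant-current discharge
            v = ocv - I * r_int         # terminal voltage
            q_gen = I**2 * r_int        # ohmic heat generation (W)
            T += dt * (q_gen - hA * (T - T_amb)) / (m * cp)
        print(f"V = {v:.2f} V, cell temperature after 30 min = {T:.1f} C")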

    14. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

      2015-08-05

      Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting metrics at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics, informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
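      The persistence baseline named above is trivial to state in code, which is precisely why it anchors skill comparisons; the time series and the "improved" forecast below are synthetic stand-ins.

        # Persistence forecast: the prediction for the next interval is the current
        # measurement. Skill of a candidate model = 1 - MAE_model / MAE_persistence.
        import numpy as np

        rng = np.random.default_rng(0)
        power = np.clip(np.sin(np.linspace(0, 20, 200))
                        + rng.normal(0, 0.1, 200), 0, None)     # synthetic PV output

        persistence = power[:-1]              # forecast for t+1 is the value at t
        actual = power[1:]
        model_fc = actual + rng.normal(0, 0.05, actual.size)    # pretend better model

        mae_p = np.abs(actual - persistence).mean()
        mae_m = np.abs(actual - model_fc).mean()
        print(f"MAE persistence = {mae_p:.3f}, model = {mae_m:.3f}, "
              f"skill = {1 - mae_m / mae_p:.2f}")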

    15. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

      1994-01-01

      This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

    16. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Mittal, Sparsh; Vetter, Jeffrey S.

      2015-04-24

      Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, caches, and main memory. In addition, we discuss techniques proposed for non-volatile memory, GPUs, and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system architects, and processor designers in gaining insights into the techniques for improving reliability of computing systems.

    17. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

      SciTech Connect (OSTI)

      Mittal, Sparsh; Vetter, Jeffrey S.

      2015-04-24

      Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, caches, and main memory. In addition, we discuss techniques proposed for non-volatile memory, GPUs, and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system architects, and processor designers in gaining insights into the techniques for improving reliability of computing systems.

    18. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

      SciTech Connect (OSTI)

      Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

      2013-11-20

      “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

    19. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

      SciTech Connect (OSTI)

      Boring, Ronald L.; Joe, Jeffrey C.

      2015-02-01

      For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

    20. Computation of Domain-Averaged Irradiance with a Simple Two-Stream Radiative Transfer Model Including Vertical Cloud Property Correlations

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computation of Domain-Averaged Irradiance with a Simple Two-Stream Radiative Transfer Model Including Vertical Cloud Property Correlations. S. Kato, Center for Atmospheric Sciences, Hampton University, Hampton, Virginia. Introduction: Recent development of remote sensing instruments by the Atmospheric Radiation Measurement (ARM) Program provides information on the spatial and temporal variability of cloud structures. However, it is not clear what cloud properties are required to express complicated cloud

    1. Coupling of Mechanical Behavior of Cell Components to Electrochemical-Thermal Models for Computer- Aided Engineering of Batteries under Abuse

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Coupling of Mechanical Behavior of Cell Components to Electrochemical-Thermal Models for Computer- Aided Engineering of Batteries under Abuse P.I.: Ahmad Pesaran Team: Tomasz Wierzbicki and Elham Sahraei (MIT) Genong Li and Lewis Collins (ANSYS) M. Sprague, G.H. Kim and S. Santhangopalan (NREL) June 17, 2014 This presentation does not contain any proprietary, confidential, or otherwise restricted information. Project ID: ES199 NREL/PR-5400-61885 2 Overview * Project Start: October 2013 * Project

    2. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      McClanahan, Richard; De Leon, Phillip L.

      2014-08-20

      The majority of state-of-the-art speaker recognition (SR) systems utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of the identified systems, the posterior probability and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM-UBM which uses Runnalls' Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we reduce this computation by a factor of 15× while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.
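
      A toy sketch of the coarse-to-fine idea is given below; it uses a fixed grouping of components instead of Runnalls' reduction, with invented dimensions, so it only shows the shape of the computation saving, not the paper's method.

        import numpy as np

        rng = np.random.default_rng(1)
        D, M = 10, 256                        # feature dim, UBM components (toy sizes)
        means = rng.normal(size=(M, D))
        var = np.ones((M, D))                 # diagonal unit covariances for brevity
        log_w = np.full(M, -np.log(M))        # uniform mixture weights

        def log_gauss(x, mu, v):
            return -0.5 * np.sum((x - mu) ** 2 / v + np.log(2 * np.pi * v), axis=-1)

        # Coarse layer: group fine components and represent each group by the
        # mean of its members (a stand-in for a proper mixture reduction).
        K = 16
        labels = np.arange(M) % K             # toy grouping; real code would cluster
        coarse = np.array([means[labels == k].mean(axis=0) for k in range(K)])

        x = rng.normal(size=D)                # one test frame

        full = log_w + log_gauss(x, means, var)            # score all M components

        top = np.argsort(log_gauss(x, coarse, np.ones(D)))[-3:]   # best coarse groups
        mask = np.isin(labels, top)
        approx = log_w[mask] + log_gauss(x, means[mask], var[mask])

        print("components evaluated:", int(mask.sum()), "of", M)
        print("best log-score, full vs. hashed:", full.max(), approx.max())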

    3. CORCON-MOD3: An integrated computer model for analysis of molten core-concrete interactions. User's manual

      SciTech Connect (OSTI)

      Bradley, D.R.; Gardner, D.R.; Brockmann, J.E.; Griffith, R.O.

      1993-10-01

      The CORCON-Mod3 computer code was developed to mechanistically model the important core-concrete interaction phenomena, including those phenomena relevant to the assessment of containment failure and radionuclide release. The code can be applied to a wide range of severe accident scenarios and reactor plants. The code represents the current state of the art for simulating core debris interactions with concrete. This document comprises the user's manual and gives a brief description of the models and the assumptions and limitations in the code. Also discussed are the input parameters and the code output. Two sample problems are also given.

    4. Computational Combustion

      SciTech Connect (OSTI)

      Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

      2004-08-26

      Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

    5. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

      2015-11-10

      Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

    6. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

      2015-11-10

      Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

    7. A non-CFD modeling system for computing 3D wind and concentration fields in urban environments

      SciTech Connect (OSTI)

      Nelson, Matthew A; Brown, Michael J; Williams, Michael D; Gowardhan, Akshay; Pardyjak, Eric R

      2010-01-01

      The Quick Urban & Industrial Complex (QUIC) Dispersion Modeling System has been developed to rapidly compute the transport and dispersion of toxic agent releases in the vicinity of buildings. It is composed of an empirical-diagnostic wind solver, an 'urbanized' Lagrangian random-walk model, and a graphical user interface. The code has been used for homeland security and environmental air pollution applications. In this paper, we discuss the wind solver methodology and improvements made to the original Roeckle schemes in order to better capture flow fields in dense built-up areas. The mode1-computed wind and concentration fields are then compared to measurements from several field experiments. Improvements to the QUIC Dispersion Modeling System have been made to account for the inhomogeneous and complex building layouts found in large cities. The logic that has been introduced into the code is described and comparisons of model output to full-scale outdoor urban measurements in Oklahoma City and New York City are given. Although far from perfect, the model agreed fairly well with measurements and in many cases performed equally to CFD codes.

    8. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

      SciTech Connect (OSTI)

      Kersaudy, Pierric; Sudret, Bruno; Varsier, Nadège; Picon, Odile; Wiart, Joe

      2015-04-01

      In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. The leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performances of the LARS-Kriging-PC are compared to an ordinary Kriging model and to a classical sparse polynomial chaos expansion. The LARS-Kriging-PC appears to have better performances than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.
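
      A loose scikit-learn sketch of the two-stage idea follows. It substitutes ordinary polynomial features for chaos polynomials and fits the Gaussian process to the trend residual rather than estimating trend and covariance jointly, so it is a simplification of the paper's LARS-Kriging-PC scheme; all data and sizes are invented.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import Lars
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        # Toy 2-D stand-in for an expensive dosimetry simulation.
        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(40, 2))
        y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=40)

        # Stage 1: sparse polynomial trend selected by least-angle regression.
        poly = PolynomialFeatures(degree=4, include_bias=False)
        P = poly.fit_transform(X)
        lars = Lars(n_nonzero_coefs=6).fit(P, y)
        print("polynomials retained:", np.count_nonzero(lars.coef_), "of", P.shape[1])

        # Stage 2: Kriging (a Gaussian process) on the residual of the trend.
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-6)
        gp.fit(X, y - lars.predict(P))

        X_new = rng.uniform(-1, 1, size=(5, 2))
        y_new = lars.predict(poly.transform(X_new)) + gp.predict(X_new)
        print("surrogate predictions:", y_new)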

    9. Flow field computation of the NREL S809 airfoil using various turbulence models

      SciTech Connect (OSTI)

      Chang, Y.L.; Yang, S.L.; Arici, O. [Michigan Technological Univ., Houghton, MI (United States). Mechanical Engineering-Engineering Mechanics Dept.]

      1996-10-01

      Performance comparison of three popular turbulence models, namely the Baldwin-Lomax algebraic model, Chien's low-Reynolds-number κ-ε model, and Wilcox's low-Reynolds-number κ-ω model, is given. These models were applied to calculate the flow field around the National Renewable Energy Laboratory S809 airfoil using a Total Variation Diminishing scheme. Numerical results of C_P, C_L, and C_D are presented along with the Delft experimental data. It is shown that all three models perform well for attached flow, i.e., no flow separation at low angles of attack. However, at high angles of attack with flow separation, convergence characteristics show that Wilcox's model outperforms the other models. Results of this study will be used to guide the authors in their dynamic stall research.

    10. Application of high performance computing to automotive design and manufacturing: Composite materials modeling task technical manual for constitutive models for glass fiber-polymer matrix composites

      SciTech Connect (OSTI)

      Simunovic, S; Zacharia, T

      1997-11-01

      This report provides a theoretical background for three constitutive models for a continuous strand mat (CSM) glass fiber-thermoset polymer matrix composite. The models were developed during fiscal years 1994 through 1997 as a part of the Cooperative Research and Development Agreement, "Application of High-Performance Computing to Automotive Design and Manufacturing." The full derivation of the constitutive relations is presented in the framework of the continuum program DYNA3D, and the models have been used for the simulation and impact analysis of CSM composite tubes. The analysis of simulation and experimental results shows that the model based on the strain tensor split yields the most accurate results of the three implemented models. The parameters used in the models and their derivation from the physical tests are documented.

    11. Neutrino Oscillation Parameter Sensitivity in Future Long-Baseline Experiments

      SciTech Connect (OSTI)

      Bass, Matthew

      2014-01-01

      The study of neutrino interactions and propagation has produced evidence for physics beyond the standard model and promises to continue to shed light on rare phenomena. Since the discovery of neutrino oscillations in the late 1990s there have been rapid advances in establishing the three flavor paradigm of neutrino oscillations. The 2012 discovery of a large value for the last unmeasured mixing angle has opened the way for future experiments to search for charge-parity symmetry violation in the lepton sector. This thesis presents an analysis of the future sensitivity to neutrino oscillations in the three flavor paradigm for the T2K, NOvA, LBNE, and T2HK experiments. The theory of the three flavor paradigm is explained and the methods to use these theoretical predictions to design long baseline neutrino experiments are described. The sensitivity to the oscillation parameters for each experiment is presented with a particular focus on the search for CP violation and the measurement of the neutrino mass hierarchy. The variations of these sensitivities with statistical considerations and experimental design optimizations taken into account are explored. The effects of systematic uncertainties in the neutrino flux, interaction, and detection predictions are also considered by incorporating more advanced simulation inputs from the LBNE experiment.

    12. Future Long-Baseline Neutrino Oscillations: View from North America

      SciTech Connect (OSTI)

      Wilson, R. J.

      2015-06-01

      In late 2012 the US Department of Energy gave approval for the first phase of the Long-Baseline Neutrino Experiment (LBNE), that will conduct a broad scientific program including neutrino oscillations, neutrino scattering physics, search for baryon violation, supernova burst neutrinos and other related astrophysical phenomena. The project is now being reformulated as an international facility hosted by the United States. The facility will consist of an intense neutrino beam produced at Fermi National Accelerator Laboratory (Fermilab), a highly capable set of neutrino detectors on the Fermilab campus, and a large underground liquid argon time projection chamber at Sanford Underground Research Facility (SURF) in South Dakota 1300 km from Fermilab. With an intense beam and massive far detector, the experimental program at the facility will make detailed studies of neutrino oscillations, including measurements of the neutrino mass hierarchy and Charge-Parity symmetry violation, by measuring neutrino and anti-neutrino mixing separately. At the near site, the high-statistics neutrino scattering data will allow for many cross section measurements and precision tests of the Standard Model. This presentation will describe the configuration developed by the LBNE collaboration, the broad physics program, and the status of the formation of the international facility.

    13. Development of a three-phase reacting flow computer model for analysis of petroleum cracking

      SciTech Connect (OSTI)

      Chang, S.L.; Lottes, S.A.; Petrick, M.

      1995-07-01

      A general computational fluid dynamics computer code (ICRKFLO) has been developed for the simulation of the multi-phase reacting flow in a petroleum fluid catalytic cracker riser. ICRKFLO has several unique features. A new integral reaction submodel couples calculations of hydrodynamics and cracking kinetics by making the calculations more efficient in achieving stable convergence while still preserving the major physical effects of reaction processes. A new coke transport submodel handles the process of coke formation in gas phase reactions and the subsequent deposition on the surface of adjacent particles. The code was validated by comparing with experimental results of a pilot scale fluid cracker unit. The code can predict the flow characteristics of gas, liquid, and particulate solid phases, vaporization of the oil droplets, and subsequent cracking of the oil in a riser reactor, which may lead to a better understanding of the internal processes of the riser and the impact of riser geometry and operating parameters on the riser performance.

    14. CFD [computational fluid dynamics] And Safety Factors. Computer modeling of complex processes needs old-fashioned experiments to stay in touch with reality.

      SciTech Connect (OSTI)

      Leishear, Robert A.; Lee, Si Y.; Poirier, Michael R.; Steeper, Timothy J.; Ervin, Robert C.; Giddings, Billy J.; Stefanko, David B.; Harp, Keith D.; Fowley, Mark D.; Van Pelt, William B.

      2012-10-07

      Computational fluid dynamics (CFD) is recognized as a powerful engineering tool. That is, CFD has advanced over the years to the point where it can now give us deep insight into the analysis of very complex processes. There is a danger, though, that an engineer can place too much confidence in a simulation. If a user is not careful, it is easy to believe that if you plug in the numbers, the answer comes out, and you are done. This assumption can lead to significant errors. As we discovered in the course of a study on behalf of the Department of Energy's Savannah River Site in South Carolina, CFD models fail to capture some of the large variations inherent in complex processes. These variations, or scatter, in experimental data emerge from physical tests and are inadequately captured or expressed by calculated mean values for a process. This anomaly between experiment and theory can lead to serious errors in engineering analysis and design unless a correction factor, or safety factor, is experimentally validated. For this study, blending times for the mixing of salt solutions in large storage tanks were the process of concern under investigation. This study focused on the blending processes needed to mix salt solutions to ensure homogeneity within waste tanks, where homogeneity is required to control radioactivity levels during subsequent processing. Two of the requirements for this task were to determine the minimum number of submerged, centrifugal pumps required to blend the salt mixtures in a full-scale tank in half a day or less, and to recommend reasonable blending times to achieve nearly homogeneous salt mixtures. A full-scale, low-flow pump with a total discharge flow rate of 500 to 800 gpm was recommended with two opposing 2.27-inch diameter nozzles. To make this recommendation, both experimental and CFD modeling were performed. Lab researchers found that, although CFD provided good estimates of an average blending time, experimental blending times varied
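
      The quantitative takeaway can be made concrete with a small, purely illustrative calculation (the numbers below are invented, not from the study): a safety factor can be taken as the worst observed blending time relative to the CFD-predicted mean.

        import numpy as np

        # Invented blending times (hours) from repeated physical tests,
        # plus a CFD-predicted mean for the same configuration.
        experiments = np.array([6.1, 7.9, 5.4, 9.8, 7.2, 8.5])
        cfd_mean = 6.8

        # Fold experimental scatter into a design margin: worst case vs. model mean.
        safety_factor = experiments.max() / cfd_mean
        print(f"experimentally validated safety factor: {safety_factor:.2f}")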

    15. OSTIblog Articles in the Long Baseline Neutrino Experiment Topic...

      Office of Scientific and Technical Information (OSTI)

      Long Baseline Neutrino Experiment Topic Mining for Gold, Neutrinos and the Neutrinoless ... The site of the former Homestake Mine was once one of the largest and deepest gold mines ...

    16. Cost and Performance Baseline for Fossil Energy Plants Volume...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      www.netl.doe.gov ... Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2): Table of Contents, List of Exhibits ...

    17. Biodiversity Research Institute Mid-Atlantic Baseline Study Webinar

      Broader source: Energy.gov [DOE]

      Carried out by the Biodiversity Research Institute (BRI) and funded by the U.S. Department of Energy Wind and Water Power Technology Office and other partners, the goal of the Mid-Atlantic Baseline...

    18. U.S Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2011-09-23

      This guide identifies key Performance Baseline (PB) elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development.

    19. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

      SciTech Connect (OSTI)

      Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

      2015-01-01

      The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

    20. COMPARATIVE COMPUTATIONAL MODELING OF AIRFLOWS AND VAPOR DOSIMETRY IN THE RESPIRATORY TRACTS OF RAT, MONKEY, AND HUMAN

      SciTech Connect (OSTI)

      Corley, Richard A.; Kabilan, Senthil; Kuprat, Andrew P.; Carson, James P.; Minard, Kevin R.; Jacob, Rick E.; Timchalk, Charles; Glenny, Robb W.; Pipavath, Sudhaker; Cox, Timothy C.; Wallis, Chris; Larson, Richard; Fanucchi, M.; Postlewait, Ed; Einstein, Daniel R.

      2012-07-01

      Coupling computational fluid dynamics (CFD) with physiologically based pharmacokinetic (PBPK) models is useful for predicting site-specific dosimetry of airborne materials in the respiratory tract and elucidating the importance of species differences in anatomy, physiology, and breathing patterns. Historically, these models were limited to discrete regions of the respiratory system. CFD/PBPK models have now been developed for the rat, monkey, and human that encompass airways from the nose or mouth to the lung. A PBPK model previously developed to describe acrolein uptake in nasal tissues was adapted to the extended airway models as an example application. Model parameters for each anatomic region were obtained from the literature, measured directly, or estimated from published data. Airflow and site-specific acrolein uptake patterns were determined under steady-state inhalation conditions to provide direct comparisons with prior data and nasal-only simulations. Results confirmed that regional uptake was dependent upon airflow rates and acrolein concentrations, with nasal extraction efficiencies predicted to be greatest in the rat, followed by the monkey, then the human. For human oral-breathing simulations, acrolein uptake rates in oropharyngeal and laryngeal tissues were comparable to nasal tissues following nasal breathing under the same exposure conditions. For both breathing modes, higher uptake rates were predicted for lower tracheo-bronchial tissues of humans than either the rat or monkey. These extended airway models provide a unique foundation for comparing dosimetry across a significantly more extensive range of conducting airways in the rat, monkey, and human than prior CFD models.

    1. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

      SciTech Connect (OSTI)

      Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen

      2012-08-09

      Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

    2. Modeling

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Sandia Will Host PV Bankability Workshop at Solar Power International (SPI) 2013 ...

    3. Modeling

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      "... Science and Actuarial Practice" ... New Project Is the ACME of Computer Science to Address Climate Change ...

    4. Computing Frontier: Distributed Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Frontier: Distributed Computing and Facility Infrastructures. Conveners: Kenneth Bloom (Department of Physics and Astronomy, University of Nebraska-Lincoln) and Richard Gerber (National Energy Research Scientific Computing Center (NERSC), Lawrence Berkeley National Laboratory). 1.1 Introduction: The field of particle physics has become increasingly reliant on large-scale computing resources to address the challenges of analyzing large datasets, completing specialized computations and

    5. developing-compute-efficient

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Developing Compute-efficient, Quality Models with LS-PrePost 3 on the TRACC Cluster Oct. ... with an emphasis on applying these capabilities to build computationally efficient models. ...

    6. Computational modeling of electrostatic charge and fields produced by hypervelocity impact

      SciTech Connect (OSTI)

      Crawford, David A.

      2015-05-19

      Following prior experimental evidence of electrostatic charge separation, electric and magnetic fields produced by hypervelocity impact, we have developed a model of electrostatic charge separation based on plasma sheath theory and implemented it into the CTH shock physics code. Preliminary assessment of the model shows good qualitative and quantitative agreement between the model and prior experiments at least in the hypervelocity regime for the porous carbonate material tested. The model agrees with the scaling analysis of experimental data performed in the prior work, suggesting that electric charge separation and the resulting electric and magnetic fields can be a substantial effect at larger scales, higher impact velocities, or both.

    7. Protein superfamily members as targets for computer modeling: The carbohydrate recognition domain of a macrophage lectin

      SciTech Connect (OSTI)

      Stenkamp, R.E.; Aruffo, A.; Bajorath, J.

      1996-12-31

      Members of protein superfamilies display similar folds, but share only limited sequence identity, often 25% or less. Thus, it is not straightforward to apply standard homology modeling methods to construct reliable three-dimensional models of such proteins. A three-dimensional model of the carbohydrate recognition domain of the rat macrophage lectin, a member of the calcium-dependent (C-type) lectin superfamily, has been generated to illustrate how information provided by comparison of X-ray structures and sequence-structure alignments can aid in comparative modeling when primary sequence similarities are low. 20 refs., 4 figs.

    8. Computational model, method, and system for kinetically-tailoring multi-drug chemotherapy for individuals

      DOE Patents [OSTI]

      Gardner, Shea Nicole

      2007-10-23

      A method and system for tailoring treatment regimens to individual patients with diseased cells exhibiting evolution of resistance to such treatments. A mathematical model is provided which models rates of population change of proliferating and quiescent diseased cells using cell kinetics and evolution of resistance of the diseased cells, and pharmacokinetic and pharmacodynamic models. Cell kinetic parameters are obtained from an individual patient and applied to the mathematical model to solve for a plurality of treatment regimens, each having a quantitative efficacy value associated therewith. A treatment regimen may then be selected from the plurality of treatment options based on the efficacy value.
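
      The patent's population model is not reproduced here, but a generic proliferating/quiescent two-compartment ODE with a decaying drug term, sketched below in Python, shows the kind of kinetics such a system solves; every rate and dose value is hypothetical.

        import numpy as np
        from scipy.integrate import solve_ivp

        growth, to_q, to_p = 0.04, 0.02, 0.01      # per-day rates (hypothetical)
        kill = 0.5                                 # drug kill rate on cycling cells

        def drug(t):
            # Toy pharmacokinetics: exponential decay after doses on days 0, 7, 14.
            return sum(np.exp(-(t - d)) for d in (0, 7, 14) if t >= d)

        def rhs(t, y):
            p, q = y                               # proliferating, quiescent cells
            dp = growth * p + to_p * q - to_q * p - kill * drug(t) * p
            dq = to_q * p - to_p * q
            return [dp, dq]

        sol = solve_ivp(rhs, (0, 21), [1e9, 1e8])
        print("cell counts at day 21:", sol.y[:, -1])

      Ranking candidate regimens by the final tumor burden from runs like this mirrors the per-regimen efficacy values the patent describes.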

    9. Multigroup computation of the temperature-dependent Resonance Scattering Model (RSM) and its implementation

      SciTech Connect (OSTI)

      Ghrayeb, S. Z.; Ouisloumen, M.; Ougouag, A. M.; Ivanov, K. N.

      2012-07-01

      A multi-group formulation for the exact neutron elastic scattering kernel is developed. This formulation is intended for implementation into a lattice physics code. The correct accounting for the crystal lattice effects influences the estimated values for the probability of neutron absorption and scattering, which in turn affect the estimation of core reactivity and burnup characteristics. A computer program has been written to test the formulation for various nuclides. Results of the multi-group code have been verified against the correct analytic scattering kernel. In both cases neutrons were started at various energies and temperatures and the corresponding scattering kernels were tallied. (authors)

    10. Nuclear Engineering Computer Models for In-Core Fuel Management Analysis.

      Energy Science and Technology Software Center (OSTI)

      1992-06-12

      Version 00 VPI-NECM is a nuclear engineering computer system of modules for in-core fuel management analysis. The system consists of 6 independent programs designed to calculate: (1) FARCON - neutron slowing down and epithermal group constants, (2) SLOCON - thermal neutron spectrum and group constants, (3) DISFAC - slow neutron disadvantage factors, (4) ODOG - solution of a one group neutron diffusion equation, (5) ODMUG - three group criticality problem, (6) FUELBURN - fuel burnup in slow neutron fission reactors.

    11. TIS: an Intelligent Gateway Computer for information and modeling networks. Overview

      SciTech Connect (OSTI)

      Hampel, V.E.; Bailey, C.; Kawin, R.A.; Lann, N.A.; McGrogan, S.K.; Scott, W.S.; Stammers, S.M.; Thomas, J.L.

      1983-08-01

      The Technology Information System (TIS) is being used to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or elsewhere, by means of a master directory and automated access procedures. Other geographically distributed information centers accessible through TIS include federal and commercial systems like DOE/RECON, NASA/RECON, DOD/DROLS, DOT/TIC, CIS, and DIALOG in the United States, the chemical information systems DARC in France, and DECHEMA in West Germany. New centers are added as required.

    12. Predicting oropharyngeal tumor volume throughout the course of radiation therapy from pretreatment computed tomography data using general linear models

      SciTech Connect (OSTI)

      Yock, Adam D.; Kudchadker, Rajat J.; Rao, Arvind; Dong, Lei; Beadle, Beth M.; Garden, Adam S.; Court, Laurence E.

      2014-05-15

      Purpose: The purpose of this work was to develop and evaluate the accuracy of several predictive models of variation in tumor volume throughout the course of radiation therapy. Methods: Nineteen patients with oropharyngeal cancers were imaged daily with CT-on-rails for image-guided alignment per an institutional protocol. The daily volumes of 35 tumors in these 19 patients were determined and used to generate (1) a linear model in which tumor volume changed at a constant rate, (2) a general linear model that utilized the power fit relationship between the daily and initial tumor volumes, and (3) a functional general linear model that identified and exploited the primary modes of variation between time series describing the changing tumor volumes. Primary and nodal tumor volumes were examined separately. The accuracy of these models in predicting daily tumor volumes were compared with those of static and linear reference models using leave-one-out cross-validation. Results: In predicting the daily volume of primary tumors, the general linear model and the functional general linear model were more accurate than the static reference model by 9.9% (range: −11.6%–23.8%) and 14.6% (range: −7.3%–27.5%), respectively, and were more accurate than the linear reference model by 14.2% (range: −6.8%–40.3%) and 13.1% (range: −1.5%–52.5%), respectively. In predicting the daily volume of nodal tumors, only the 14.4% (range: −11.1%–20.5%) improvement in accuracy of the functional general linear model compared to the static reference model was statistically significant. Conclusions: A general linear model and a functional general linear model trained on data from a small population of patients can predict the primary tumor volume throughout the course of radiation therapy with greater accuracy than standard reference models. These more accurate models may increase the prognostic value of information about the tumor garnered from pretreatment computed tomography
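
      One hedged reading of the power-fit model, with entirely synthetic volumes, can be sketched as a curve fit of daily volume against treatment day scaled by the initial volume:

        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic daily tumor volumes over 30 treatment days.
        days = np.arange(1, 31)
        v0 = 20.0                                   # initial volume (cc)
        rng = np.random.default_rng(3)
        volumes = v0 * days ** -0.15 * (1 + 0.03 * rng.normal(size=30))

        def model(d, a, b):                         # V(d) = V0 * a * d**b
            return v0 * a * d ** b

        (a, b), _ = curve_fit(model, days, volumes, p0=(1.0, -0.1))
        print(f"fit: a={a:.3f}, b={b:.3f}; day-30 prediction={model(30, a, b):.2f} cc")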

    13. A computational model for three-dimensional jointed media with a single joint set; Yucca Mountain Site Characterization Project

      SciTech Connect (OSTI)

      Koteras, J.R.

      1994-02-01

      This report describes a three-dimensional model for jointed rock or other media with a single set of joints. The joint set consists of evenly spaced joint planes. The normal joint response is nonlinear elastic and is based on a rational polynomial. The joint shear response is linear elastic in shear stress versus slip displacement until a critical stress level, governed by a Mohr-Coulomb friction criterion, is attained. The three-dimensional model represents an extension of a two-dimensional, multi-joint model that has been in use for several years. Although most of the concepts in the two-dimensional model translate in a straightforward manner to three dimensions, the concept of slip on the joint planes becomes more complex in three dimensions. While slip in two dimensions can be treated as a scalar quantity, it must be treated as a vector in the joint plane in three dimensions. For the three-dimensional model proposed here, the slip direction is assumed to be the direction of maximum principal strain in the joint plane. Five test problems are presented to verify the correctness of the computational implementation of the model.
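
      The shear treatment described above reduces to a one-line cap on an elastic stress, as in this hypothetical helper (all stiffness and strength values are invented):

        import numpy as np

        def joint_shear(slip, k_s, sigma_n, cohesion, phi_deg):
            """Elastic joint shear stress capped by a Mohr-Coulomb criterion."""
            tau_elastic = k_s * slip                                    # linear elastic part
            tau_max = cohesion + sigma_n * np.tan(np.radians(phi_deg))  # critical stress level
            return np.clip(tau_elastic, -tau_max, tau_max)

        # Stress grows linearly with slip, then saturates at the critical value.
        for s in (0.0, 1e-4, 5e-4, 1e-2):
            print(s, joint_shear(s, k_s=5e9, sigma_n=2e6, cohesion=1e5, phi_deg=30.0))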

    14. Climate Models: Rob Jacob | Argonne National Laboratory

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    15. Computational modeling of electrostatic charge and fields produced by hypervelocity impact

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Crawford, David A.

      2015-05-19

      Following prior experimental evidence of electrostatic charge separation, electric and magnetic fields produced by hypervelocity impact, we have developed a model of electrostatic charge separation based on plasma sheath theory and implemented it into the CTH shock physics code. Preliminary assessment of the model shows good qualitative and quantitative agreement between the model and prior experiments at least in the hypervelocity regime for the porous carbonate material tested. The model agrees with the scaling analysis of experimental data performed in the prior work, suggesting that electric charge separation and the resulting electric and magnetic fields can be a substantial effect at larger scales, higher impact velocities, or both.

    16. Use of model calibration to achieve high accuracy in analysis of computer networks

      DOE Patents [OSTI]

      Frogner, Bjorn; Guarro, Sergio; Scharf, Guy

      2004-05-11

      A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
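
      As a toy illustration of calibrating a delay model to measured load (the patent's statistical machinery is richer), one can fit a simple queueing-style latency curve to utilization/delay samples; the data and the curve form here are assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic (utilization, delay) measurements for one network path.
        util = np.array([0.1, 0.3, 0.5, 0.7, 0.85])
        delay_ms = np.array([1.1, 1.5, 2.1, 3.6, 7.2])

        def delay_model(u, d0):                 # M/M/1-style curve d(u) = d0 / (1 - u)
            return d0 / (1.0 - u)

        (d0,), _ = curve_fit(delay_model, util, delay_ms)
        print(f"calibrated base delay: {d0:.2f} ms;"
              f" predicted at 90% load: {delay_model(0.9, d0):.1f} ms")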

    17. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRACC Research: Computational Fluid Dynamics. Overview of CFD: Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical

    18. Multiproject baselines for evaluation of electric power projects

      SciTech Connect (OSTI)

      Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

      2003-03-12

      Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
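
      The arithmetic at the heart of a multiproject baseline is a generation-weighted average emissions rate over the plants admitted to the baseline; the sketch below uses an invented plant mix (the hard part, as the paper stresses, is the selection and stringency choices, not the average itself).

        # Invented plant mix for a hypothetical grid.
        plants = [
            # (name, annual generation in GWh, emissions rate in kg C/kWh)
            ("coal-1",  5000, 0.26),
            ("gas-cc",  3000, 0.10),
            ("hydro-1", 2000, 0.00),
        ]

        total_gwh = sum(g for _, g, _ in plants)
        baseline_rate = sum(g * r for _, g, r in plants) / total_gwh
        print(f"multiproject baseline rate: {baseline_rate:.3f} kg C/kWh")

      A project displacing this toy mix would be credited against the resulting 0.16 kg C/kWh baseline rate.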

    19. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

      SciTech Connect (OSTI)

      Not Available

      1992-09-01

      The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbon (C3/C4) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing, and in coping with the low H2/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

    20. Fort Drum integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

      1992-12-01

      The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

    1. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

      1993-02-01

      The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

    2. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z

      SciTech Connect (OSTI)

      Jennings, Christopher A.; Ampleford, David J.; Lamppa, Derek C.; Hansen, Stephanie B.; Jones, Brent Manley; Harvey-Thompson, Adam James; Jobe, Marc Ronald Lee; Reneker, Joseph; Rochau, Gregory A.; Cuneo, Michael Edward; Strizic, T.

      2015-05-18

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Furthermore, guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    3. Computational modeling of structure of metal matrix composite in centrifugal casting process

      SciTech Connect (OSTI)

      Zagorski, Roman [Department of Electrotechnology, Faculty of Materials Science and Metallurgy, Silesian University of Technology, ul. Krasinskiego 8, 40-019, Katowice (Poland)

      2007-04-07

      The structure of an alumina matrix composite reinforced with crystalline particles obtained during a centrifugal casting process is studied. Several parameters of the casting process that influence the composite structure are examined, including pouring temperature, mould temperature, rotating speed, and the size of the casting mould. Segregation of the crystalline particles, which depends on further factors such as the density difference between the liquid matrix and the reinforcement, the thermal processes connected with solidification of the cast, and processes leading to changes in the physical and structural properties of the liquid composite, is also investigated. All simulations are carried out with the CFD program FLUENT, using its two-phase free-surface (air and matrix) unsteady flow model (volume of fluid, VOF) and discrete phase model (DPM).

    4. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Jennings, Christopher A.; Ampleford, David J.; Lamppa, Derek C.; Hansen, Stephanie B.; Jones, Brent Manley; Harvey-Thompson, Adam James; Jobe, Marc Ronald Lee; Reneker, Joseph; Rochau, Gregory A.; Cuneo, Michael Edward; et al

      2015-05-18

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Furthermore, guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    5. Computational modeling of Krypton gas puffs with tailored mass density profiles on Z

      SciTech Connect (OSTI)

      Jennings, C. A.; Ampleford, D. J.; Lamppa, D. C.; Hansen, S. B.; Jones, B.; Harvey-Thompson, A. J.; Jobe, M.; Strizic, T.; Reneker, J.; Rochau, G. A.; Cuneo, M. E.

      2015-05-15

      Large diameter multi-shell gas puffs rapidly imploded by high current (~20 MA, ~100 ns) on the Z generator of Sandia National Laboratories are able to produce high-intensity Krypton K-shell emission at ~13 keV. Efficiently radiating at these high photon energies is a significant challenge which requires the careful design and optimization of the gas distribution. To facilitate this, we hydrodynamically model the gas flow out of the nozzle and then model its implosion using a 3-dimensional resistive, radiative MHD code (GORGON). This approach enables us to iterate between modeling the implosion and gas flow from the nozzle to optimize radiative output from this combined system. Guided by our implosion calculations, we have designed gas profiles that help mitigate disruption from Magneto-Rayleigh–Taylor implosion instabilities, while preserving sufficient kinetic energy to thermalize to the high temperatures required for K-shell emission.

    6. Orbital-selective Mott phases of a one-dimensional three-orbital Hubbard model studied using computational techniques

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Liu, Guangkun; Kaushal, Nitin; Liu, Shaozhi; Bishop, Christopher B.; Wang, Yan; Johnston, Steve; Alvarez, Gonzalo; Moreo, Adriana; Dagotto, Elbio R.

      2016-06-24

      A recently introduced one-dimensional three-orbital Hubbard model displays orbital-selective Mott phases with exotic spin arrangements such as spin block states [J. Rincón et al., Phys. Rev. Lett. 112, 106405 (2014)]. In this paper we show that the constrained-path quantum Monte Carlo (CPQMC) technique can accurately reproduce the phase diagram of this multiorbital one-dimensional model, paving the way to future CPQMC studies in systems with more challenging geometries, such as ladders and planes. The success of this approach relies on using the Hartree-Fock technique to prepare the trial states needed in CPQMC. In addition, we study a simplified version of the model where the pair-hopping term is neglected and the Hund coupling is restricted to its Ising component. The corresponding phase diagrams are shown to be only mildly affected by the absence of these technically difficult-to-implement terms. This is confirmed by additional density matrix renormalization group and determinant quantum Monte Carlo calculations carried out for the same simplified model, with the latter displaying only mild fermion sign problems. Lastly, we conclude that these methods are able to capture quantitatively the rich physics of the several orbital-selective Mott phases (OSMP) displayed by this model, thus enabling computational studies of the OSMP regime in higher dimensions, beyond static or dynamic mean-field approximations.
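
      For orientation, a standard form of the one-dimensional multiorbital Hubbard Hamiltonian studied in this literature is sketched below (hopping t, intraorbital repulsion U, interorbital repulsion U', Hund coupling J, crystal-field splittings Δγ); the precise parameter values and conventions are those of the cited papers. The simplified variant described in the abstract drops the pair-hopping term (the P operators) and keeps only the Ising part of the Hund exchange.

      ```latex
      H = -\sum_{i,\sigma,\gamma,\gamma'} t_{\gamma\gamma'}
            \left( c^{\dagger}_{i\sigma\gamma} c_{i+1,\sigma\gamma'} + \mathrm{h.c.} \right)
          + U \sum_{i,\gamma} n_{i\gamma\uparrow} n_{i\gamma\downarrow}
          + \Bigl( U' - \frac{J}{2} \Bigr) \sum_{i,\gamma<\gamma'} n_{i\gamma} n_{i\gamma'}
          - 2J \sum_{i,\gamma<\gamma'} \mathbf{S}_{i\gamma} \cdot \mathbf{S}_{i\gamma'}
          + J \sum_{i,\gamma<\gamma'} \bigl( P^{\dagger}_{i\gamma} P_{i\gamma'} + \mathrm{h.c.} \bigr)
          + \sum_{i,\gamma} \Delta_{\gamma}\, n_{i\gamma}
      ```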

    7. Computational Modeling of Fluid Flow through a Fracture in Permeable Rock

      SciTech Connect (OSTI)

      Crandall, Dustin; Ahmadi, Goodarz; Smith, Duane H

      2010-01-01

      Laminar, single-phase, finite-volume solutions to the Navier–Stokes equations of fluid flow through a fracture within permeable media have been obtained. The fracture geometry was acquired from computed tomography scans of a fracture in Berea sandstone, capturing the small-scale roughness of these natural fluid conduits. First, the roughness of the two-dimensional fracture profiles was analyzed and shown to be similar to Brownian fractal structures. The permeability and tortuosity of each fracture profile were determined from simulations of fluid flow through these geometries with impermeable fracture walls. A surrounding permeable medium, assumed to obey Darcy’s Law with permeabilities from 0.2 to 2,000 millidarcies, was then included in the analysis. A series of simulations for flows in fractured permeable rocks was performed, and the results were used to develop a relationship between the flow rate and pressure loss for fractures in porous rocks. The resulting friction factor, which accounts for the fracture geometric properties, is similar to the cubic law; it has the potential to be of use in discrete fracture reservoir-scale simulations of fluid flow through highly fractured geologic formations with appreciable matrix permeability. The observed fluid flow from the surrounding permeable medium to the fracture was significant when the resistance within the fracture and the medium were of the same order. An increase of more than 5% in the volumetric flow rate within the fracture profile was observed for flows within high-permeability fractured porous media.
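
      For reference, the parallel-plate "cubic law" that the derived friction factor generalizes relates flow rate to the cube of the fracture aperture. A minimal sketch follows; the aperture, width, viscosity, and pressure-gradient values are illustrative assumptions, not values from the study.

      ```python
      # Cubic law for laminar flow between smooth parallel plates:
      #   Q = (w * b**3 / (12 * mu)) * (dP / L)
      # Real fractures deviate from this because of wall roughness and matrix leakage.

      def cubic_law_flow(aperture_m, width_m, mu_pa_s, dp_pa, length_m):
          """Volumetric flow rate (m^3/s) through an idealized smooth fracture."""
          return (width_m * aperture_m**3 / (12.0 * mu_pa_s)) * (dp_pa / length_m)

      # Illustrative numbers: 0.5 mm aperture, 10 cm width, water at 20 C, 1 kPa over 1 m.
      print(cubic_law_flow(5e-4, 0.1, 1.0e-3, 1.0e3, 1.0))  # ~1.0e-6 m^3/s
      ```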

    8. Computational Nanophotonics: modeling optical interactions and transport in tailored nanosystem architectures

      SciTech Connect (OSTI)

      Schatz, George; Ratner, Mark

      2014-02-27

      This report describes research by George Schatz and Mark Ratner that was done over the period 10/03-5/09 at Northwestern University. This research project was part of a larger research project with the same title led by Stephen Gray at Argonne. A significant amount of our work involved collaborations with Gray, and there were many joint publications as summarized later. In addition, a lot of this work involved collaborations with experimental groups at Northwestern, Argonne, and elsewhere. The research was primarily concerned with developing theory and computational methods that can be used to describe the interaction of light with noble metal nanoparticles (especially silver) that are capable of plasmon excitation. Classical electrodynamics provides a powerful approach for performing these studies, so much of this research project involved the development of methods for solving Maxwell’s equations, including both linear and nonlinear effects, and examining a wide range of nanostructures, including particles, particle arrays, metal films, films with holes, and combinations of metal nanostructures with polymers and other dielectrics. In addition, our work broke new ground in the development of quantum mechanical methods to describe plasmonic effects based on the use of time dependent density functional theory, and we developed new theory concerned with the coupling of plasmons to electrical transport in molecular wire structures. Applications of our technology were aimed at the development of plasmonic devices as components of optoelectronic circuits, plasmons for spectroscopy applications, and plasmons for energy-related applications.

    9. Unveiling Stability Criteria of DNA-Carbon Nanotubes Constructs by Scanning Tunneling Microscopy and Computational Modeling

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Kilina, Svetlana; Yarotski, Dzmitry A.; Talin, A. Alec; Tretiak, Sergei; Taylor, Antoinette J.; Balatsky, Alexander V.

      2011-01-01

      We present a combined approach that relies on computational simulations and scanning tunneling microscopy (STM) measurements to reveal morphological properties and stability criteria of carbon nanotube-DNA (CNT-DNA) constructs. Application of STM allows direct observation of very stable CNT-DNA hybrid structures with the well-defined DNA wrapping angle of 63.4° and a coiling period of 3.3 nm. Using force field simulations, we determine how the DNA-CNT binding energy depends on the sequence and binding geometry of a single-strand DNA. This dependence allows us to quantitatively characterize the stability of a hybrid structure with an optimal π-stacking between DNA nucleotides and the tube surface and better interpret STM data. Our simulations clearly demonstrate the existence of a very stable DNA binding geometry for (6,5) CNT as evidenced by the presence of a well-defined minimum in the binding energy as a function of an angle between DNA strand and the nanotube chiral vector. This novel approach demonstrates the feasibility of CNT-DNA geometry studies with subnanometer resolution and paves the way towards complete characterization of the structural and electronic properties of drug-delivering systems based on DNA-CNT hybrids as a function of DNA sequence and a nanotube chirality.

    10. DFT modeling of adsorption onto uranium metal using large-scale parallel computing

      SciTech Connect (OSTI)

      Davis, N.; Rizwan, U.

      2013-07-01

      There is a dearth of atomistic simulations involving the surface chemistry of γ-uranium, which is of interest as the key fuel component of a breeder-burner stage in future fuel cycles. Recent availability of high-performance computing hardware and software has rendered extended quantum chemical surface simulations involving actinides feasible. With that motivation, data for bulk and surface γ-phase uranium metal are calculated in the plane-wave pseudopotential density functional theory method. Chemisorption of atomic hydrogen and oxygen on several un-relaxed low-index faces of γ-uranium is considered. The optimal adsorption sites (calculated cohesive energies) on the (100), (110), and (111) faces are found to be the one-coordinated top site (8.8 eV), four-coordinated center site (9.9 eV), and one-coordinated top1 site (7.9 eV) respectively, for oxygen; and the four-coordinated center site (2.7 eV), four-coordinated center site (3.1 eV), and three-coordinated top2 site (3.2 eV) for hydrogen. (authors)
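
      Site energies like those quoted are conventionally obtained as total-energy differences between the adsorbate-covered slab and its isolated parts. A minimal sketch of that bookkeeping follows; the sign convention is one common choice, and all totals are hypothetical placeholders, not values from this study.

      ```python
      # Adsorption (binding) energy from plane-wave DFT total energies, in the
      # common convention  E_ads = E(slab) + E(atom) - E(slab + adsorbate),
      # where positive E_ads means binding. All energies below are placeholders.

      def adsorption_energy(e_slab_ev, e_atom_ev, e_slab_plus_adsorbate_ev):
          return e_slab_ev + e_atom_ev - e_slab_plus_adsorbate_ev

      e_ads = adsorption_energy(-512.30, -1.90, -524.10)  # hypothetical totals
      print(f"E_ads = {e_ads:.2f} eV")                    # 9.90 eV, bound site
      ```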

    11. COMPUTATIONAL AND EXPERIMENTAL MODELING OF THREE-PHASE SLURRY-BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Isaac K. Gamwo; Dimitri Gidaspow

      1999-09-01

      Considerable progress has been achieved in understanding three-phase reactors from the point of view of kinetic theory. In a paper in press for publication in Chemical Engineering Science (Wu and Gidaspow, 1999) we have obtained a complete numerical solution of bubble column reactors. In view of the complexity of the simulation a better understanding of the processes using simplified analytical solutions is required. Such analytical solutions are presented in the attached paper, Large Scale Oscillations or Gravity Waves in Risers and Bubbling Beds. This paper presents analytical solutions for bubbling frequencies and standing wave flow patterns. The flow patterns in operating slurry bubble column reactors are not optimum. They involve upflow in the center and downflow at the walls. It may be possible to control flow patterns by proper redistribution of heat exchangers in slurry bubble column reactors. We also believe that the catalyst size in operating slurry bubble column reactors is not optimum. To obtain an optimum size we are following up on the observation of George Cody of Exxon who reported a maximum granular temperature (random particle kinetic energy) for a particle size of 90 microns. The attached paper, Turbulence of Particles in a CFB and Slurry Bubble Columns Using Kinetic Theory, supports George Cody's observations. However, our explanation for the existence of the maximum in granular temperature differs from that proposed by George Cody. Further computer simulations and experiments involving measurements of granular temperature are needed to obtain a sound theoretical explanation for the possible existence of an optimum catalyst size.

    12. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      SciTech Connect (OSTI)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    13. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    14. Computational Science and Engineering

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NETL's Computational Science and Engineering competency consists of conducting applied scientific research and developing physics-based simulation models, methods, and tools to support the development and deployment of novel process and equipment designs. Research includes advanced computations to generate information beyond the reach of experiments alone by integrating experimental and computational sciences across different length and time scales.

    15. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    16. The Use Of Computational Human Performance Modeling As Task Analysis Tool

      SciTech Connect (OSTI)

      Jacques Hugo; David Gertman

      2012-07-01

      During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
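
      A toy illustration of the task-network idea: sample subtask durations and accumulate per-subtask human-error probabilities over many simulated runs. The subtask names, times, and error probabilities below are invented placeholders for illustration, not values from the ATR study.

      ```python
      import random

      # Walk a fixed sequence of fuel-handling subtasks, sampling each duration
      # and accumulating a human-error probability per subtask.

      random.seed(7)
      TASKS = [  # (name, mean duration s, std dev s, human-error probability)
          ("attach tool", 30.0, 5.0, 1e-3),
          ("lift element", 60.0, 10.0, 5e-4),
          ("walk to station", 45.0, 8.0, 2e-4),
          ("inspect element", 120.0, 20.0, 1e-3),
      ]

      def simulate(n=100_000):
          err_runs, total_t = 0, 0.0
          for _ in range(n):
              t, failed = 0.0, False
              for _, mu, sd, hep in TASKS:
                  t += max(0.0, random.gauss(mu, sd))   # sampled duration
                  failed = failed or (random.random() < hep)
              total_t += t
              err_runs += failed
          return total_t / n, err_runs / n

      mean_t, p_err = simulate()
      print(f"mean task time {mean_t:.0f} s, P(any error) ~ {p_err:.1e}")
      ```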

    17. Mathematical and computational modeling of the diffraction problems by discrete singularities method

      SciTech Connect (OSTI)

      Nesvit, K. V.

      2014-11-12

      The main objective of this study is to reduce the boundary-value problems of the scattering and diffraction of waves on plane-parallel structures to singular or hypersingular integral equations. For these cases we use the method of parametric representations of integral and pseudo-differential operators. Numerical results for the model scattering problems on periodic and boundary gratings, and also on gratings above a flat screen reflector, are presented in this paper.

    18. Computational Intelligence Based Data Fusion Algorithm for Dynamic sEMG and Skeletal Muscle Force Modelling

      SciTech Connect (OSTI)

      Chandrasekhar Potluri; Madhavi Anugolu; Marco P. Schoen; D. Subbaram Naidu

      2013-08-01

      In this work, an array of three surface electromyography (sEMG) sensors is used to acquire muscle extension and contraction signals for 18 healthy test subjects. The skeletal muscle force is estimated from the acquired sEMG signals using a nonlinear Wiener-Hammerstein model, relating the two signals in a dynamic fashion. The model is obtained using a system identification (SI) algorithm. The force models obtained for each sensor are fused using a proposed fuzzy logic concept, with the intent to improve the force estimation accuracy and resilience to sensor failure or misalignment. For the fuzzy logic inference system, the sEMG entropy, the relative error, and the correlation of the force signals are considered in defining the membership functions. The proposed fusion algorithm yields an average of 92.49% correlation between the actual force and the overall estimated force output. In addition, the proposed fusion-based approach is implemented on a test platform. Experiments indicate an improvement in finger/hand force estimation.
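
      A hedged sketch of the fusion step: combine three per-sensor force estimates with normalized weights. The paper derives its weights from a fuzzy inference system over entropy, relative error, and correlation; the correlation-only weighting below is a simplified stand-in, and the signals are synthetic.

      ```python
      import numpy as np

      # Fuse three per-sensor force estimates with weights derived from each
      # channel's correlation with a reference force signal (simplified stand-in
      # for the paper's fuzzy-inference weighting).

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 5.0, 500)
      force_true = np.abs(np.sin(2.0 * np.pi * 0.5 * t))      # reference force
      estimates = [force_true + rng.normal(0.0, s, t.size)    # per-sensor models
                   for s in (0.05, 0.10, 0.20)]

      weights = np.array([np.corrcoef(force_true, est)[0, 1] for est in estimates])
      weights = weights / weights.sum()                       # normalize to 1

      fused = sum(w * est for w, est in zip(weights, estimates))
      print("weights:", np.round(weights, 3))
      print("fused correlation:", round(np.corrcoef(force_true, fused)[0, 1], 4))
      ```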

    19. Computer, Computational, and Statistical Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational physics, computer science, applied mathematics, statistics and the integration of large data streams are central ...

    1. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematical and Computational Epidemiology (MCEpi), Los Alamos National Laboratory. Research areas include agent-based modeling; mixing patterns and social networks; mathematical epidemiology; social internet research; and uncertainty quantification. Quantifying model uncertainty in agent-based simulations for ...

    2. Technical Baseline Summary Description for the Tank Farm Contractor

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

    3. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

      SciTech Connect (OSTI)

      Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

      2013-05-15

      This report provides the results of the comprehensive annulus visual inspection for tanks 241- AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

    4. Revised SRC-I project baseline. Volume 2

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      The SRC Process Area Design Baseline consists of six volumes. The first four were submitted to DOE on 9 September 1981. The fifth volume, summarizing the Category A Engineering Change Proposals (ECPs), was not submitted. The sixth volume, containing proprietary information on Kerr-McGee's Critical Solvent Deashing System, was forwarded to BRHG Synthetic Fuels, Inc. for custody, according to past instructions from DOE, and is available for perusal by authorized DOE representatives. DOE formally accepted the Design Baseline under ICRC Release ECP 4-1001, at the Project Configuration Control Board meeting in Oak Ridge, Tennessee on 5 November 1981. The documentation was then revised by Catalytic, Inc. to incorporate the Category B and C and Post-Baseline Engineering Change Proposals. Volumes I through V of the Revised Design Baseline, dated 22 October 1982, are nonproprietary and they were issued to the DOE via Engineering Change Notice (ECN) 4-1 on 23 February 1983. Volume VI again contains proprietary information on Kerr-McGee's Critical Solvent Deashing System; it was issued to Burns and Roe Synthetic Fuels, Inc. Subsequently, updated process descriptions, utility summaries, and errata sheets were issued to the DOE and Burns and Roe Synthetic Fuels, Inc. on nonproprietary Engineering Change Notices 4-2 and 4-3 on 24 May 1983.

    5. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

      SciTech Connect (OSTI)

      Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

      2012-08-15

      The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human-generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and, using automated data analysis techniques, all were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz{sup -1} on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI, including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
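
      Limits like the one quoted follow from the standard isotropic-emitter relation P_ν = 4πd²S_ν. A small sanity-check sketch is below; the ~1.5 mJy flux-density threshold is an assumed illustrative value chosen to reproduce the order of the 7 MW Hz{sup -1} figure, not a number taken from the paper.

      ```python
      import math

      # Isotropic-emitter power limit from a flux-density limit: P_nu = 4*pi*d^2*S_nu.
      # Gliese 581 is ~20.4 light-years away; the flux threshold is an assumption.

      LY_M = 9.4607e15                      # metres per light-year
      d = 20.4 * LY_M                       # distance to Gliese 581, m
      s_nu = 1.5e-3 * 1e-26                 # 1.5 mJy in W m^-2 Hz^-1 (1 Jy = 1e-26)
      p_nu = 4.0 * math.pi * d**2 * s_nu    # W/Hz, isotropic equivalent
      print(f"P_nu ~ {p_nu / 1e6:.1f} MW/Hz")  # ~7 MW/Hz
      ```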

    6. A Computational Model of the Mark-IV Electrorefiner: Phase I -- Fuel Basket/Salt Interface

      SciTech Connect (OSTI)

      Robert Hoover; Supathorn Phongikaroon; Shelly Li; Michael Simpson; Tae-Sic Yoo

      2009-09-01

      Spent driver fuel from the Experimental Breeder Reactor-II (EBR-II) is currently being treated in the Mk-IV electrorefiner (ER) in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory. The modeling approach to be presented here has been developed to help understand the effect of different parameters on the dynamics of this system. The first phase of this new modeling approach focuses on the fuel basket/salt interface involving the transport of various species found in the driver fuels (e.g. uranium and zirconium). This approach minimizes the guessed parameters to only one, the exchange current density (i0). U3+ and Zr4+ were the only species used for the current study. The result reveals that most of the total cell current is used for the oxidation of uranium, with little being used by zirconium. The dimensionless approach shows that the total potential is a strong function of i0 and a weak function of wt% of uranium in the salt system for initiation processes.
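
      The one guessed parameter the abstract mentions, the exchange current density i0, typically enters electrode/salt interface models through a Butler-Volmer-type rate law. A minimal sketch of that generic rate law follows; the functional form, symmetry factors, and numbers are illustrative assumptions, not the authors' exact implementation.

      ```python
      import math

      # Generic Butler-Volmer current density at an electrode/salt interface:
      #   i = i0 * (exp(alpha_a*n*F*eta/(R*T)) - exp(-alpha_c*n*F*eta/(R*T)))
      # i0 is the exchange current density; all values below are illustrative.

      F, R = 96485.0, 8.314          # C/mol, J/(mol K)

      def butler_volmer(i0, eta_v, temp_k, alpha_a=0.5, alpha_c=0.5, n=3):
          """Net current density (A/m^2) at overpotential eta_v; n=3 for U3+/U."""
          f = n * F / (R * temp_k)
          return i0 * (math.exp(alpha_a * f * eta_v) - math.exp(-alpha_c * f * eta_v))

      # Illustrative: i0 = 100 A/m^2, 50 mV overpotential, 773 K molten salt.
      print(f"{butler_volmer(100.0, 0.05, 773.0):.1f} A/m^2")
      ```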

    7. Revised SRC-I project baseline. Volume 1

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

    8. Computer modeling of Y-Ba-Cu-O thin film deposition and growth

      SciTech Connect (OSTI)

      Burmester, C.; Gronsky, R.; Wille, L.

      1991-07-01

      The deposition and growth of epitaxial thin films of YBa{sub 2}Cu{sub 3}O{sub 7} are modeled by means of Monte Carlo simulations of the deposition and diffusion of Y, Ba, and Cu oxide particles. This complements existing experimental characterization techniques to allow the study of kinetic phenomena expected to play a dominant role in the inherently non-equilibrium thin film deposition process. Surface morphologies and defect structures obtained in the simulated films are found to closely resemble those observed experimentally. A systematic study of the effects of deposition rate and substrate temperature during in-situ film fabrication reveals that the kinetics of film growth can readily dominate the structural formation of the thin film. 16 refs., 4 figs.
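
      A minimal, hedged sketch of the kind of lattice Monte Carlo used for deposition-plus-diffusion studies: a one-component solid-on-solid toy in which higher substrate temperature activates more smoothing hops per deposited particle. The lattice size, barrier, and attempt counts are invented for illustration; the paper's simulations track Y, Ba, and Cu oxide species separately.

      ```python
      import math
      import random

      # Particles deposit at random columns; thermally activated hops then move
      # atoms to lower neighboring columns, smoothing the growing film.

      random.seed(1)
      L, N_DEPOSIT = 64, 2000
      E_BARRIER_EV, KB_EV = 0.2, 8.617e-5

      def grow(temp_k, hop_attempts_per_deposit=50):
          height = [0] * L
          p_hop = math.exp(-E_BARRIER_EV / (KB_EV * temp_k))  # Arrhenius factor
          for _ in range(N_DEPOSIT):
              height[random.randrange(L)] += 1                # deposition event
              for _ in range(hop_attempts_per_deposit):       # diffusion attempts
                  s = random.randrange(L)
                  n = (s + random.choice((-1, 1))) % L
                  if height[s] > height[n] and random.random() < p_hop:
                      height[s] -= 1                          # downhill hop
                      height[n] += 1
          mean = sum(height) / L
          return (sum((h - mean) ** 2 for h in height) / L) ** 0.5  # rms roughness

      for t_k in (400.0, 700.0, 1000.0):
          print(f"T = {t_k:6.1f} K   rms roughness = {grow(t_k):.2f}")
      ```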

    9. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process...

      Energy Savers [EERE]

      EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process. This EVMS Training Snippet sponsored by the Office of Project ...

    10. Microsoft PowerPoint - Snippet 4.6 Baseline Control Methods 20140723...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      to an approved performance baseline, including impacts on the project scope, schedule, design, methods, and cost baselines. The BCP represents a change to one or more of the...

    11. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Almendro, Vanessa; Cheng, Yu -Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege  G.; Helland, Åslaug; et al

      2014-02-01

      Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

    12. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

      SciTech Connect (OSTI)

      Almendro, Vanessa; Cheng, Yu -Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege  G.; Helland, Åslaug; Rye, Inga  H.; Borresen-Dale, Anne -Lise; Maruyama, Reo; van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin  L.; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

      2014-02-01

      Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

    13. Computational Study of Bond Dissociation Enthalpies for Substituted β-O-4 Lignin Model Compounds

      SciTech Connect (OSTI)

      Younker, Jarod M; Beste, Ariana; Buchanan III, A C

      2011-01-01

      The biopolymer lignin is a potential source of valuable chemicals. Phenethyl phenyl ether (PPE) is representative of the dominant β-O-4 ether linkage. Density functional theory (DFT) is used to calculate the Boltzmann-weighted carbon-oxygen and carbon-carbon bond dissociation enthalpies (BDEs) of substituted PPE. These values are important in order to understand lignin decomposition. Exclusion of all conformers that have distributions of less than 5% at 298 K impacts the BDE by less than 1 kcal mol{sup -1}. We find that aliphatic hydroxyl/methylhydroxyl substituents introduce only small changes to the BDEs (0-3 kcal mol{sup -1}). Substitution on the phenyl ring at the ortho position substantially lowers the C-O BDE, except in combination with the hydroxyl/methylhydroxyl substituents, where the effect of methoxy substitution is reduced by hydrogen bonding. Hydrogen bonding between the aliphatic substituents and the ether oxygen in the PPE derivatives has a significant influence on the BDE. CCSD(T)-calculated BDEs and hydrogen bond strengths of ortho-substituted anisoles when compared with M06-2X values confirm that the latter method is sufficient to describe the molecules studied and provide an important benchmark for lignin model compounds.
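
      The "Boltzmann-weighted" BDE is the conformer-population average at temperature T. A minimal sketch of that average follows; the conformer free energies and per-conformer BDEs are hypothetical placeholders, not values from the paper.

      ```python
      import math

      # Boltzmann-weighted average of conformer BDEs at 298 K:
      #   w_i = exp(-dG_i/RT) / sum_j exp(-dG_j/RT);  BDE = sum_i w_i * BDE_i

      R_KCAL = 1.987e-3      # gas constant, kcal mol^-1 K^-1
      T = 298.15

      conformers = [          # (relative free energy, BDE), both kcal/mol
          (0.0, 68.2),
          (0.4, 67.5),
          (1.1, 69.0),
      ]

      weights = [math.exp(-dg / (R_KCAL * T)) for dg, _ in conformers]
      z = sum(weights)
      bde = sum(w / z * b for w, (_, b) in zip(weights, conformers))
      print(f"Boltzmann-weighted BDE = {bde:.1f} kcal/mol")
      ```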

    14. Risk and Vulnerability Assessment Using Cybernomic Computational Models: Tailored for Industrial Control Systems

      SciTech Connect (OSTI)

      Abercrombie, Robert K; Sheldon, Federick T.; Schlicher, Bob G

      2015-01-01

      There are many influencing economic factors to weigh from the defender-practitioner stakeholder point-of-view that involve cost combined with development/deployment models. Some examples include the cost of countermeasures themselves, the cost of training and the cost of maintenance. Meanwhile, we must better anticipate the total cost from a compromise. The return on investment in countermeasures is essentially impact costs (i.e., the costs from violating availability, integrity and confidentiality / privacy requirements). The natural question arises about choosing the main risks that must be mitigated/controlled and monitored in deciding where to focus security investments. To answer this question, we have investigated the cost/benefits to the attacker/defender to better estimate risk exposure. In doing so, it's important to develop a sound basis for estimating the factors that derive risk exposure, such as the likelihood that a threat will emerge and whether it will be thwarted. This impact assessment framework can provide key information for ranking cybersecurity threats and managing risk.

    15. Elucidating reactivity regimes in cyclopentane oxidation: Jet stirred reactor experiments, computational chemistry, and kinetic modeling

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Al Rashidi, Mariam J.; Thion, Sebastien; Togbe, Casimir; Dayma, Guillaume; Mehl, Marco; Dagaut, Philippe; Pitz, William J.; Zador, Judit; Sarathy, S. Mani

      2016-06-22

      This study is concerned with the identification and quantification of species generated during the combustion of cyclopentane in a jet stirred reactor (JSR). Experiments were carried out for temperatures between 740 and 1250 K, equivalence ratios from 0.5 to 3.0, and at an operating pressure of 10 atm. The fuel concentration was kept at 0.1% and the residence time of the fuel/O2/N2 mixture was maintained at 0.7 s. The reactant, product, and intermediate species concentration profiles were measured using gas chromatography and Fourier transform infrared spectroscopy. The concentration profiles of cyclopentane indicate inhibition of reactivity between 850 and 1000 K for φ=2.0 and φ=3.0. This behavior is interesting, as it has not been observed previously for other fuel molecules, cyclic or non-cyclic. A kinetic model including both low- and high-temperature reaction pathways was developed and used to simulate the JSR experiments. The pressure-dependent rate coefficients of all relevant reactions lying on the PES of cyclopentyl + O2, as well as the C-C and C-H scission reactions of the cyclopentyl radical, were calculated at the UCCSD(T)-F12b/cc-pVTZ-F12//M06-2X/6-311++G(d,p) level of theory. The simulations reproduced the unique reactivity trend of cyclopentane and the measured concentration profiles of intermediate and product species. Furthermore, sensitivity and reaction path analyses indicate that this reactivity trend may be attributed to differences in the reactivity of allyl radical at different conditions, and it is highly sensitive to the C-C/C-H scission branching ratio of the cyclopentyl radical decomposition.

    16. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNA® code, is ...

    17. Michael Levitt and Computational Biology

      Office of Scientific and Technical Information (OSTI)

      ... Additional Web Pages: 3 Scientists Win Chemistry Nobel for Complex Computer Modeling, npr Stanford's Nobel Chemistry Prize Honors Computer Science, San Jose Mercury News Without ...

    18. Estimating baseline risks from biouptake and food ingestion at a contaminated site

      SciTech Connect (OSTI)

      MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

      1993-11-01

      Biouptake of contaminants and subsequent human exposure via food ingestion represents a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area does not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.
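
      A minimal sketch of the chronic-daily-intake screening equation that food-ingestion pathway models of this kind typically evaluate (EPA-style exposure algebra); every parameter value below is a generic placeholder rather than a site-specific number.

      ```python
      # Generic chronic-daily-intake (CDI) equation for the food-ingestion pathway:
      #   CDI = (C * IR * EF * ED) / (BW * AT)
      # C: concentration in food, IR: ingestion rate, EF: exposure frequency,
      # ED: exposure duration, BW: body weight, AT: averaging time.

      def chronic_daily_intake(c_mg_per_kg, ir_kg_per_day, ef_days_per_yr,
                               ed_years, bw_kg, at_days):
          """Intake in mg per kg body weight per day."""
          return (c_mg_per_kg * ir_kg_per_day * ef_days_per_yr * ed_years) / (bw_kg * at_days)

      # Placeholder values: 0.5 mg/kg in fish, 54 g/day, 350 d/yr, 30 yr, 70 kg adult.
      cdi = chronic_daily_intake(0.5, 0.054, 350, 30, 70, 30 * 365)
      print(f"CDI = {cdi:.2e} mg/(kg day)")
      ```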

    19. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

      SciTech Connect (OSTI)

      Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

      2012-01-01

      Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

    20. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

      SciTech Connect (OSTI)

      1995-04-01

      The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

    1. Sandia National Laboratories: Advanced Simulation and Computing:

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Simulation and Computing: Computational Systems & Software Environment. Program elements include Integrated Codes, Physics & Engineering Models, Verification & Validation, Facilities Operation & User Support, and Research & Collaboration.

    2. Computational fluid dynamics modeling of two-phase flow in a BWR fuel assembly. Final CRADA Report.

      SciTech Connect (OSTI)

      Tentner, A.; Nuclear Engineering Division

      2009-10-13

      A direct numerical simulation capability for two-phase flows with heat transfer in complex geometries can considerably reduce the hardware development cycle, facilitate the optimization and reduce the costs of testing of various industrial facilities, such as nuclear power plants, steam generators, steam condensers, liquid cooling systems, heat exchangers, distillers, and boilers. Specifically, the phenomena occurring in a two-phase coolant flow in a BWR (Boiling Water Reactor) fuel assembly include coolant phase changes and multiple flow regimes which directly influence the coolant interaction with the fuel assembly and, ultimately, the reactor performance. Traditionally, the best analysis tools for two-phase flow phenomena inside the BWR fuel assembly have been the sub-channel codes. However, the resolution of these codes is too coarse for analyzing the detailed intra-assembly flow patterns, such as flow around a spacer element. Advanced CFD (Computational Fluid Dynamics) codes provide a potential for detailed 3D simulations of coolant flow inside a fuel assembly, including flow around a spacer element, using more fundamental physical models of flow regimes and phase interactions than sub-channel codes. Such models can extend the code applicability to a wider range of situations, which is highly important for increasing efficiency and preventing accidents.

    3. Computational fluid dynamics modeling of chemical looping combustion process with calcium sulphate oxygen carrier - article no. A19

      SciTech Connect (OSTI)

      Baosheng Jin; Rui Xiao; Zhongyi Deng; Qilei Song

      2009-07-01

      Concentrating CO{sub 2} from combustion processes in efficient and energy-saving ways is a first and very important step toward its sequestration. Chemical looping combustion (CLC) could easily achieve this goal. A chemical-looping combustion system consists of a fuel reactor and an air reactor. Two reactors in the form of interconnected fluidized beds are used in the process: (1) a fuel reactor where the oxygen carrier is reduced by reaction with the fuel, and (2) an air reactor where the reduced oxygen carrier from the fuel reactor is oxidized with air. The outlet gas from the fuel reactor consists of CO{sub 2} and H{sub 2}O, while the outlet gas stream from the air reactor contains only N{sub 2} and some unused O{sub 2}. The water in the combustion products can be easily removed by condensation, and pure carbon dioxide is obtained without any loss of energy for separation. Until now, there has been little literature about mathematical modeling of chemical-looping combustion using the computational fluid dynamics (CFD) approach. In this work, the reaction kinetic model of the fuel reactor (CaSO{sub 4} + H{sub 2}) is developed by means of the commercial code FLUENT, and the effects of the partial pressure of H{sub 2} (concentration of H{sub 2}) on chemical looping combustion performance are also studied. The results show that the concentration of H{sub 2} could enhance the CLC performance.

    4. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

      SciTech Connect (OSTI)

      Barbara Chapman

      2012-02-01

      OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years, it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolved in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

    5. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

      SciTech Connect (OSTI)

      Wollenberg, H.A.; Smith, A.R.

      1991-10-01

      A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12--16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.

    6. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    7. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

      SciTech Connect (OSTI)

      Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

      2013-11-15

      Purpose: Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
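
      A linear toy miniature of the joint MAP alternation the abstract describes: estimate the unknowns and the unknown noise variance together, here with a plain gradient step standing in for the paper's monotone conjugate gradient and a linear forward model standing in for the polychromatic one. All sizes and values are illustrative.

      ```python
      import numpy as np

      # Joint MAP toy with unknown noise variance:
      #   minimize  ||A x - y||^2 / (2 s2) + (lam / 2) ||x||^2   over x,
      # while updating s2 in closed form from the current residual.

      rng = np.random.default_rng(3)
      A = rng.normal(size=(80, 20))         # stand-in forward model
      x_true = rng.normal(size=20)
      y = A @ x_true + rng.normal(scale=0.1, size=80)

      x, s2, lam = np.zeros(20), 1.0, 1e-2  # unknowns + quadratic prior weight
      L2 = np.linalg.norm(A.T @ A, 2)       # spectral norm of A^T A
      for _ in range(500):
          r = A @ x - y
          x -= (A.T @ r / s2 + lam * x) / (L2 / s2 + lam)  # 1/Lipschitz step
          s2 = (r @ r) / y.size             # closed-form variance update

      print("estimated noise sigma:", round(float(np.sqrt(s2)), 3))
      print("relative error in x:",
            round(float(np.linalg.norm(x - x_true) / np.linalg.norm(x_true)), 4))
      ```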

    8. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    9. Electromagnetic analysis of forces and torques on the baseline and enhanced ITER shield modules due to plasma disruption.

      SciTech Connect (OSTI)

      Kotulski, Joseph Daniel; Coats, Rebecca Sue; Pasik, Michael Francis; Ulrickson, Michael Andrew

      2009-08-01

      An electromagnetic analysis is performed on the ITER shield modules under different plasma-disruption scenarios using the OPERA-3d software. The models considered include the baseline design as provided by the International Organization and an enhanced design that includes the more realistic geometrical features of a shield module. The modeling procedure is explained, electromagnetic torques are presented, and results of the modeling are discussed.

    10. DEVELOPMENT OF A COMPUTATIONAL MULTIPHASE FLOW MODEL FOR FISCHER TROPSCH SYNTHESIS IN A SLURRY BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Donna Post Guillen; Tami Grimmett; Anastasia M. Gribik; Steven P. Antal

      2010-09-01

      The Hybrid Energy Systems Testing (HYTEST) Laboratory is being established at the Idaho National Laboratory to develop and test hybrid energy systems, with the principal objective of safeguarding U.S. energy security by reducing dependence on foreign petroleum. A central component of HYTEST is the slurry bubble column reactor (SBCR), in which gas-to-liquid reactions will be performed to synthesize transportation fuels using the Fischer-Tropsch (FT) process. SBCRs are cylindrical vessels in which gaseous reactants (for example, synthesis gas, or syngas) are sparged into a slurry of liquid reaction products and finely dispersed catalyst particles. The catalyst particles are suspended in the slurry by the rising gas bubbles and serve to promote the chemical reaction that converts syngas to a spectrum of longer-chain hydrocarbon products, which can be upgraded to gasoline, diesel, or jet fuel. These SBCRs operate in the churn-turbulent flow regime, which is characterized by complex hydrodynamics coupled with reacting flow chemistry and heat transfer, all of which affect reactor performance. The purpose of this work is to develop a computational multiphase fluid dynamics (CMFD) model to aid in understanding the physico-chemical processes occurring in the SBCR. Our team is developing a robust methodology to couple reaction kinetics and mass transfer into a four-field model (consisting of the bulk liquid, small bubbles, large bubbles, and solid catalyst particles) that includes twelve species: the CO reactant, the H2 reactant, the hydrocarbon product, and the H2O product, each tracked in the small bubbles, the large bubbles, and the bulk fluid. Properties of the hydrocarbon product were specified by vapor-liquid equilibrium calculations. The absorption and kinetic models, specifically changes in species concentrations, have been incorporated into the mass continuity equation. The reaction rate is determined based on the macrokinetic model for a cobalt catalyst developed by Yates and Satterfield [1]. The
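
      For reference, the species balance in a multi-field model of this kind typically takes the form below; the symbols are generic multiphase-CFD notation, not taken from the record:

        \frac{\partial}{\partial t}\left(\alpha_k \rho_k Y_{k,i}\right) + \nabla \cdot \left(\alpha_k \rho_k \mathbf{u}_k Y_{k,i}\right) = \Gamma_{k,i} + \alpha_k R_{k,i}

      where \alpha_k, \rho_k, and \mathbf{u}_k are the volume fraction, density, and velocity of field k (bulk liquid, small bubbles, large bubbles, catalyst), Y_{k,i} is the mass fraction of species i (CO, H2, hydrocarbon, H2O), \Gamma_{k,i} is the interfacial mass-transfer source, and R_{k,i} is the kinetic source term.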

    11. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... across energy technologies to effectively address ... participating in the Wind Turbine Radar Interference ... Association AWEA WindPower 2015 event in Orlando, Florida. ...

    12. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    13. Computational Modeling & Simulation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    14. KINETIC MODELING OF A FISCHER-TROPSCH REACTION OVER A COBALT CATALYST IN A SLURRY BUBBLE COLUMN REACTOR FOR INCORPORATION INTO A COMPUTATIONAL MULTIPHASE FLUID DYNAMICS MODEL

      SciTech Connect (OSTI)

      Anastasia Gribik; Donna Guillen, PhD; Daniel Ginosar, PhD

      2008-09-01

      Currently, multi-tubular fixed-bed reactors, fluidized-bed reactors, and slurry bubble column reactors (SBCRs) are used in commercial Fischer-Tropsch (FT) synthesis. The SBCR has a number of advantages over fixed- and fluidized-bed reactors, the main one being that temperature control and heat recovery are more easily achieved. The SBCR is a multiphase chemical reactor in which a synthesis gas, composed mainly of H2 and CO, is bubbled through a liquid hydrocarbon wax containing solid catalyst particles to produce specialty chemicals, lubricants, or fuels. The FT synthesis reaction is the polymerization of methylene groups [-(CH2)-], forming mainly linear alkanes and alkenes ranging from methane to high-molecular-weight waxes. The Idaho National Laboratory is developing a computational multiphase fluid dynamics (CMFD) model of the FT process in an SBCR. This paper discusses the incorporation of absorption and reaction kinetics into the current hydrodynamic model. A phased approach for incorporating the reaction kinetics into the CMFD model is presented here: initially, a simple kinetic model is coupled to the hydrodynamic model, with increasing levels of complexity added in stages. The first phase of the model incorporates the absorption of gas species from both large and small bubbles into the bulk liquid phase. The driving force for the gas across the gas-liquid interface into the bulk liquid depends on the interfacial gas concentration in both small and large bubbles. However, because concentrations at the gas-liquid interface are difficult to measure, coefficients for convective mass transfer have been developed for the overall driving force between the bulk concentrations in the gas and liquid phases. It is assumed that there are no temperature effects from mass transfer of the gas phases to the bulk liquid phase, since only small amounts of gas dissolve in the liquid phase. The product from the
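
      A minimal numerical sketch of the two ingredients described above follows: absorption driven by the overall gas-liquid driving force, and a cobalt macrokinetic rate of the Yates-Satterfield form. The function names and all parameter values are hypothetical placeholders, not values from the study.

        # Sketch: overall-driving-force absorption and a Yates-Satterfield-type
        # cobalt FT rate. All constants are illustrative, not fitted values.

        def absorption_rate(k_l_a, c_star, c_bulk):
            """Volumetric absorption rate, mol/(m^3 s), from the overall driving
            force between the equilibrium and bulk liquid concentrations."""
            return k_l_a * (c_star - c_bulk)

        def ft_rate_cobalt(p_co, p_h2, a=1.0e-2, b=2.0):
            """-R_CO = a*P_CO*P_H2 / (1 + b*P_CO)**2, per unit catalyst mass."""
            return a * p_co * p_h2 / (1.0 + b * p_co) ** 2

        # Example: syngas with H2:CO = 2 (pressures in arbitrary units)
        print(ft_rate_cobalt(p_co=1.0, p_h2=2.0))
        print(absorption_rate(k_l_a=0.1, c_star=5.0, c_bulk=4.2))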

    15. NREL: MIDC/SRRL Baseline Measurement System (39.74 N, 105.18 W, 1829 m, GMT-7)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Solar Radiation Research Laboratory Baseline Measurement System

    16. Modeling and Analysis of a Lunar Space Reactor with the Computer Code RELAP5-3D/ATHENA

      SciTech Connect (OSTI)

      Carbajo, Juan J; Qualls, A L

      2008-01-01

      The transient analysis three-dimensional (3-D) computer code RELAP5-3D/ATHENA has been employed to model and analyze a space reactor of 180 kW (thermal) and 40 kW (net, electrical) with eight Stirling engines (SEs). Each SE will generate over 6 kWe; the excess power is needed for the pumps and other power management devices. The reactor will be cooled by NaK (a eutectic mixture of sodium and potassium that is liquid at ambient temperature). This space reactor is intended to be deployed on the surface of the Moon or Mars. The reactor operating life will be 8 to 10 years. The RELAP5-3D/ATHENA code is being developed and maintained by Idaho National Laboratory. The code can employ a variety of coolants in addition to water, the original coolant employed in early versions of the code. The code can also use 3-D volumes and 3-D junctions, allowing more realistic representation of complex geometries. A combination of 3-D and 1-D volumes is employed in this study. The space reactor model consists of a primary loop and two secondary loops connected by two heat exchangers (HXs). Each secondary loop provides heat to four SEs. The primary loop includes the nuclear reactor with the lower and upper plena, the core with 85 fuel pins, and two vertical HXs. The maximum coolant temperature of the primary loop is 900 K. The secondary loops also employ NaK as a coolant, at a maximum temperature of 877 K. The SE heads are at a temperature of 800 K and the cold sinks are at a temperature of ~400 K. Two radiators will be employed to remove heat from the SEs. The SE HXs surrounding the SE heads are of annular design and have been modeled using 3-D volumes. These 3-D models have been used to improve the HX design by optimizing the coolant flows and maximizing the heat transferred to the SE heads. The transients analyzed include failure of one or more Stirling engines, trip of the reactor pump, and trips of the secondary-loop pumps feeding the HXs of the
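
      The abstract's power figures can be sanity-checked with a back-of-the-envelope balance; the sketch below uses the abstract's numbers where given, while the core temperature rise and the NaK heat capacity are assumptions for illustration only, not values from the RELAP5-3D/ATHENA model.

        # Rough checks on the quoted figures (not a RELAP5-3D/ATHENA result).
        q_thermal = 180e3            # W(t), reactor thermal power (abstract)
        n_se, p_se = 8, 6e3          # eight Stirling engines, ~6 kWe each (abstract)
        p_gross = n_se * p_se        # ~48 kWe gross electrical output
        p_house = p_gross - 40e3     # ~8 kWe left for pumps and power management

        # Hypothetical primary-loop NaK flow for an assumed 100 K core temperature
        # rise; c_p of NaK is roughly 870 J/(kg K) in this temperature range.
        cp_nak, d_t = 870.0, 100.0
        m_dot = q_thermal / (cp_nak * d_t)   # about 2.1 kg/s
        print(p_gross, p_house, round(m_dot, 2))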

    17. Computational Analysis of the Pyrolysis of β-O-4 Lignin Model Compounds: Concerted vs. Homolytic Fragmentation

      SciTech Connect (OSTI)

      Clark, J. M.; Robichaud, D. J.; Nimlos, M. R.

      2012-01-01

      The thermochemical conversion of biomass to liquid transportation fuels is a very attractive technology for expanding the utilization of carbon-neutral processes and reducing dependency on fossil fuel resources. As with all such emerging technologies, biomass conversion through gasification or pyrolysis has a number of obstacles that must be overcome to make these processes cost-competitive with the refining of fossil fuels. Our current efforts have focused on the investigation of the thermochemistry of the linkages between lignin units using ab initio calculations on dimeric lignin model compounds. All calculations were carried out using M06-2X density functional theory with the 6-311++G(d,p) basis set. The M06-2X method has been shown to be consistent with the CBS-QB3 method while being significantly less computationally expensive. To date we have completed the study only for the β-O-4 compounds. The theoretical calculations performed in this study indicate that concerted elimination pathways dominate over bond homolysis reactions under typical pyrolysis conditions. However, this does not mean that concerted elimination will be the dominant loss process for lignin: bimolecular radical chemistry could well dwarf the unimolecular pathways investigated here. These concerted pathways tend to form stable, reasonably nonreactive products that would be better suited to producing a fungible bio-oil for the production of liquid transportation fuels.
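
      The competition between the concerted and homolytic channels reduces, through transition state theory, to the difference in their free-energy barriers; in the standard Eyring form (generic notation, not from the record):

        k_i = \frac{k_B T}{h} \exp\left(-\frac{\Delta G_i^{\ddagger}}{RT}\right), \qquad \frac{k_{\mathrm{concerted}}}{k_{\mathrm{homolysis}}} = \exp\left(-\frac{\Delta G_{\mathrm{concerted}}^{\ddagger} - \Delta G_{\mathrm{homolysis}}^{\ddagger}}{RT}\right)

      so a difference of even a few kcal/mol in barrier heights translates into orders of magnitude in the branching ratio at pyrolysis temperatures.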

    18. Kinetic analysis of the phenyl-shift reaction in β-O-4 lignin model compounds: A computational study

      SciTech Connect (OSTI)

      Beste, Ariana; Buchanan III, A C

      2011-01-01

      The phenyl-shift reaction in β-phenethyl phenyl ether (β-PhCH2CH2OPh, β-PPE) is an integral step in the pyrolysis of PPE, which is a model compound for the β-O-4 linkage in lignin. We investigated the influence of naturally occurring substituents (hydroxy, methoxy) on the reaction rate by calculating relative rate constants using density functional theory in combination with transition state theory, including anharmonic corrections for low-frequency modes. The phenyl-shift reaction proceeds through an intermediate, and the overall rate constants were computed by invoking the steady-state approximation (whose validity was confirmed). Substituents on the phenethyl group have little influence on the rate constants. If a methoxy substituent is located in the para position of the phenyl ring adjacent to the ether oxygen, the energies of the intermediate and of the second transition state are lowered, but the overall rate constant is not significantly altered; this is because the first step, from pre-complex to intermediate, dominates the overall rate constant. o-Methoxy and di-o-methoxy substituents accelerate the phenyl-migration rate relative to β-PPE.
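
      The steady-state construction referred to above can be written out for a generic two-step mechanism (the rate constants below are labels, not values from the study). For a reactant R passing through intermediate I to product P,

        R \underset{k_{-1}}{\overset{k_1}{\rightleftharpoons}} I \overset{k_2}{\longrightarrow} P, \qquad k_{\mathrm{overall}} = \frac{k_1 k_2}{k_{-1} + k_2}

      where setting d[I]/dt = 0 yields the overall constant. When k_2 \gg k_{-1}, this reduces to k_overall ≈ k_1: lowering the second barrier (as the para-methoxy substituent does) then leaves the overall rate nearly unchanged, which is exactly the behavior reported above.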

    19. Computational modeling predicts simultaneous targeting of fibroblasts and epithelial cells is necessary for treatment of pulmonary fibrosis

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Warsinske, Hayley C.; Wheaton, Amanda K.; Kim, Kevin K.; Linderman, Jennifer J.; Moore, Bethany B.; Kirschner, Denise E.

      2016-06-23

      Pulmonary fibrosis is pathologic remodeling of lung tissue that can result in difficulty breathing, reduced quality of life, and a poor prognosis for patients. Fibrosis occurs as a result of insult to lung tissue, though the mechanisms of this response are not well characterized. The disease is driven in part by dysregulation of fibroblast proliferation and differentiation into myofibroblast cells, as well as by pro-fibrotic mediator-driven epithelial cell apoptosis. The best-characterized pro-fibrotic mediator associated with pulmonary fibrosis is TGF-β1. Excessive synthesis of, and sensitivity to, pro-fibrotic mediators, together with insufficient production of and sensitivity to anti-fibrotic mediators, has been credited with enabling fibroblast accumulation. Available treatments neither halt nor reverse lung damage. In this study we have two aims: to identify molecular- and cellular-scale mechanisms driving fibroblast proliferation and differentiation as well as epithelial cell survival in the context of fibrosis, and to predict therapeutic targets and strategies. We combine in vitro studies with a multi-scale hybrid agent-based computational model that describes fibroblasts and epithelial cells in co-culture. Within this model TGF-β1 represents a pro-fibrotic mediator, and we include detailed dynamics of TGF-β1 receptor-ligand signaling in fibroblasts; PGE2 represents an anti-fibrotic mediator. Using uncertainty and sensitivity analysis, we identify TGF-β1 synthesis, TGF-β1 activation, and PGE2 synthesis among the key mechanisms contributing to fibrotic outcomes. We further demonstrate that intervention strategies combining potential therapeutics targeting both fibroblast regulation and epithelial cell survival can promote healthy tissue repair better than individual strategies. Combinations of existing drugs and compounds may provide significant improvements to the current standard of care for pulmonary fibrosis. In conclusion, a two-hit therapeutic
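
      A minimal sketch of the uncertainty-and-sensitivity workflow follows, with a toy scalar surrogate standing in for the agent-based model. The parameter names, ranges, and surrogate function are hypothetical, and plain Spearman rank correlation is used here for brevity where a study of this kind would typically use partial rank correlation coefficients.

        # Latin hypercube sampling over three hypothetical parameters, then rank
        # correlation of each with a toy "fibrosis score" surrogate.
        import numpy as np
        from scipy.stats import qmc, spearmanr

        params = ["tgfb1_synthesis", "tgfb1_activation", "pge2_synthesis"]
        lo, hi = [0.1, 0.1, 0.1], [10.0, 10.0, 10.0]

        sampler = qmc.LatinHypercube(d=len(params), seed=0)
        x = qmc.scale(sampler.random(n=200), lo, hi)

        def toy_fibrosis_score(p):
            # Hypothetical surrogate: pro-fibrotic mediators raise the score,
            # the anti-fibrotic mediator (PGE2) lowers it.
            return p[0] * p[1] / (1.0 + p[2])

        y = np.apply_along_axis(toy_fibrosis_score, 1, x)
        for name, col in zip(params, x.T):
            rho, _ = spearmanr(col, y)
            print(f"{name}: rho = {rho:+.2f}")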

    20. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

      SciTech Connect (OSTI)

      Katya Le Blanc; Johanna Oxstrand

      2012-04-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for errors associated with procedure use. As a step toward the goal of improving procedure-use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room, chiefly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less-studied application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.