National Library of Energy BETA

Sample records for baseline computational model

  1. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS's use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  2. Integrated Baseline System (IBS) Version 1.03: Models guide

    SciTech Connect (OSTI)

    Not Available

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  3. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    E-Print Network [OSTI]

    LBNL-5096E. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing. University of California. … Using a regression-based baseline model, we define several Demand Response (DR) parameters …

  4. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    SciTech Connect (OSTI)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
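    Prediction quality of the kind described above is conventionally quantified with goodness-of-fit statistics such as CV(RMSE) and NMBE (as in ASHRAE Guideline 14); the report's exact metrics are not reproduced here, so the following is only an illustrative sketch, with hypothetical function names and sample values:

```python
import numpy as np

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE, as a percent of the mean load."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return 100.0 * rmse / np.mean(actual)

def nmbe(actual, predicted):
    """Normalized mean bias error, as a percent of the mean load."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.sum(actual - predicted) / (len(actual) * np.mean(actual))

# Score a vendor's predictions against metered monthly energy use (kWh, made up)
metered   = [520.0, 480.0, 455.0, 410.0, 600.0, 720.0]
predicted = [500.0, 470.0, 465.0, 430.0, 590.0, 700.0]
print(cv_rmse(metered, predicted), nmbe(metered, predicted))
```

    A testing protocol of this kind only needs the vendor's predicted series, never the model internals, which is the point the abstract makes.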

  5. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    SciTech Connect (OSTI)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
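    A minimal sketch of the regression-based baseline idea described above, assuming a temperature-only model and synthetic data (the paper's actual regressors, data, and error analysis are far more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interval data on non-event ("baseline") days:
# outdoor temperature (deg C) and whole-building load (kW).
temp_base = rng.uniform(15, 35, 200)
load_base = 50 + 2.0 * temp_base + rng.normal(0, 3, 200)  # assumed true relation

# Fit a simple temperature-regression baseline: load ~ b0 + b1 * temp
X = np.column_stack([np.ones_like(temp_base), temp_base])
coef, *_ = np.linalg.lstsq(X, load_base, rcond=None)

# Predict the counterfactual on a DR event day and estimate the load shed
temp_event = np.array([30.0, 32.0, 31.0])
load_event = np.array([95.0, 98.0, 96.0])       # observed (curtailed) demand
baseline = coef[0] + coef[1] * temp_event       # counterfactual prediction
shed = baseline - load_event                    # kW reduction attributed to DR
print(shed.round(1))
```

    The paper's central point is that the uncertainty in `baseline` propagates directly into `shed`, so apparent event-to-event variability may be baseline model error rather than real response variability.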

  6. Baseline requirements of the proposed action for the Transportation Management Division routing models

    SciTech Connect (OSTI)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements of the models and databases to expand their usefulness. The results of the Baseline Requirements Assessment Session will be discussed in this report. The discussions pertaining to the different models are contained in separate sections.

  7. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

    SciTech Connect (OSTI)

    University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

    2012-06-13

    Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and data filtration methods as well as an immediate potential for automation of methods to choose building occupied modes.

  8. computational modeling of biological systems

    E-Print Network [OSTI]


    Faculty listing for "computational modeling of biological systems" … Research Interests: computational modeling of biological systems.

  9. Development of Baseline Monthly Utility Models for Fort Hood, Texas 

    E-Print Network [OSTI]

    Reddy, T. A.; Saman, N. F.; Claridge, D. E.; Haberl, J. S.; Turner, W. D.; Chalifoux, A.

    1996-01-01

    … + f_{3,c} * DD_c(τ)   (3), where DD(τ) are the degree-days to the base τ, and the subscripts c and h stand for cooling and heating, respectively. Note that eqs. (1) and (2) represent a model with three regression parameters, i.e., a 3-P model, while eq…
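    The degree-day quantities behind these monthly utility models can be sketched in code; the function names and the reduced one-term cooling form below are illustrative only, not the paper's exact formulation:

```python
def degree_days(daily_mean_temps, base_temp, mode="cooling"):
    """Degree-days DD(tau) to base tau, summed from daily mean temperatures."""
    if mode == "cooling":
        return sum(max(t - base_temp, 0.0) for t in daily_mean_temps)
    return sum(max(base_temp - t, 0.0) for t in daily_mean_temps)

def three_p_cooling(f1, f2, daily_mean_temps, base_temp):
    """Reduced 3-P cooling form: monthly energy = f1 + f2 * DD_c(tau).

    The base temperature tau is itself the third regression parameter,
    typically found by a search over candidate bases.
    """
    return f1 + f2 * degree_days(daily_mean_temps, base_temp, "cooling")

# Two hot days against an 18 deg C base: DD_c = 2 + 7 = 9
print(three_p_cooling(100.0, 2.0, [20.0, 25.0], 18.0))
```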

  10. Development of Baseline Monthly Utility Models for Fort Hood, Texas 

    E-Print Network [OSTI]

    Reddy, T. A.; Saman, N. F.; Claridge, D. E.; Haberl, J. S.; Turner, W. D.; Chalifoux, A.

    1996-01-01

    …available weather data for Temple, Texas covered only through May 1994. In view of the objectives of this study, it was decided to limit the present analysis to the cantonment area level, using January 1989 to December 1993 data only. … the effects of parameters … of the presence of functional discontinuities, called "change points". A widely adopted convention is to refer to a single-variable model with, say, three parameters as a 3-P SV model. This study will limit itself to SV models only, and consequently the term…

  11. Numerical Modeling of CIGS Solar Cells: Definition of the Baseline and

    E-Print Network [OSTI]

    Sites, James R.

    Thesis prepared under our supervision by Markus Gloeckler, entitled "Numerical Modeling of CIGS Solar Cells: Definition of the Baseline and Explanation …". A three-layer structure, simulating a Cu(InGa)Se2 (CIGS) heterojunction solar cell, was set up using …

  12. NUMERICAL MODELING OF CIGS AND CdTe SOLAR CELLS: SETTING THE BASELINE

    E-Print Network [OSTI]

    Sites, James R.

    NUMERICAL MODELING OF CIGS AND CdTe SOLAR CELLS: SETTING THE BASELINE. M. Gloeckler et al. … important complications that are often found in experimental CIGS and CdTe solar cells. 1. INTRODUCTION: Numerical modeling of polycrystalline thin-film solar cells is an important strategy to test the viability …

  13. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience data and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  14. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2013-05-15

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience data and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  15. Develop baseline computational model for proactive welding stress management to suppress helium …

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site


  16. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    E-Print Network [OSTI]

    A. T. Deller; S. J. Tingay; M. Bailes; C. West

    2007-02-06

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software and the fact that the highly parallel and scalable nature of the correlation task is well suited to a multi-processor computing environment. We suggest scientific applications where such an approach to VLBI correlation is most suited and will give the best returns. We report detailed results from the Distributed FX (DiFX) software correlator, running on the Swinburne supercomputer (a Beowulf cluster of approximately 300 commodity processors), including measures of the performance of the system. For example, to correlate all Stokes products for a 10 antenna array, with an aggregate bandwidth of 64 MHz per station and using typical time and frequency resolution presently requires of order 100 desktop-class compute nodes. Due to the effect of Moore's Law on commodity computing performance, the total number and cost of compute nodes required to meet a given correlation task continues to decrease rapidly with time. We show detailed comparisons between DiFX and two existing hardware-based correlators: the Australian Long Baseline Array (LBA) S2 correlator, and the NRAO Very Long Baseline Array (VLBA) correlator. In both cases, excellent agreement was found between the correlators. Finally, we describe plans for the future operation of DiFX on the Swinburne supercomputer, for both astrophysical and geodetic science.
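    The FX approach itself (Fourier-transform each station's stream, then cross-multiply) can be illustrated in a few lines. This toy single-baseline sketch on synthetic data omits delay tracking, fringe rotation, and everything else a real correlator like DiFX must handle:

```python
import numpy as np

def fx_correlate(x, y, nchan):
    """Minimal FX correlation: channelize with an FFT (F), then cross-multiply (X).

    x, y: real sampled voltage streams from two stations (already delay-aligned).
    Returns the time-averaged cross-power spectrum (the visibility spectrum).
    """
    nseg = len(x) // nchan
    X = np.fft.rfft(x[: nseg * nchan].reshape(nseg, nchan), axis=1)
    Y = np.fft.rfft(y[: nseg * nchan].reshape(nseg, nchan), axis=1)
    return (X * np.conj(Y)).mean(axis=0)

# A tone common to both streams appears as a peak in the cross-power spectrum,
# while the independent noise in each stream averages down.
rng = np.random.default_rng(1)
t = np.arange(8192)
tone = np.sin(2 * np.pi * 0.25 * t)           # correlated signal in both streams
x = tone + rng.normal(0, 1, t.size)
y = tone + rng.normal(0, 1, t.size)
spec = fx_correlate(x, y, nchan=64)
print(np.argmax(np.abs(spec)))                # channel where the tone correlates
```

    The highly parallel structure the abstract refers to is visible here: each segment's FFT and cross-multiplication is independent, so the work distributes naturally across cluster nodes.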

  17. Results from baseline tests of the SPRE I and comparison with code model predictions

    SciTech Connect (OSTI)

    Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

    1994-09-01

    The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high capacity space power. This paper presents results of baseline engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

  18. Typologies of Computation and Computational Models

    E-Print Network [OSTI]

    Mark Burgin; Gordana Dodig-Crnkovic

    2013-12-09

    We need a much better understanding of information processing and of computation as its primary form. Future progress on new computational devices capable of dealing with problems of big data, the internet of things, the semantic web, cognitive robotics, and neuroinformatics depends on adequate models of computation. In this article we first present the current state of the art through a systematization of existing models and mechanisms, and outline a basic structural framework of computation. We argue that, defining computation as information processing, and given that there is no information without (physical) representation, the dynamics of information on the fundamental level is physical/intrinsic/natural computation. As a special case, intrinsic computation is used for designed computation in computing machinery. Intrinsic natural computation occurs on a variety of levels of physical processes, including the levels of computation of living organisms (including highly intelligent animals) as well as designed computational devices. The present article offers a typology of current models of computation and indicates future paths for the advancement of the field, both through the development of new computational models and by learning from nature how to better compute using different mechanisms of intrinsic computation.

  19. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    SciTech Connect (OSTI)

    Iovenitti, Joe

    2014-01-02

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience data and, most importantly, well data. The overall project area is 2500 km2, with the Calibration Area (Dixie Valley Geothermal Wellfield) being about 170 km2. The project was subdivided into five tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area at 0.5 km intervals to identify EGS drilling targets at a scale of 5 km x 5 km; (4) collect new geophysical and geochemical data; and (5) repeat Task 3 for the enhanced (baseline + new) data. Favorability maps were based on the integrated assessment of the three critical EGS exploration parameters of interest: rock type, temperature, and stress. A complementary trust map was generated to accompany the favorability maps and graphically illustrate the cumulative confidence in the data used in the favorability mapping. The Final Scientific Report (FSR) is submitted in two parts, with Part I describing the results of project Tasks 1 through 3 and Part II covering the results of project Tasks 4 through 5, plus answering nine questions posed in the proposal for the overall project.
    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps.
As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  20. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps.
As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  1. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308-station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales: (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps.
As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

  2. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

    SciTech Connect (OSTI)

    Maurakis, Eugene G

    2010-10-01

    Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts of climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity, that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people’s awareness, knowledge, and understanding of climate change and its impacts on watersheds through a laboratory experience and interactive exhibits, internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity accurately during most of the six thermal seasons where sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately distinguishing local human influences (e.g., urbanization and watershed alteration) from global anthropogenic impacts (e.g., climate change) on watersheds. Effective and skillful translation of scientific results (e.g., the annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) is a valuable way of communicating information to the general public and enhancing their understanding of climate change and its effects on watersheds.
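    The richness and diversity quantities this study models are conventionally computed as species counts and Shannon indices. A small illustrative sketch (the counts below are hypothetical, and the study's actual regression models are not reproduced):

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def species_richness(counts):
    """Richness is simply the number of species present in the sample."""
    return sum(1 for c in counts if c > 0)

# Hypothetical fish counts per species from one seasonal sample
counts = [40, 25, 20, 10, 5, 0]
print(species_richness(counts), round(shannon_diversity(counts), 3))
```

    Baseline values of these indices per thermal season are what future samples would be compared against to detect climate-related shifts.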

  3. Statistical Analysis of Baseline Load Models for Non-Residential Buildings

    E-Print Network [OSTI]

    Coughlin, Katie

    2012-01-01

    M. Potter, The Demand Response Baseline, v.1.75, EnerNOC OPS; … and Techniques for Demand Response, Lawrence Berkeley …; S. Kiliccote, Estimating Demand Response Load Impacts: …

  4. Description of Model Data for SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine

    E-Print Network [OSTI]

    D. Todd Griffith, Brian R. Resor; Sandia National Laboratories Wind and Water Power Technologies. Description of Model Data for SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine Blade (version and date, description, etc.). A summary of the blade model data is also provided from the design …

  5. MIT Integrated Global System Model (IGSM) Version 2: Model Description and Baseline Evaluation

    E-Print Network [OSTI]

    Sokolov, Andrei P.

    The MIT Integrated Global System Model (IGSM) is designed for analyzing the global environmental changes that may result from anthropogenic causes, quantifying the uncertainties associated with the projected changes, and ...

  6. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

    SciTech Connect (OSTI)

    Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

    2012-06-01

    The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, combined with lower energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.

  7. Computational Models for Understanding Weather

    E-Print Network [OSTI]

    Muraki, David J.

Computational Models for Understanding Weather: Mathematics for Atmospheric Science (lecture slides; topics include the zonal jet stream in unstable weather and baroclinic instability vortices).

  8. Modeling of Electric Water Heaters for Demand Response: A Baseline PDE Model

    SciTech Connect (OSTI)

    Xu, Zhijie; Diao, Ruisheng; Lu, Shuai; Lian, Jianming; Zhang, Yu

    2014-09-05

Demand response (DR) control can effectively relieve balancing and frequency regulation burdens on conventional generators, facilitate the integration of more renewable energy, and reduce the generation and transmission investments needed to meet peak demands. Electric water heaters (EWHs) have great potential for implementing DR control strategies because: (a) EWH power consumption is highly correlated with daily load patterns; (b) they constitute a significant percentage of domestic electrical load; (c) the heating element is a resistor, with no reactive power consumption; and (d) they can be used as energy storage devices when needed. Accurately modeling the dynamic behavior of EWHs is essential for designing DR controls. Various water heater models, simplified to different extents, have been published in the literature; however, few of them were validated against field measurements, which may lead to inaccuracy when implementing DR controls. In this paper, a physics-based partial differential equation model, developed to capture detailed temperature profiles at different tank locations, is validated against field test data covering more than 10 days. The developed model shows very good performance in capturing water thermal dynamics for benchmark testing purposes.
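The PDE approach described in this abstract resolves the temperature profile along the tank height. A heavily simplified, hypothetical sketch of the idea: a 1-D explicit finite-difference scheme with diffusion and a single heating node (inlet draw, mixing, and standby losses are omitted, and all parameter values are illustrative, not field-calibrated):

```python
import numpy as np

def simulate_tank(n=20, height=1.2, t_end=600.0, dt=0.5,
                  T0=50.0, alpha=1.5e-7, heater_node=5,
                  heater_on=True, q_heat=0.02):
    """Explicit finite-difference sketch of a 1-D tank temperature profile.

    T[i] is the water temperature (deg C) at vertical node i.
    alpha  -- thermal diffusivity of water (m^2/s)
    q_heat -- temperature rise per second from the element (K/s),
              an illustrative value only.
    """
    dz = height / n
    T = np.full(n, T0)
    for _ in range(int(t_end / dt)):
        Tn = T.copy()
        # interior nodes: pure diffusion (draw/advection omitted for brevity)
        Tn[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2*T[1:-1] + T[:-2])
        # insulated (zero-gradient) boundaries
        Tn[0], Tn[-1] = Tn[1], Tn[-2]
        if heater_on:
            Tn[heater_node] += q_heat * dt  # element heats one node
        T = Tn
    return T
```

Running the sketch produces a profile that peaks at the heater node, the qualitative behavior a full PDE model captures in much greater fidelity.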

  9. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Simple computer models unravel genetic stress reactions in cells. Integrated biological and computational methods...

  10. Energy Analysis, Baselining and Modeling of Prairie View A&M University 

    E-Print Network [OSTI]

    Abushakra, B.; Haberl, J. S.; Claridge, D. E.; Eggebrecht, J.; Carlson, K. A.

    1998-01-01

Analysis of the available data found that electricity savings in the J.B. Coleman Library for June through September 1998 were 298 MWh, or 38% of the baseline consumption during these months. Extrapolation of these savings to a full year would...

  11. A baseline model for utility bill analysis using both weather and non-weather-related variables

    SciTech Connect (OSTI)

Sonderegger, R.C. [SRC Systems, Inc., Berkeley, CA (United States)]

    1998-12-31

Many utility bill analyses in the literature rely only on weather-based correlations. While often the dominant cause of seasonal variations in utility consumption, weather variables are far from the only determinant factors. Vacation shutdowns, plug creep, changes in building operation and square footage, and plain poor correlation are all too familiar to the practicing performance contractor. This paper presents a generalized baseline equation, consistent with prior results by others but extended to include other, non-weather-related independent variables. Its compatibility with extensive prior research by others is shown, as well as its application to several types of facilities. The baseline equation, as presented, can accommodate up to five simultaneous independent variables for a maximum of eight free parameters. The use of two additional, empirical degree-day threshold parameters is also discussed. The baseline equation presented here forms the basis of a commercial utility accounting software program. All case studies presented to illustrate the development of the baseline equation for each facility are drawn from real-life studies performed by users of this program.
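A baseline equation of the kind described combines weather terms (degree-days with empirical base-temperature thresholds) with non-weather regressors. A minimal sketch of how such a model might be fitted, with hypothetical function names, a single heating degree-day term, and a simple grid search over the base temperature (a real implementation would also handle cooling degree-days and more variables):

```python
import numpy as np

def degree_days(temps, base):
    """Heating degree-days summed per billing period.
    temps: list of per-period arrays of daily mean outdoor temperature."""
    return np.array([np.clip(base - t, 0.0, None).sum() for t in temps])

def fit_baseline(use, temps, extra=None, bases=range(10, 21)):
    """Grid-search the degree-day base temperature, then fit by ordinary
    least squares: use = b0 + b1*HDD(base) + (non-weather terms).

    use   -- utility consumption per billing period
    temps -- per-period arrays of daily mean outdoor temperature (deg C)
    extra -- optional list of non-weather regressor arrays (occupancy, ...)
    """
    best = None
    for b in bases:
        cols = [np.ones(len(use)), degree_days(temps, b)]
        if extra is not None:
            cols.extend(extra)
        X = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(X, use, rcond=None)
        sse = float(((use - X @ coef) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, b, coef)
    return best  # (sse, best_base, coefficients)
```

On synthetic consumption generated from a known base temperature, the grid search recovers that threshold along with the slope and intercept.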

  12. LANL computer model boosts engine efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

LANL computer model boosts engine efficiency. The KIVA model has been instrumental in helping researchers and manufacturers understand...

  13. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    SciTech Connect (OSTI)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure, whole-building-focused utility efficiency programs. This report documents the findings of a study conducted to begin answering critical questions regarding quantification of savings at the whole-building level and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of (a) overall robustness and (b) performance relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools, and how might one handle practical issues associated with data security, intellectual property, appropriate testing 'blinds', and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building M&V? Additional project objectives addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks, and evaluation criteria; (2) providing guidance for determining customer suitability for baseline modeling; (3) describing the portfolio-level effects of baseline model estimation errors; (4) informing PG&E's development of EMIS technology product specifications; and (5) providing the analytical foundation for future studies about baseline modeling and the savings effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
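Evaluating baseline model performance, as in the first research question, typically relies on normalized goodness-of-fit statistics. A minimal sketch of two widely used metrics, CV(RMSE) and NMBE, in the style of the ASHRAE Guideline 14 definitions (the function name is hypothetical, and the degrees-of-freedom adjustment follows that common convention):

```python
import numpy as np

def baseline_metrics(measured, predicted, n_params=2):
    """Return (CV(RMSE) %, NMBE %) for a baseline model's predictions.

    measured, predicted -- energy use per interval
    n_params            -- number of fitted model parameters, used in the
                           degrees-of-freedom adjustment (n - p).
    """
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    n = len(measured)
    resid = measured - predicted
    mean = measured.mean()
    cvrmse = np.sqrt((resid ** 2).sum() / (n - n_params)) / mean * 100.0
    nmbe = resid.sum() / ((n - n_params) * mean) * 100.0
    return cvrmse, nmbe
```

A perfect model scores zero on both; a systematic under-prediction shows up as a positive NMBE even when CV(RMSE) looks acceptable, which is why both are reported together.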

  14. Improved computer models support genetics research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Simple computer models unravel genetic stress reactions in cells. Integrated biological and...

  15. Computer Graphics Hierarchical Modeling

    E-Print Network [OSTI]

Useful for static models, and vital for animation! The key is to structure the transformations: translate the base by (5,0,0), translate the lower arm by (5,0,0), translate the upper arm by (5,0,0), translate the hammer by (5,0,0) ... each part

  16. Challenges for the CMS computing model in the first year

    SciTech Connect (OSTI)

    Fisk, I.; /Fermilab

    2009-05-01

CMS is in the process of commissioning a complex detector and a globally distributed computing infrastructure simultaneously. This represents a unique challenge. Even at the beginning, there are not sufficient analysis or organized processing resources at CERN alone. In this presentation we discuss the unique computing challenges CMS expects to face during the first year of running and how they influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on in order to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection, with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. The large event formats affect the required storage, bandwidth, and processing capacity across all the computing centers. While understanding of the detector and the event selections is being improved, there will likely be a larger number of reconstruction passes and skims performed by both central operations and individual users. We discuss how these additional stresses affect the allocation of resources and the changes from the baseline computing model.

  17. Cosmic Logic: a Computational Model

    E-Print Network [OSTI]

    Vitaly Vanchurin

    2015-07-05

We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as its description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take a CO's Turing number as input, but output one if the CO machines are in the same equivalence class and zero otherwise. We argue that CS machines are more fundamental than CM machines and thus should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine that discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities using cut-off prescriptions, or that all of the cut-off measures are non-computable.

  18. Parallel computing in enterprise modeling.

    SciTech Connect (OSTI)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation that includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic, and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that would greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. That said, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale required for realistic results. With the recent upheavals in the financial markets and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  19. Assessing the economic impact of indirect liquefaction process improvements: Volume 1, Development of the integrated indirect liquefaction model and baseline case

    SciTech Connect (OSTI)

    Gray, D.; Tomlinson, G.C. (Mitre Corp., McLean, VA (USA). Civil Systems Div.)

    1990-10-01

This report documents the development of an integrated indirect liquefaction system model, which processes input coal to refined liquid products, and the model's application in the analysis of a baseline case. The baseline case uses Shell gasification of coal followed by gas cleaning to produce a clean synthesis gas for slurry-phase Fischer-Tropsch synthesis. The raw liquid products are refined to produce gasoline and diesel. Costs of liquid products have been estimated for the baseline plant. The model also allows many sensitivity studies to be performed so that the economic impacts of research and development advances can be quantified. When used in this manner, the model can provide research guidance for future indirect liquefaction studies. 18 refs., 12 figs., 12 tabs.

  20. Quantum computation beyond the circuit model

    E-Print Network [OSTI]

    Jordan, Stephen Paul

    2008-01-01

    The quantum circuit model is the most widely used model of quantum computation. It provides both a framework for formulating quantum algorithms and an architecture for the physical construction of quantum computers. However, ...

  1. Cosmic Logic: a Computational Model

    E-Print Network [OSTI]

    Vanchurin, Vitaly

    2015-01-01

We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as its description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take a CO's Turing number as input, but output one if the CO machines are in the same equivalence class and zero otherwise. We argue that CS machines are more fundamental than CM machines and thus should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine that discriminates between two classes of CO machines: mortal machines, which halt in finite time, and immortal machines, which run forever. In the context of eternal inflation this result implies...

  2. Hydropower Baseline Cost Modeling

    SciTech Connect (OSTI)

    O'Connor, Patrick W.; Zhang, Qin Fen; DeNeale, Scott T.; Chalise, Dol Raj; Centurion, Emma E.

    2015-01-01

Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in powering existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and constructing new pumped-storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost-estimating tools that can support the national-scale evaluation of hydropower resources.
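Hydropower cost-estimating tools of this kind commonly express initial capital cost as a power-law function of capacity and head. A hedged sketch of fitting such a curve from sample data (the form ICC = a·P^b·H^c is a common convention in this literature; the coefficients below come only from whatever data is supplied, not from the report's published equations):

```python
import numpy as np

def fit_cost_curve(capacity_mw, head_m, cost_usd):
    """Fit ICC = a * P^b * H^c by ordinary least squares in log space:
    log(cost) = log(a) + b*log(P) + c*log(H)."""
    X = np.column_stack([np.ones(len(cost_usd)),
                         np.log(capacity_mw), np.log(head_m)])
    coef, *_ = np.linalg.lstsq(X, np.log(cost_usd), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]  # a, b, c

def predict_cost(a, b, c, capacity_mw, head_m):
    """Evaluate the fitted power-law cost curve."""
    return a * capacity_mw ** b * head_m ** c
```

Fitting in log space turns the multiplicative model into a linear regression, so standard least squares recovers the exponents directly.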

  3. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  4. Cupola Furnace Computer Process Model

    SciTech Connect (OSTI)

    Seymour Katz

    2004-12-31

The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings made annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model produces accurate values of the output conditions in a matter of seconds, along with key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in operation that will move the system toward higher efficiency; repeating the process iteratively leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an 'expert system' to permit optimization in real time, and it has been combined with 'neural network' programs to allow very easy scanning of a wide range of furnace operations. Rudimentary efforts were also successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the Cupola Handbook, Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  5. Modeling Computer Viruses MSc Thesis (Afstudeerscriptie)

    E-Print Network [OSTI]

    Amsterdam, University of

Modeling Computer Viruses: MSc Thesis (Afstudeerscriptie) written by Luite Menno Pieter van Zelst. About half a year ago, Alban Ponse, my thesis supervisor, suggested the topic of 'computer viruses'... the anti-virus industry and the creators of computer viruses. After all, the anti-virus industry stands to lose a lot

  6. Climate Modeling using High-Performance Computing

    SciTech Connect (OSTI)

    Mirin, A A

    2007-02-05

    The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

  7. Modelling energy efficiency for computation

    E-Print Network [OSTI]

    Reams, Charles

    2012-11-13

Table 3.1: Energy usage breakdown for computing equipment in the United States (energy figures in billions of kWh). Table 5.1: Average solution time... An understanding of the underlying behavioural properties will inevitably lead to improvements in the practicality of NTC, and practical NTC-purposed cores have now been constructed; for example, the Phoenix processor, which operates in the near-threshold region...

  8. Computer aided nuclear reactor modeling 

    E-Print Network [OSTI]

    Warraich, Khalid Sarwar

    1995-01-01

Nuclear reactor modeling is an important activity that lets us analyze existing as well as proposed systems for safety, correct operation, etc. The quality of an analysis is directly proportional to the quality of the model used. In this work we look...

  9. Usage of videomosaic for computed aided analysis of North Sea hard bottom underwater video for baseline study of offshore windmill park

    E-Print Network [OSTI]

    New Hampshire, University of

A windmill park on the open North Sea coast in the Hävsul area of Norway is one of the first in the world to be built on such an extreme high-energy coast. To determine the possible environmental impact of this project, a baseline study... Joint Hydrographic Center, University of New Hampshire, USA.

  10. Regional weather modeling on parallel computers.

    SciTech Connect (OSTI)

    Baillie, C.; Michalakes, J.; Skalin, R.; Mathematics and Computer Science; NOAA Forecast Systems Lab.; Norwegian Meteorological Inst.

    1997-01-01

    This special issue on 'regional weather models' complements the October 1995 special issue on 'climate and weather modeling', which focused on global models. In this introduction we review the similarities and differences between regional and global atmospheric models. Next, the structure of regional models is described and we consider how the basic algorithms applied in these models influence the parallelization strategy. Finally, we give a brief overview of the eight articles in this issue and discuss some remaining challenges in the area of adapting regional weather models to parallel computers.
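The parallelization strategy for regional models discussed in this issue is typically horizontal domain decomposition with halo (ghost-cell) exchange. A toy serial emulation of the idea, showing that strip-wise computation with one-cell halos reproduces the global stencil update (the grid, stencil, and function names are illustrative, not from any particular weather code):

```python
import numpy as np

def step_serial(u):
    """One 5-point smoothing step; boundary cells are simply copied."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

def step_decomposed(u, nparts=4):
    """Same step computed on column strips with one-cell halo columns,
    emulating the halo exchange a distributed-memory code performs."""
    n = u.shape[1]
    cuts = np.linspace(0, n, nparts + 1, dtype=int)
    out = np.empty_like(u)
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        glo, ghi = max(lo - 1, 0), min(hi + 1, n)   # owned strip plus halos
        local = step_serial(u[:, glo:ghi])          # compute on strip + halo
        out[:, lo:hi] = local[:, lo - glo:hi - glo] # keep owned columns only
    return out
```

Because each strip reads one extra column on each side before updating, the decomposed result matches the serial stencil exactly; in a real MPI code the halo columns would arrive via neighbor-to-neighbor messages each timestep.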

  11. Computationally Efficient Modeling of High-Efficiency Clean Combustion...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Computationally Efficient Modeling of High-Efficiency Clean Combustion Engines...

  12. Modeling-Computer Simulations At Fish Lake Valley Area (Deymonaz...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations at the Fish Lake Valley Area (Deymonaz, et al., 2008)...

  13. HIV virus spread and evolution studied through computer modeling

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

HIV virus spread and evolution studied through computer modeling. This approach distinguishes between susceptible and infected...

  14. Modeling-Computer Simulations At Nw Basin & Range Region (Pritchett...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations at the NW Basin & Range Region (Pritchett, 2004)...

  15. Modeling-Computer Simulations At Nw Basin & Range Region (Biasi...

    Open Energy Info (EERE)

Exploration Activity: Modeling-Computer Simulations at the NW Basin & Range Region (Biasi, et al., 2009)...

  16. Computer modeling reveals how surprisingly potent hepatitis C...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Computer modeling reveals how a surprisingly potent hepatitis C drug works. A study reveals how daclatasvir targets one of its proteins and causes the...

  17. A Computational Model for Adaptive Emotion Regulation

    E-Print Network [OSTI]

    Treur, Jan

A Computational Model for Adaptive Emotion Regulation. Tibor Bosse, Matthijs Pontier, and Jan Treur. Abstract: Emotion regulation describes how a subject can use certain strategies to affect emotion response levels. Usually, models for emotion regulation assume mechanisms based on feedback loops that indicate

  18. Computationally Efficient Model for Dopant Precipitation Kinetics

    E-Print Network [OSTI]

    Dunham, Scott

Computationally Efficient Model for Dopant Precipitation Kinetics. Iuval Clejan and Scott T. Dunham. Dopant deactivation is typically modeled using a steady-state solid solubility or clustering model; for processes such as dopant activation/deactivation, it is essential to consider the fact that precipitation involves a range

  19. Mechanistic Models in Computational Social Science

    E-Print Network [OSTI]

    Holme, Petter

    2015-01-01

Quantitative social science is not only about regression analysis or, in general, data inference. Computer simulations of social mechanisms have a history of over 60 years. They have been used for many different purposes: to test scenarios, to test the consistency of descriptive theories (proof-of-concept models), to explore emerging phenomena, for forecasting, etc. In this essay, we sketch these historical developments, the role of mechanistic models in the social sciences, and the influences from the natural and formal sciences. We argue that mechanistic computational models form a natural common ground for the social and natural sciences, and look forward to possible future information flow across the social-natural divide.

  20. Decision Model for Cloud Computing

    E-Print Network [OSTI]

    Kondo, Derrick

with different pricing models for cost-cutting, resource-hungry users. Second, prices can differ dynamically: a 'spot' instance's price varies dynamically, and a spot instance is provided when the user's bid is greater

  1. California Baseline Energy Demands to 2050 for Advanced Energy Pathways

    E-Print Network [OSTI]

    McCarthy, Ryan; Yang, Christopher; Ogden, Joan M.

    2008-01-01

    diesel). The baseline scenario is based upon the output of the CalCars model with the middle fuel price

  2. CDF computing and event data models

    SciTech Connect (OSTI)

    Snider, F.D.; /Fermilab

    2005-12-01

    The authors discuss the computing systems, usage patterns and event data models used to analyze Run II data from the CDF-II experiment at the Tevatron collider. A critical analysis of the current implementation and design reveals some of the stronger and weaker elements of the system, which serve as lessons for future experiments. They highlight a need to maintain simplicity for users in the face of an increasingly complex computing environment.

  3. GFDL's ESM2 Global Coupled ClimateCarbon Earth System Models. Part I: Physical Formulation and Baseline Simulation Characteristics

    E-Print Network [OSTI]

    Wittenberg, Andrew

GFDL's ESM2 Global Coupled Climate-Carbon Earth System Models. Part I: Physical Formulation and Baseline Simulation Characteristics. Coupled carbon-climate Earth System Models ESM2M and ESM2G are described. These models demonstrate ... (Delworth et al. 2006). Our approach has been to develop two Earth System Models

  4. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material to facilitate updating of the model in the future.

  5. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for the baseline economics; and a user manual and training material to facilitate updating of the model in the future.

  6. Optimization of neuron models using grid computing

    E-Print Network [OSTI]

    Vella, Mike

    2011-09-09

Optimization of neuron models using grid computing... channels. The set of information to be included in a model is large. Single-neuron, multi-compartment models. Why optimize? Single-neuron models provide a basis for understanding cell and local-circuit function. Maximal conductances, compartment capacitances...

  7. Modeling Computations in a Semantic Network

    E-Print Network [OSTI]

    Marko A. Rodriguez; Johan Bollen

    2007-05-31

Semantic network research has seen a resurgence from its early history in the cognitive sciences with the inception of the Semantic Web initiative. The Semantic Web effort has brought forth an array of technologies that support the encoding, storage, and querying of the semantic network data structure on the world stage. Currently, the popular conception of the Semantic Web is that of a data modeling medium where real and conceptual entities are related in semantically meaningful ways. However, new models have emerged that explicitly encode procedural information within the semantic network substrate. With these new technologies, the Semantic Web has evolved from a data modeling medium to a computational medium. This article provides a classification of existing computational modeling efforts and the requirements of supporting technologies that will aid in the further growth of this burgeoning domain.

  8. Low-Order Modelling of Blade-Induced Turbulence for RANS Actuator Disk Computations of Wind and Tidal Turbines

    E-Print Network [OSTI]

    Nishino, Takafumi

    2012-01-01

    Modelling of turbine blade-induced turbulence (BIT) is discussed within the framework of three-dimensional Reynolds-averaged Navier-Stokes (RANS) actuator disk computations. We first propose a generic (baseline) BIT model, which is applied only to the actuator disk surface, does not include any model coefficients (other than those used in the original RANS turbulence model) and is expected to be valid in the limiting case where BIT is fully isotropic and in energy equilibrium. The baseline model is then combined with correction functions applied to the region behind the disk to account for the effect of rotor tip vortices causing a mismatch of Reynolds shear stress between short- and long-time averaged flow fields. Results are compared with wake measurements of a two-bladed wind turbine model of Medici and Alfredsson [Wind Energy, Vol. 9, 2006, pp. 219-236] to demonstrate the capability of the new model.

  9. High performance computing and numerical modelling

    E-Print Network [OSTI]

    ,

    2014-01-01

    Numerical methods play an ever more important role in astrophysics. This is especially true in theoretical works, but of course, even in purely observational projects, data analysis without massive use of computational methods has become unthinkable. The key utility of computer simulations comes from their ability to solve complex systems of equations that are either intractable with analytic techniques or only amenable to highly approximative treatments. Simulations are best viewed as a powerful complement to analytic reasoning, and as the method of choice to model systems that feature enormous physical complexity such as star formation in evolving galaxies, the topic of this 43rd Saas Fee Advanced Course. The organizers asked me to lecture about high performance computing and numerical modelling in this winter school, and to specifically cover the basics of numerically treating gravity and hydrodynamics in the context of galaxy evolution. This is still a vast field, and I necessarily had to select a subset ...

  10. Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering (Presentation)

    SciTech Connect (OSTI)

    Kim, G.; Pesaran, A.; Smith, K.; Graf, P.; Jun, M.; Yang, C.; Li, G.; Li, S.; Hochman, A.; Tselepidakis, D.; White, J.

    2014-06-01

    This presentation discusses the significant enhancement of computational efficiency in nonlinear multiscale battery model for computer aided engineering in current research at NREL.

  11. Sensitivity Analysis Methodology for a Complex System Computational Model

    E-Print Network [OSTI]

    Sensitivity Analysis Methodology for a Complex System Computational Model (James J. Filliben). The use of computational models to serve as predictive surrogates for the system is increasing; sensitivity analysis of a computational model for a complex system is always an essential component in accepting/rejecting such a model...

  12. Computational Modeling of Self-organization of Dislocations and...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Modeling of Self-organization of Dislocations and Mesoscale Deformation of Metals Event Sponsor: Mathematics and Computing Science - LANS Seminar Start Date: Jun 19...

  13. Computer Modelling of 3D Geological Surface

    E-Print Network [OSTI]

    Kodge, B G

    2011-01-01

    Geological surveying presently uses methods and tools for the computer modeling of 3D structures of the geographical subsurface and geotechnical characterization, as well as geoinformation systems for the management and analysis of spatial data and their cartographic presentation. The objective of this paper is to present a 3D geological surface model of Latur district in Maharashtra state, India. This study proceeds through several processes, discussed in this paper, to generate and visualize an automated 3D geological surface model of the projected area.
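    The paper does not specify its interpolation method, but a common, simple way to build such a gridded surface from scattered elevation samples is inverse-distance weighting. The sketch below is a generic illustration under that assumption; the sample coordinates are hypothetical.

```python
import math

def idw_interpolate(points, grid, power=2.0):
    """Inverse-distance-weighted interpolation of scattered (x, y, z)
    elevation samples onto query (x, y) locations."""
    out = []
    for qx, qy in grid:
        num = den = 0.0
        for (x, y, z) in points:
            d2 = (qx - x) ** 2 + (qy - y) ** 2
            if d2 == 0.0:          # query coincides with a sample point
                num, den = z, 1.0
                break
            w = 1.0 / d2 ** (power / 2.0)   # weight = 1 / distance^power
            num += w * z
            den += w
        out.append(num / den)
    return out

# Hypothetical survey samples: (x, y, elevation in metres)
samples = [(0, 0, 100.0), (10, 0, 120.0), (0, 10, 110.0), (10, 10, 130.0)]
print(idw_interpolate(samples, [(5, 5), (0, 0)]))  # -> [115.0, 100.0]
```

    At the grid centre all four samples are equidistant, so the result is their mean; at a sample location the sample value is returned exactly.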

  14. Wild Fire Computer Model Helps Firefighters

    SciTech Connect (OSTI)

    Canfield, Jesse

    2012-09-04

    A high-tech computer model called HIGRAD/FIRETEC, the cornerstone of a collaborative effort between U.S. Forest Service Rocky Mountain Research Station and Los Alamos National Laboratory, provides insights that are essential for front-line fire fighters. The science team is looking into levels of bark beetle-induced conditions that lead to drastic changes in fire behavior and how variable or erratic the behavior is likely to be.

  15. Wild Fire Computer Model Helps Firefighters

    ScienceCinema (OSTI)

    Canfield, Jesse

    2014-06-02

    A high-tech computer model called HIGRAD/FIRETEC, the cornerstone of a collaborative effort between U.S. Forest Service Rocky Mountain Research Station and Los Alamos National Laboratory, provides insights that are essential for front-line fire fighters. The science team is looking into levels of bark beetle-induced conditions that lead to drastic changes in fire behavior and how variable or erratic the behavior is likely to be.

  16. COMPUTATIONAL MODELING OF CIRCULATING FLUIDIZED BED REACTORS

    SciTech Connect (OSTI)

    Ibrahim, Essam A

    2013-01-09

    Details of numerical simulations of two-phase gas-solid turbulent flow in the riser section of a Circulating Fluidized Bed Reactor (CFBR) using the Computational Fluid Dynamics (CFD) technique are reported. Two CFBR riser configurations are considered and modeled. Each of these two riser models consists of an inlet, an exit, connecting elbows, and a main pipe. Both riser configurations are cylindrical and have the same diameter but differ in their inlet lengths and main pipe height to enable investigation of riser geometrical scaling effects. In addition, two types of solid particles are used in the solid phase of the two-phase gas-solid riser flow simulations to study the influence of solid loading ratio on flow patterns. The gaseous phase in the two-phase flow is represented by standard atmospheric air. The CFD-based FLUENT software is employed to obtain steady state and transient solutions for flow modulations in the riser. The physical dimensions, types and numbers of computational meshes, and solution methodology utilized in the present work are stated. Flow parameters, such as static and dynamic pressure, species velocity, and volume fractions, are monitored and analyzed. The differences in the computational results between the two models, under steady and transient conditions, are compared, contrasted, and discussed.

  17. Bond Computing Systems: a Biologically Inspired and High-level Dynamics Model for Pervasive Computing

    E-Print Network [OSTI]

    Dang, Zhe

    Targeting the modeling of the high-level dynamics of pervasive computing systems, we introduce Bond Computing Systems and study their computation power and verification problems. Among other results, we show...

  18. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this Report provides a summary overview of DOE’s projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS) which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  19. 7. Business Models LearningsfromfoundingaComputerVisionStartup

    E-Print Network [OSTI]

    Solem, Jan Erik

    Learnings from founding a Computer Vision Startup: business models (not only technology). Examples covered include the auction business model, bricks-and-clicks business model, collective business models, component business model, cutting out...

  20. Modeling-Computer Simulations At San Juan Volcanic Field Area...

    Open Energy Info (EERE)

    Location: San Juan Volcanic Field Area. Exploration Technique: Modeling-Computer Simulations. Usefulness: useful. DOE funding: Unknown. Notes: In this study we combine thermal maturation models, based on the...

  1. Regional Energy Baseline 

    E-Print Network [OSTI]

    Kim, H.; Baltazar, J.C.; Haberl, J.

    2011-01-01

    Regional Energy Baseline (1960-2009). [Figure: Total Energy Use per Capita (MMBtu), 1960-2009, for the US, the SEEC 12-States, and TX.] Hyojin Kim; Juan-Carlos Baltazar, Ph.D.; Jeff S. Haberl, Ph.D., P.E. September 2011. Energy Systems Laboratory, Texas Engineering Experiment Station, Texas A&M University System.

  2. Computational models of intergroup competition and warfare.

    SciTech Connect (OSTI)

    Letendre, Kenneth (University of New Mexico); Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease importantly predicts the frequency of civil conflict and tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  3. Robust Resource Allocations in Parallel Computing Systems: Model and Heuristics

    E-Print Network [OSTI]

    Maciejewski, Anthony A. "Tony"

    Robust Resource Allocations in Parallel Computing Systems: Model and Heuristics (Vladimir Shestak et al.). ... in parallel computer systems (including heterogeneous clusters) should be allocated to the computational ... This research was supported by the Colorado State University Center for Robustness in Computer Systems (funded by the Colorado...

  4. Bond Computing Systems: a Biologically Inspired and High-level Dynamics Model for Pervasive Computing

    E-Print Network [OSTI]

    Dang, Zhe

    Bond Computing Systems: a Biologically Inspired and High-level Dynamics Model for Pervasive Computing. ... their computation power and verification problems. Among other results, we show that the computing power ... techniques for pervasive computing systems. At a high level, there are at least two views in modeling...

  5. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. 
Because the phase-field model dimensions are limited relative to the size of most specimens used in experiments, special experimental methods were devised to create similar boundary conditions in the iron films. Preliminary MFM studies conducted on single and polycrystalline iron films with small sub-areas created with focused ion beam have correlated quite well qualitatively with phase-field simulations. However, phase-field model dimensions are still small relative to experiments thus far. We are in the process of increasing the size of the models and decreasing specimen size so both have identical dimensions. Ongoing research is focused on validation of the phase-field model. Validation is being accomplished through comparison with experimentally obtained MFM images (in progress), and planned measurements of major hysteresis loops and first order reversal curves. Extrapolation of simulation sizes to represent a more stochastic bulk-like system will require sampling of various simulations (i.e., with single non-magnetic defect, single magnetic defect, single grain boundary, single dislocation, etc.) with distributions of input parameters. These outputs can then be compared to laboratory magnetic measurements and ultimately to simulate magnetic Barkhausen noise signals.
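    The phase-field models above are based on numerically solving the Landau-Lifshitz-Gilbert (LLG) equation. As a minimal, generic sketch of that equation (a single macrospin under an applied field, explicit-Euler stepping in dimensionless units), not the PNNL model itself, one might write:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def llg_step(m, h, gamma, alpha, dt):
    """One explicit-Euler step of the Landau-Lifshitz form of the LLG
    equation: precession (m x H) plus Gilbert damping (m x (m x H)),
    renormalized to keep |m| = 1."""
    mxh = cross(m, h)
    mxmxh = cross(m, mxh)
    pref = -gamma / (1.0 + alpha * alpha)
    dm = tuple(pref * (mxh[i] + alpha * mxmxh[i]) for i in range(3))
    return normalize(tuple(m[i] + dt * dm[i] for i in range(3)))

# Single macrospin relaxing toward a field along +z (illustrative values
# of gamma, alpha, dt; units are dimensionless).
m = normalize((1.0, 0.0, 0.1))
h = (0.0, 0.0, 1.0)
for _ in range(20000):
    m = llg_step(m, h, gamma=1.0, alpha=0.1, dt=0.01)
print(m)  # damping aligns the moment with the field, so m_z -> +1
```

    A real phase-field code solves this equation on a spatial grid with exchange, anisotropy, and magnetostatic contributions to the effective field; this sketch keeps only the applied field to show the integration step itself.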

  6. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems

    E-Print Network [OSTI]

    Olshausen, Bruno

    Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems (Peter Dayan). The emergence of this book represents more than ...; scientists were brought up on ``Kandel and Schwartz.'' Now, at last, the field of computational neuroscience...

  7. What can computational models tell us about face processing?

    E-Print Network [OSTI]

    Cottrell, Garrison W.

    What can computational models tell us about face processing? Garrison W. Cottrell, Gary's Unbelievable Research Unit (GURU), Computer Science; Lingyun Zhang.

  8. Multi-epoch very long baseline interferometric observations of the nuclear starburst region of NGC 253: Improved modeling of the supernova and star formation rates

    SciTech Connect (OSTI)

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Lenc, E.

    2014-01-01

    The results of multi-epoch observations of the southern starburst galaxy, NGC 253, with the Australian Long Baseline Array at 2.3 GHz are presented. As with previous radio interferometric observations of this galaxy, no new sources were discovered. By combining the results of this survey with Very Large Array observations at higher frequencies from the literature, spectra were derived and a free-free absorption model was fitted for 20 known sources in NGC 253. The results were found to be consistent with previous studies. The supernova remnant 5.48-43.3 was imaged with the highest sensitivity and resolution to date, revealing a two-lobed morphology. Comparisons with previous observations of similar resolution give an upper limit of 10^4 km s^-1 for the expansion speed of this remnant. We derive a supernova rate of <0.2 yr^-1 for the inner 300 pc using a model that improves on previous methods by incorporating an improved radio supernova peak luminosity distribution and by making use of multi-wavelength radio data spanning 21 yr. A star formation rate of SFR(M >= 5 M_sun) < 4.9 M_sun yr^-1 was also estimated using the standard relation between supernova and star formation rates. Our improved estimates of supernova and star formation rates are consistent with studies at other wavelengths. The results of our study point to the possible existence of a small population of undetected supernova remnants, suggesting a low rate of radio supernova production in NGC 253.
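    The "standard relation between supernova and star formation rates" invoked in the abstract is commonly taken to be the Condon (1992) calibration, nu_SN ~= 0.041 x SFR(M >= 5 M_sun). Assuming that calibration (the paper may use a variant), the quoted numbers follow directly:

```python
def sfr_from_sn_rate(nu_sn, calib=0.041):
    """Star formation rate for stars above 5 solar masses (M_sun/yr)
    implied by a supernova rate nu_sn (yr^-1). The calibration constant
    0.041 yr^-1 per (M_sun/yr) is the commonly cited Condon (1992)
    value, assumed here rather than taken from the paper."""
    return nu_sn / calib

# The abstract's upper limit: nu_SN < 0.2 yr^-1 for the inner 300 pc.
print(round(sfr_from_sn_rate(0.2), 1))  # -> 4.9, matching the quoted SFR limit
```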

  9. Accelerating Bayesian inference in computationally expensive computer models using local and global approximations

    E-Print Network [OSTI]

    Conrad, Patrick Raymond

    2014-01-01

    Computational models of complex phenomena are an important resource for scientists and engineers. However, many state-of-the-art simulations of physical systems are computationally expensive to evaluate and are black ...

  10. Final technical report for DOE Computational Nanoscience Project: Integrated Multiscale Modeling of Molecular Computing Devices

    SciTech Connect (OSTI)

    Cummings, P. T.

    2010-02-08

    This document reports the outcomes of the Computational Nanoscience Project, "Integrated Multiscale Modeling of Molecular Computing Devices". It includes a list of participants and publications arising from the research supported.

  11. Computational systems biology and in silico modeling of the

    E-Print Network [OSTI]

    Borenstein, Elhanan

    Computational systems biology and in silico modeling of the human microbiome (Elhanan Borenstein; Professor at the Santa Fe Institute; research interests include computational and evolutionary systems biology). The human microbiome is a complex biological system with numerous interacting components across multiple organizational levels.

  12. Integration of engineering models in computer-aided preliminary design

    E-Print Network [OSTI]

    Lajoie, Ronnie M.

    The problems of the integration of engineering models in computer-aided preliminary design are reviewed. This paper details the research, development, and testing of modifications to Paper Airplane, a LISP-based computer ...

  13. Predictive Capability Maturity Model for computational modeling and simulation.

    SciTech Connect (OSTI)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
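    The PCMM structure described above, six contributing elements each assessed at one of four increasing maturity levels, maps naturally onto a small data structure. The sketch below is an illustrative encoding, not a tool from the report; the interpretation that overall maturity is bounded by the weakest element is an assumption.

```python
# The six PCMM elements named in the report; maturity levels run 0-3.
PCMM_ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

def assess(scores):
    """Validate a PCMM assessment (all six elements, levels 0-3) and
    return the least mature element, on the assumption that overall
    maturity is limited by the weakest link."""
    if set(scores) != set(PCMM_ELEMENTS):
        raise ValueError("assessment must cover all six PCMM elements")
    if any(not (0 <= v <= 3) for v in scores.values()):
        raise ValueError("maturity levels run from 0 to 3")
    weakest = min(scores, key=scores.get)
    return weakest, scores[weakest]

# Hypothetical assessment of an M&S effort.
example = {e: 2 for e in PCMM_ELEMENTS}
example["model validation"] = 1
print(assess(example))  # -> ('model validation', 1)
```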

  14. Computer Virus Propagation Models Giuseppe Serazzi and Stefano Zanero

    E-Print Network [OSTI]

    Zanero, Stefano

    Computer Virus Propagation Models (Giuseppe Serazzi and Stefano Zanero, Dipartimento di Elettronica e ...). Abstract: The availability of reliable models of computer virus propagation would prove useful. In this paper, we review the most popular models of virus propagation, analyzing the underlying assumptions...
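    Compartmental models of the SIR family (susceptible, infected, recovered) are among the classic propagation models such surveys review. A minimal forward-Euler sketch with illustrative parameters, not a model taken from the paper:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the classic SIR compartmental model:
    susceptibles are infected at rate beta*s*i and infected hosts
    recover at rate gamma*i. Fractions sum to 1 by construction."""
    new_inf = beta * s * i * dt
    new_rec = gamma * i * dt
    return s - new_inf, i + new_inf - new_rec, r + new_rec

# Illustrative outbreak: 0.1% of hosts initially infected, R0 = beta/gamma = 5.
s, i, r = 0.999, 0.001, 0.0
for _ in range(10000):
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1, dt=0.1)
print(round(s + i + r, 6))  # the population fraction is conserved: 1.0
```

    With R0 well above 1, nearly the whole population passes through the infected compartment, which is why such models are used to study worst-case virus spread.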

  15. Department of Computing CSP||B modelling for railway verification

    E-Print Network [OSTI]

    Doran, Simon J.

    University of Surrey, Department of Computing, Computing Sciences Report CS-12-03: CSP||B modelling for railway verification (Schneider, Helen Treharne; March 30th 2012). The double junction ... work in verifying railway systems through CSP||B modelling and analysis. In particular we consider...

  16. Computational modeling of composite material fires.

    SciTech Connect (OSTI)

    Brown, Alexander L.; Erickson, Kenneth L.; Hubbard, Joshua Allen; Dodd, Amanda B.

    2010-10-01

    Composite materials behave differently from conventional fuel sources and have the potential to smolder and burn for extended time periods. As the amount of composite materials on modern aircraft continues to increase, understanding the response of composites in fire environments becomes increasingly important. An effort is ongoing to enhance the capability to simulate composite material response in fires including the decomposition of the composite and the interaction with a fire. To adequately model composite material in a fire, two physical model development tasks are necessary; first, the decomposition model for the composite material and second, the interaction with a fire. A porous media approach for the decomposition model including a time dependent formulation with the effects of heat, mass, species, and momentum transfer of the porous solid and gas phase is being implemented in an engineering code, ARIA. ARIA is a Sandia National Laboratories multiphysics code including a range of capabilities such as incompressible Navier-Stokes equations, energy transport equations, species transport equations, non-Newtonian fluid rheology, linear elastic solid mechanics, and electro-statics. To simulate the fire, FUEGO, also a Sandia National Laboratories code, is coupled to ARIA. FUEGO represents the turbulent, buoyantly driven incompressible flow, heat transfer, mass transfer, and combustion. FUEGO and ARIA are uniquely able to solve this problem because they were designed using a common architecture (SIERRA) that enhances multiphysics coupling and both codes are capable of massively parallel calculations, enhancing performance. The decomposition reaction model is developed from small scale experimental data including thermogravimetric analysis (TGA) and Differential Scanning Calorimetry (DSC) in both nitrogen and air for a range of heating rates and from available data in the literature. 
The response of the composite material subject to a radiant heat flux boundary condition is examined to study the propagation of decomposition fronts of the epoxy and carbon fiber and their dependence on the ambient conditions such as oxygen concentration, surface flow velocity, and radiant heat flux. In addition to the computational effort, small-scale experimental efforts to attain adequate data for validating model predictions are ongoing. The goal of this paper is to demonstrate the progress of the capability for a typical composite material and emphasize the path forward.

  17. Disruptive technology business models in cloud computing

    E-Print Network [OSTI]

    Krikos, Alexis Christopher

    2010-01-01

    Cloud computing, a term that has been in existence for more than a decade, has come to fruition due to technological capabilities and marketplace demands. Cloud computing can be defined as a scalable and flexible ...

  18. Modeling-Computer Simulations At Nevada Test And Training Range...

    Open Energy Info (EERE)

    Modeling-Computer Simulations At Nevada Test And Training Range Area (Sabin, Et Al., 2004). Exploration Activity:...

  19. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area (Brown & DuTeaux, 1997). Exploration Activity:...

  20. Modeling-Computer Simulations At Walker-Lane Transitional Zone...

    Open Energy Info (EERE)

    (Pritchett, 2004). Exploration Activity Details: Location: Walker-Lane Transition Zone Geothermal Region. Exploration Technique: Modeling-Computer Simulations. Activity Date, Usefulness...

  1. New partnership uses advanced computer science modeling to address...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    partnership uses advanced computer science modeling to address climate change | National Nuclear Security Administration

  2. COMPUTER MODELING OF NUCLIDE ADSORPTION ON GEOLOGIC MATERIALS

    E-Print Network [OSTI]

    Silva, R.J.

    2010-01-01

    COMPUTER MODELING OF NUCLIDE ADSORPTION ON GEOLOGIC MATERIALS ... defined as the amount of nuclide adsorbed per gram of clay divided by the amount of nuclide per milliliter of solution,

  3. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

    Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Pritchett, 2004). Exploration Activity:...

  4. Modeling-Computer Simulations At Central Nevada Seismic Zone...

    Open Energy Info (EERE)

    Modeling-Computer Simulations At Central Nevada Seismic Zone Region (Biasi, Et Al., 2009). Exploration Activity:...

  5. MaRIE theory, modeling and computation roadmap executive summary...

    Office of Scientific and Technical Information (OSTI)

    discovery, with theory and high performance computing, itself co-designed by constrained optimization of hardware and software, and experiments. MaRIE's theory, modeling, and...

  6. Modeling-Computer Simulations At Stillwater Area (Wisian & Blackwell...

    Open Energy Info (EERE)

    Exploration Activity Details: Location: Stillwater Area. Exploration Technique: Modeling-Computer Simulations. Activity Date, Usefulness: not indicated. DOE funding: Unknown. References...

  7. Modeling-Computer Simulations At Desert Peak Area (Wisian & Blackwell...

    Open Energy Info (EERE)

    Exploration Activity Details: Location: Desert Peak Area. Exploration Technique: Modeling-Computer Simulations. Activity Date, Usefulness: not indicated. DOE funding: Unknown. References...

  8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report, covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.
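    To cover "a wide range of capacities", cost models of this kind typically scale a baseline capital cost with a capacity exponent. The sketch below uses the generic "six-tenths rule"; the exponent and the numbers are illustrative assumptions, not the actual Bechtel/Amoco correlations.

```python
def scaled_capital_cost(base_cost, base_capacity, new_capacity, exponent=0.6):
    """Capacity-exponent scaling: cost2 = cost1 * (cap2/cap1)^n.
    The default n = 0.6 is the textbook six-tenths rule, assumed here
    for illustration only."""
    return base_cost * (new_capacity / base_capacity) ** exponent

# Hypothetical numbers: doubling plant capacity from a 1000-unit baseline cost.
print(round(scaled_capital_cost(1000.0, 1.0, 2.0), 1))  # -> 1515.7
```

    The sub-linear exponent captures the economy of scale that makes larger plants cheaper per unit of capacity, which is exactly why such models are exercised over a range of capacities.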

  9. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  10. Parameter Discovery for Stochastic Computational Models in Systems Biology Using Bayesian Model

    E-Print Network [OSTI]

    Parameterized probabilistic complex computational (P2C2) models are increasingly used in computational systems biology to study biochemical and physiological systems. A key challenge is to build mechanistic P2C2 models

  11. "Creating computational models of biological systems to better combat

    E-Print Network [OSTI]

    Zhigilei, Leonid V.

    "Creating computational models of biological systems to better combat dangerous pathogens and human of Biomedical Engineering University of Virginia Charlottesville, VA 434.924.8195 Computational Systems Biology system in biofuel and nutraceutical production. With the aid of computational techniques, we can predict

  12. A framework for modelling trojans and computer virus infection

    E-Print Network [OSTI]

    Cairns, Paul

    A framework for modelling trojans and computer virus infection (Harold Thimbleby, Stuart Anderson). ... world, including the possibility of Trojan Horse programs and computer viruses, as simply a finite realisation of a Turing Machine. We consider the actions of Trojan Horses and viruses in real computer systems

  13. Bytecode unification of geospatial computable models Bytecode unification of geospatial

    E-Print Network [OSTI]

    Köbben, Barend

    heterogeneous to fix and reuse. Field-based and object-based geospatial models often share common GIS data and object-based data models, and other challenges regarding synergy of geospatial systems that need to use both types of data models. Keywords: field-based model, object-based model, computability, managed

  14. Towards Real Earth Models --Computational Geophysics on Unstructured Tetrahedral Meshes?

    E-Print Network [OSTI]

    Farquharson, Colin G.

    Towards Real Earth Models -- Computational Geophysics on Unstructured Tetrahedral Meshes? (Colin G. Farquharson). Outline: geological models; advantages of unstructured tetrahedral meshes; EM geophysics on unstructured tetrahedral meshes; disadvantages, difficulties, challenges; conclusions.

  15. Applying High Performance Computing to Analyzing by Probabilistic Model Checking

    E-Print Network [OSTI]

    Schneider, Carsten

    Applying High Performance Computing to Analyzing by Probabilistic Model Checking Mobile Cellular ... We report in this paper on the use of high performance computing in order to analyze, with the probabilistic model checker PRISM ...

  16. Overview of ASC Capability Computing System Governance Model

    SciTech Connect (OSTI)

    Doebling, Scott W. [Los Alamos National Laboratory

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  17. An Interactive Computer Model of Two-Country Trade

    E-Print Network [OSTI]

    Hamlen, Kevin W.

    An Interactive Computer Model of Two-Country Trade. Bill Hamlen and Kevin Hamlen. Abstract: We introduce an interactive computer model of two-country trade that allows students to investigate ... country when free trade is made available. One of the most important learning lessons for the students ...

  18. Los Alamos CCS (Center for Computer Security) formal computer security model

    SciTech Connect (OSTI)

    Dreicer, J.S.; Hunteman, W.J. (Los Alamos National Lab., NM (USA))

    1989-01-01

    This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The initial motivation for this effort was the need to provide a method by which DOE computer security policy implementation could be tested and verified. The actual analytical model was a result of the integration of current research in computer security and previous modeling and research experiences. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present models. Formal mathematical models for computer security have been designed and developed in conjunction with attempts to build secure computer systems since the early 70's. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The mathematical basis appears to be justified and is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell-LaPadula abstract sets of objects and subjects. 5 refs.

  19. Error models in quantum computation: an application of model selection

    E-Print Network [OSTI]

    Lucia Schwarz; Steven van Enk

    2013-09-04

    Threshold theorems for fault-tolerant quantum computing assume that errors are of certain types. But how would one detect whether errors of the "wrong" type occur in one's experiment, especially if one does not even know what type of error to look for? The problem is that for many qubits a full state description is impossible to analyze, and a full process description is even more impossible to analyze. As a result, one simply cannot detect all types of errors. Here we show through a quantum state estimation example (on up to 25 qubits) how to attack this problem using model selection. We use, in particular, the Akaike Information Criterion. The example indicates that the number of measurements that one has to perform before noticing errors of the wrong type scales polynomially both with the number of qubits and with the error size.
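
    The model-selection step described above can be illustrated with a small, self-contained sketch. This is not from the paper: the log-likelihood values, parameter counts, and model names below are invented for illustration. The Akaike Information Criterion trades goodness of fit against the number of free parameters in each candidate error model:

```python
def aic(log_likelihood, num_params):
    """Akaike Information Criterion: 2k - 2 ln L (lower is better)."""
    return 2 * num_params - 2 * log_likelihood

# Hypothetical fit results for two candidate error models: a simple
# depolarizing model (1 parameter) and a richer correlated-noise
# model (5 parameters). Log-likelihoods are invented.
scores = {
    "depolarizing": aic(-120.4, 1),
    "correlated":   aic(-118.9, 5),
}
best = min(scores, key=scores.get)
# The richer model's small likelihood gain does not justify its extra
# parameters, so AIC selects the simpler error model in this example.
```

    The same comparison scales to the paper's setting by replacing the invented log-likelihoods with those of the fitted quantum noise models.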

  20. Computational Modeling of the Vacuum Assisted Resin Transfer Molding (VARTM) Process

    E-Print Network [OSTI]

    Grujicic, Mica

    Computational Modeling of the Vacuum Assisted Resin Transfer Molding (VARTM) Process. April 2004. MS Thesis Advisor: Dr. Grujicic. What is VARTM? Vacuum Assisted Resin Transfer Molding

  1. Computer Modeling VRF Heat Pumps in Commercial Buildings using EnergyPlus

    SciTech Connect (OSTI)

    Raustad, Richard

    2013-06-01

    Variable Refrigerant Flow (VRF) heat pumps are increasingly used in commercial buildings in the United States. Monitored energy use of field installations has shown, in some cases, savings exceeding 30% compared to conventional heating, ventilating, and air-conditioning (HVAC) systems. A simulation study was conducted to identify the installation or operational characteristics that lead to energy savings for VRF systems. The study used the Department of Energy EnergyPlus building simulation software and four reference building models. Computer simulations were performed in eight U.S. climate zones. The baseline reference HVAC system incorporated packaged single-zone direct-expansion cooling with gas heating (PSZ-AC) or variable-air-volume systems (VAV with reheat). An alternate baseline HVAC system using a heat pump (PSZ-HP) was included for some buildings to directly compare gas and electric heating results. These baseline systems were compared to a VRF heat pump model to identify differences in energy use. VRF systems combine multiple indoor units with one or more outdoor unit(s). These systems move refrigerant between the outdoor and indoor units, which eliminates the need for duct work in most cases. Since many applications install duct work in unconditioned spaces, this leads to installation differences between VRF systems and conventional HVAC systems. To characterize installation differences, a duct heat gain model was included to identify the energy impacts of installing ducts in unconditioned spaces. The configuration of variable refrigerant flow heat pumps will ultimately eliminate or significantly reduce energy use due to duct heat transfer. Fan energy is also studied to identify savings associated with non-ducted VRF terminal units. VRF systems incorporate a variable-speed compressor which may lead to operational differences compared to single-speed compression systems. To characterize operational differences, the computer model performance curves used to simulate cooling operation are also evaluated. The information in this paper is intended to provide a relative difference in system energy use and compare various installation practices that can impact performance. Comparative results of VRF versus conventional HVAC systems include energy use differences due to duct location, differences in fan energy when ducts are eliminated, and differences associated with electric versus fossil-fuel heating systems.

  2. Computing the Electricity Market Equilibrium: Uses of market equilibrium models

    E-Print Network [OSTI]

    Baldick, Ross

    Computing the Electricity Market Equilibrium: Uses of market equilibrium models. Ross Baldick. Abstract--In this paper we consider the formulation and uses of electricity market equilibrium models. Keywords--Electricity market, Equilibrium models. I. INTRODUCTION: Electricity market equilibrium modelling

  3. Computer Science, Texas A&M University: Modeling Heterogeneous User Churn and Local Resilience of Unstructured P2P Networks

    E-Print Network [OSTI]

    Loguinov, Dmitri

    Computer Science, Texas A&M University. Modeling Heterogeneous User Churn and Local Resilience of Unstructured P2P Networks

  4. Integrating Numerical Computation into the Modeling Instruction Curriculum

    E-Print Network [OSTI]

    Caballero, Marcos D; Aiken, John M; Douglas, Scott S; Scanlon, Erin M; Thoms, Brian; Schatz, Michael F

    2012-01-01

    We describe a way to introduce physics high school students with no background in programming to computational problem-solving experiences. Our approach builds on the great strides made by the Modeling Instruction reform curriculum. This approach emphasizes the practices of "Developing and using models" and "Computational thinking" highlighted by the NRC K-12 science standards framework. We taught 9th-grade students in a Modeling-Instruction-based physics course to construct computational models using the VPython programming environment. Numerical computation within the Modeling Instruction curriculum provides coherence among the curriculum's different force and motion models, links the various representations which the curriculum employs, and extends the curriculum to include real-world problems that are inaccessible to a purely analytic approach.

  5. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  6. Ambient temperature modelling with soft computing techniques

    SciTech Connect (OSTI)

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni; De Felice, Matteo

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
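
    The hybrid scheme described above (back-propagation seeding a few individuals of the genetic algorithm's population) can be sketched as follows. This is a hypothetical toy version: it optimizes a small linear regression rather than the paper's neural network, and all data and hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression stand-in for the temperature-estimation task
# (the paper trains an ANN; a linear model keeps the sketch short).
X = rng.normal(size=(50, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=50)

def loss(w):
    """Mean squared error of candidate weights w."""
    return float(np.mean((X @ w - y) ** 2))

def backprop(w, steps=200, lr=0.05):
    """Gradient-descent analogue of back-propagation training."""
    for _ in range(steps):
        w = w - lr * (2.0 / len(y)) * X.T @ (X @ w - y)
    return w

# Hybrid initialization: a few GA individuals are seeded by
# back-propagation, the rest are random (the scheme in the abstract).
pop = [backprop(rng.normal(size=3)) for _ in range(3)]
pop += [rng.normal(size=3) for _ in range(7)]

for _ in range(30):                        # simple elitist GA loop
    pop.sort(key=loss)                     # rank by fitness
    parents = pop[:5]                      # keep the best half
    children = [p + 0.05 * rng.normal(size=3) for p in parents]
    pop = parents + children               # next generation

best = min(pop, key=loss)
```

    Because the elitist loop never discards the gradient-seeded parents, the final solution is at least as good as the back-propagation result alone, which is the point of the hybrid initialization.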

  7. Computational Modeling and Optimization of Proton Exchange Membrane Fuel Cells

    E-Print Network [OSTI]

    Victoria, University of

    Computational Modeling and Optimization of Proton Exchange Membrane Fuel Cells, by Marc Secanell Gallart, Bachelor in Engineering ... In this thesis, a computational framework for fuel cell analysis and optimization is presented

  8. Clique-detection Models in Computational Biochemistry and Genomics

    E-Print Network [OSTI]

    Butenko, Sergiy

    Clique-detection Models in Computational Biochemistry and Genomics. S. Butenko and W. E. Wilhelm. Abstract: Many important problems arising in computational biochemistry and genomics have been formulated ... and genomic aspects of the problems as well as to the graph-theoretic aspects of the solution approaches. Each ...

  9. Inverse Modelling in Geology by Interactive Evolutionary Computation

    E-Print Network [OSTI]

    Boschetti, Fabio

    Inverse Modelling in Geology by Interactive Evolutionary Computation. Chris Wijns, Fabio Boschetti. ... of geological processes, in the absence of established numerical criteria to act as inversion targets, requires ... evolutionary computation provides for the inclusion of qualitative geological expertise within a rigorous ...

  10. Computational load in model physics of the parallel NCAR community climate model

    SciTech Connect (OSTI)

    Michalakes, J.G.; Nanjundiah, R.S.

    1994-11-01

    Maintaining a balance of computational load over processors is a crucial issue in parallel computing. For efficient parallel implementation, complex codes such as climate models need to be analyzed for load imbalances. In the present study we focus on the load imbalances in the physics portion of the community climate model's (CCM2) distributed-memory parallel implementation on the Intel Touchstone DELTA computer. We note that the major source of load imbalance is the diurnal variation in the computation of solar radiation. Convective weather patterns also cause some load imbalance. Land-ocean contrast is seen to have little effect on computational load in the present version of the model.
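
    The diurnal imbalance described above can be illustrated with a toy decomposition (invented grid size and costs, not the actual CCM2 configuration): sunlit columns carry extra solar-radiation work, so contiguous longitude blocks load processors unevenly, while a cyclic assignment balances them.

```python
import numpy as np

# 128 grid columns, half of them sunlit; sunlit columns cost twice
# as much because they also compute solar radiation (invented costs).
nlon, nprocs = 128, 8
lon = np.arange(nlon)
sunlit = lon < nlon // 2
cost = np.where(sunlit, 2.0, 1.0)

# Contiguous longitude-block decomposition: dayside processors
# receive all of the expensive columns.
loads = np.array([b.sum() for b in np.array_split(cost, nprocs)])
imbalance = loads.max() / loads.mean()        # 1.0 would be perfect

# Cyclic (round-robin) assignment interleaves day and night
# columns, so every processor sees the same mix of work.
cyclic_loads = np.array([cost[p::nprocs].sum() for p in range(nprocs)])
cyclic_imbalance = cyclic_loads.max() / cyclic_loads.mean()
```

    With these numbers the block decomposition leaves the busiest processor about 33% above the mean load, while the cyclic assignment is perfectly balanced; the same reasoning applies to any column-physics cost that varies smoothly with longitude.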

  11. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    SciTech Connect (OSTI)

    Zhou, Shujia; Duffy, Daniel; Clune, Thomas; Suarez, Max; Williams, Samuel; Halem, Milton

    2009-01-10

    The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive to fulfill this requirement. However, the Cell's characteristics, 256KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25 percent of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.

  12. COMPUTATIONAL FLUID DYNAMICS MODELING OF SOLID OXIDE FUEL CELLS

    E-Print Network [OSTI]

    COMPUTATIONAL FLUID DYNAMICS MODELING OF SOLID OXIDE FUEL CELLS. Ugur Pasaogullari and Chao ... -dimensional model has been developed to simulate solid oxide fuel cells (SOFC). The model fully couples ... current density operation. INTRODUCTION: Solid oxide fuel cells (SOFC) are among possible candidates

  13. Quantum Computers: Noise Propagation and Adversarial Noise Models

    E-Print Network [OSTI]

    Kalai, Gil

    Quantum Computers: Noise Propagation and Adversarial Noise Models. Gil Kalai, Hebrew University of Jerusalem and Yale University. April 21, 2009. Abstract: In this paper we consider adversarial noise models ... Detrimental noise is modeled after familiar properties of noise propagation. However, it can have various

  14. Modeling-Computer Simulations At Valles Caldera - Sulphur Springs...

    Open Energy Info (EERE)

    DOE-funding: Unknown. Notes: A computer program capable of two-dimensional modeling of gravity data was used in interpreting gravity observations along profiles A--A' and B--B'...

  15. Modeling-Computer Simulations At Valles Caldera - Redondo Geothermal...

    Open Energy Info (EERE)

    DOE-funding: Unknown. Notes: A computer program capable of two-dimensional modeling of gravity data was used in interpreting gravity observations along profiles A--A' and B--B'...

  16. Computational Model of Forward and Opposed Smoldering Combustion in Microgravity 

    E-Print Network [OSTI]

    Rein, Guillermo; Fernandez-Pello, Carlos; Urban, David

    2006-08-06

    A novel computational model of smoldering combustion capable of predicting both forward and opposed propagation is developed. This is accomplished by considering the one-dimensional, transient, governing equations for ...

  17. Language acquisition and implication for language change: A computational model

    E-Print Network [OSTI]

    Clark, Robert A J

    1997-01-01

    Computer modeling techniques, when applied to language acquisition problems, give an often unrealized insight into the diachronic change that occurs in language over successive generations. This paper shows that using ...

  18. Computer simulation and topological modeling of radiation effects in zircon

    E-Print Network [OSTI]

    Zhang, Yi, 1979-

    2006-01-01

    The purpose of this study is to understand on atomic level the structural response of zircon (ZrSiO4) to irradiation using molecular dynamics (MD) computer simulations, and to develop topological models that can describe ...

  19. 15.094 Systems Optimization: Models and Computation, Spring 2002

    E-Print Network [OSTI]

    Freund, Robert Michael

    A computational and application-oriented introduction to the modeling of large-scale systems in a wide variety of decision-making domains and the optimization of such systems using state-of-the-art optimization software. ...

  20. Computational Modeling for the American Chemical Society | GE...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  1. Computational models of early visual processing layers

    E-Print Network [OSTI]

    Shan, Honghao

    2010-01-01

    prevailing view of retinal processing. However, as discussed ... (simplified) model of retinal processing ... A Retinal Coding ... retinal coding, the pre-cortical stage of visual processing, ...

  2. Sandia Energy - Computational Modeling & Simulation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  3. Sterile Neutrino Fits to Short-Baseline Neutrino Oscillation Measurements

    E-Print Network [OSTI]

    Conrad, Janet

    2013-01-01

    This paper reviews short-baseline oscillation experiments as interpreted within the context of one, two, and three sterile neutrino models associated with additional neutrino mass states in the ~1 eV range. Appearance and ...

  4. Discrete transfinite computation models School of Mathematics,

    E-Print Network [OSTI]

    Welch, Philip

    such as Davies [Davies (2001)], and the models of Beggs and Tucker [Beggs and Tucker (2007)] that attempt ... functions or oracles, such as is done in [Beggs et al. (2008)]. We also shall not particularly consider ...

  5. CARD No. 23 Models and Computer Codes

    E-Print Network [OSTI]

    that are used to demonstrate that the WIPP will comply with the radioactive waste disposal regulations at 40 CFR, and radionuclide transport in the repository and overlying rock formations. Numerical models are then created

  6. Computational model of miniature pulsating heat pipes.

    SciTech Connect (OSTI)

    Martinez, Mario J.; Givler, Richard C.

    2013-01-01

    The modeling work described herein represents Sandia National Laboratories' (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground-plane (TGP): a planar device that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, which was demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  7. Computational Fluid Dynamics (CFD) Modelling on Soot Yield for Fire

    E-Print Network [OSTI]

    Computational Fluid Dynamics (CFD) Modelling on Soot Yield for Fire Engineering Assessment. Yong S ... CFD Modelling is now widely used by fire safety engineers throughout the world as a tool of the smoke control design as part of the performance-based fire safety design in the current industry

  8. Computational Modeling of Chord Fingering for String Instruments

    E-Print Network [OSTI]

    Radicioni, Daniele

    Computational Modeling of Chord Fingering for String Instruments. Daniele P. Radicioni. ... for the fingering process with string instruments, based on a constraint satisfaction approach. The model is implemented ... mechanical aspects of the performer's hand in its interaction with the musical instrument.

  9. Personal Computer-Based Model for Cool Storage Performance Simulation 

    E-Print Network [OSTI]

    Kasprowicz, L. M.; Jones, J. W.; Hitzfelder, J.

    1990-01-01

    A personal-computer-based hourly simulation model was developed from the CBS/ICE routines in the DOE-2.1 mainframe building simulation software. The new menu-driven model employs more efficient data and information handling than the previous...

  10. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  11. Computing Biological Model Parameters by Parallel Statistical Model Checking

    E-Print Network [OSTI]

    Tronci, Enrico

    of Treatments for Infertility Related Endocrinological Diseases, 600773) ... patient-specific model parameters

  12. Models of quantum computation and quantum programming languages

    E-Print Network [OSTI]

    J. A. Miszczak

    2011-12-03

    The goal of the presented paper is to provide an introduction to the basic computational models used in quantum information theory. We review various models of the quantum Turing machine, quantum circuits, and the quantum random access machine (QRAM), along with their classical counterparts. We also provide an introduction to quantum programming languages, which are developed using the QRAM model. We review the syntax of several existing quantum programming languages and discuss their features and limitations.

  13. Quantum Computers: Noise Propagation and Adversarial Noise Models

    E-Print Network [OSTI]

    Gil Kalai

    2009-04-21

    In this paper we consider adversarial noise models that will fail quantum error correction and fault-tolerant quantum computation. We describe known results regarding high-rate noise, sequential computation, and reversible noisy computation. We continue by discussing highly correlated noise and the "boundary," in terms of correlation of errors, of the "threshold theorem." Next, we draw a picture of adversarial forms of noise called (collectively) "detrimental noise." Detrimental noise is modeled after familiar properties of noise propagation. However, it can have various causes. We start by pointing out the difference between detrimental noise and standard noise models for two qubits and proceed to a discussion of highly entangled states, the rate of noise, and general noisy quantum systems.

  14. District-heating strategy model: computer programmer's manual

    SciTech Connect (OSTI)

    Kuzanek, J.F.

    1982-05-01

    The US Department of Housing and Urban Development (HUD) and the US Department of Energy (DOE) cosponsor a program aimed at increasing the number of district heating and cooling (DHC) systems. Such systems can reduce the amount and costs of fuels used to heat and cool buildings in a district. Twenty-eight communities have agreed to aid HUD in a national feasibility assessment of DHC systems. The HUD/DOE program entails technical assistance by Argonne National Laboratory and Oak Ridge National Laboratory. The assistance includes a computer program, called the district heating strategy model (DHSM), that performs preliminary calculations to analyze potential DHC systems. This report describes the general capabilities of the DHSM, provides historical background on its development, and explains the computer installation and operation of the model, including the data file structures and the options. Sample problems illustrate the structure of the various input data files and the interactive computer-output listings. The report is written primarily for computer programmers responsible for installing the model on their computer systems, entering data, running the model, and implementing local modifications to the code.

  15. A gas kick model for the personal computer 

    E-Print Network [OSTI]

    Miller, Clayton Lowell

    1987-01-01

    A GAS KICK MODEL FOR THE PERSONAL COMPUTER. A Thesis by CLAYTON LOWELL MILLER. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, May 1987. Major Subject: Petroleum Engineering. Approved as to style and content by: Hans C. Juvkam-Wold (Chair of Committee), Robert W. Heine (Member), Tibor G. Rozgonyi (Member), W. D. Von Gonten (Head...

  16. Concurrent multiscale computational modeling for dense dry granular materials interfacing

    E-Print Network [OSTI]

    Regueiro, Richard A.

    of interfacial mechanics between granular soil and tire, tool, or penetrometer, while properly representing far ... computational modeling of interfacial mechanics between granular materials and deformable solid bodies ... agricultural grains (in silo flows), dry soils (sand, silt, gravel), and lunar and martian regolith (soil found

  17. ADVANCED WUFI COMPUTER MODELING WORKSHOP FOR WALL DESIGN AND PERFORMANCE

    E-Print Network [OSTI]

    Oak Ridge National Laboratory

    ADVANCED WUFI COMPUTER MODELING WORKSHOP FOR WALL DESIGN AND PERFORMANCE (HEAT AND MOISTURE TRANSFER IN BUILDING ENVELOPES), Napa, CA, January 30 - February 1, 2012. WUFI/ORNL program made available by the Fraunhofer-Institut für Bauphysik (IBP) and co-sponsored by the National Institute of Building Sciences (NIBS)/Building

  18. Computational model of aortic valve surgical repair using grafted pericardium

    E-Print Network [OSTI]

    Computational model of aortic valve surgical repair using grafted pericardium. Peter E. Hammer. Keywords: aortic valve repair, membrane, surgical planning, leaflet graft, pericardium. Abstract: Aortic valve ... leaflets. Difficulty is largely due to the complex geometry and function of the valve and the lower

  19. A Computational Model for Sound Field Absorption by Acoustic Arrays

    E-Print Network [OSTI]

    A Computational Model for Sound Field Absorption by Acoustic Arrays. H. T. Banks, D. G. Cole, K. ... the sound absorption property of arrays of micro-acoustic actuators at a control surface. We use the wave ... We then formulate the acoustic wave equation with the absorption boundary coefficient in the frequency

  20. Faceted Models of Blog Feeds Department of Computer

    E-Print Network [OSTI]

    Meng, Weiyi

    Faceted Models of Blog Feeds. Lifeng Jia, Department of Computer Science, University of Illinois. ABSTRACT: Faceted blog distillation aims at retrieving the blogs that are not only relevant to a query ... blogs depict various topics related to the personal experiences of bloggers while official blogs deliver

  1. Open Learner Models at Birmingham Electronic, Electrical and Computer Engineering

    E-Print Network [OSTI]

    Bull, Susan

    Open Learner Models at Birmingham. Electronic, Electrical and Computer Engineering, University of Birmingham. Modules include: EE2D2 Introduction to Communications, EE2F1 Speech and Audio Technology, EE1B2 Circuits, Devices and Fields, EE1E1&2 C Programming, EE1F1 Introduction to Information ...

  2. Modeling and Computational Strategies for Optimal Development Planning of Offshore

    E-Print Network [OSTI]

    Grossmann, Ignacio E.

    Modeling and Computational Strategies for Optimal Development Planning of Offshore Oilfields. ... for offshore oil and gas fields as a basis to include the generic fiscal rules with ringfencing provisions ... mixed-integer programming. Introduction: Offshore oil and gas field development planning has received significant attention

  3. NREL Computer Models Integrate Wind Turbines with Floating Platforms

    E-Print Network [OSTI]

    NREL Computer Models Integrate Wind Turbines with Floating Platforms. Far off the shores ... for today's seabed-mounted offshore wind turbines. For the United States to tap into these vast offshore wind energy resources, wind turbines must be mounted on floating platforms to be cost effective

  4. Computational Model for Forced Expiration from Asymmetric Normal Lungs

    E-Print Network [OSTI]

    Lutchen, Kenneth

    Computational Model for Forced Expiration from Asymmetric Normal Lungs. Adam G. Polak. ... losses along the airway branches. Calculations done for succeeding lung volumes result in the semidynamic ... to the choke points, characteristic differences of lung regional pressures and volumes, and a shape

  5. Computer Modeling of Crystalline Electrolytes Lithium Thiophosphates and Phosphates

    E-Print Network [OSTI]

    Holzwarth, Natalie

    Computer Modeling of Crystalline Electrolytes - Lithium Thiophosphates and Phosphates. N. D. Lepley ... properties of (thio)phosphate electrolyte materials, focusing on the "superionic" electrolyte Li7P3S11. We ... During the last 5 years, lithium thiophosphate solid electrolyte materials have ... migration.

  6. A New Perspective for the Calibration of Computational Predictor Models.

    SciTech Connect (OSTI)

    Crespo, Luis Guillermo

    2014-11-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it is a description of the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain (i.e., roll-up and extrapolation).
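
    The Interval Predictor Model idea above can be sketched in a few lines. This is a hypothetical illustration with invented data, restricted to a fixed-width band rather than the paper's general optimization-based formulation: fit a center model, then take the smallest constant spread whose band contains every observation.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)
y = 2.0 * x + 0.3 * rng.uniform(-1.0, 1.0, size=40)  # invented observations

# Center model: ordinary least-squares line y ~ a*x + b.
A = np.vstack([x, np.ones_like(x)]).T
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
center = A @ coef

# Minimal constant spread s such that [center - s, center + s]
# encloses every observation: the smallest fixed-width band, a
# simple instance of the "minimal spread containing all
# observations" criterion described in the abstract.
spread = float(np.max(np.abs(y - center)))
lower, upper = center - spread, center + spread
```

    The resulting band is guaranteed to contain all the calibration data by construction; the paper's full formulation additionally lets the band width vary with the state and is found by optimization rather than this closed-form shortcut.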

  7. Wind energy conversion system analysis model (WECSAM) computer program documentation

    SciTech Connect (OSTI)

    Downey, W T; Hendrick, P L

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)

  8. Baseline LAW Glass Formulation Testing

    SciTech Connect (OSTI)

    Kruger, Albert A. [USDOE Office of River Protection, Richland, WA (United States); Mooers, Cavin [The Catholic University of America, Washington, DC (United States). Vitreous State Lab.; Bazemore, Gina [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Pegg, Ian L. [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Hight, Kenneth [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Lai, Shan Tao [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Buechele, Andrew [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Rielley, Elizabeth [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Gan, Hao [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Muller, Isabelle S. [The Catholic University of America, Washington, DC (United States). Vitreous State Lab; Cecil, Richard [The Catholic University of America, Washington, DC (United States). Vitreous State Lab

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  9. ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION G. BERMAN; ET...

    Office of Scientific and Technical Information (OSTI)

    ONSET OF CHAOS IN A MODEL OF QUANTUM COMPUTATION. G. Berman et al. 71 CLASSICAL AND QUANTUM MECHANICS, GENERAL PHYSICS; 99 GENERAL AND MISCELLANEOUS MATHEMATICS, COMPUTING, AND...

  10. Second International Workshop on Social Computing, Behavioral Modeling, and Prediction Phoenix, Arizona

    E-Print Network [OSTI]

    Liu, Huan

    Second International Workshop on Social Computing, Behavioral Modeling, and Prediction, Phoenix, Arizona, March 31 - April 1, 2009. Proceedings published by Springer.

  11. The origins of computer weather prediction and climate modeling

    SciTech Connect (OSTI)

    Lynch, Peter [Meteorology and Climate Centre, School of Mathematical Sciences, University College Dublin, Belfield (Ireland)], E-mail: Peter.Lynch@ucd.ie

    2008-03-20

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  12. A General Hippocampal Computational Model Combining Episodic and Spatial Memory in a Spiking Model 

    E-Print Network [OSTI]

    Aguiar, Paulo de Castro

    The hippocampus, in humans and rats, plays crucial roles in spatial tasks and nonspatial tasks involving episodic-type memory. This thesis presents a novel computational model of the hippocampus (CA1, CA3 and dentate ...

  13. Baseline Test Specimen Machining Report

    SciTech Connect (OSTI)

    Mark Carroll

    2009-08-01

    The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as “nuclear grade,” an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et al., 2007).
One of the key components in the evaluation of these graphite types will be mechanical testing on specimens drawn from carefully controlled sections of each billet. To this end, this report will discuss the machining of the first set of test specimens that will be evaluated in this program through tensile, compressive, and flexural testing. Validation that the test specimens have been produced to the tolerances required by the applicable ASTM standards, and to the quality control levels required by this program, will demonstrate the viability of sending graphite to selected suppliers that will provide valuable and certifiable data to future data sets that are integral to the NGNP program and beyond.

  14. TWRS privatization process technical baseline

    SciTech Connect (OSTI)

    Orme, R.M.

    1996-09-13

    The U.S. Department of Energy (DOE) is planning a two-phased program for the remediation of Hanford tank waste. Phase 1 is a pilot program to demonstrate the procurement of treatment services. The volume of waste treated during Phase 1 is a small percentage of the tank waste. During Phase 2, DOE intends to procure treatment services for the balance of the waste. The TWRS Privatization Process Technical Baseline (PPTB) provides a summary-level flowsheet/mass balance of tank waste treatment operations which is consistent with the tank inventory information, waste feed staging studies, and privatization guidelines currently available. The PPTB will be revised periodically as privatized processing concepts crystallize.

  15. Pinellas Plant Environmental Baseline Report

    SciTech Connect (OSTI)

    Not Available

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  16. Computer Modeling of Violent Intent: A Content Analysis Approach

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  17. Comprehensive computer model for magnetron sputtering. II. Charged particle transport

    SciTech Connect (OSTI)

    Jimenez, Francisco J., E-mail: fjimenez@ualberta.ca; Dew, Steven K. [Department of Electrical and Computer Engineering, University of Alberta, Edmonton T6G 2V4 (Canada); Field, David J. [Smith and Nephew (Alberta) Inc., Fort Saskatchewan T8L 4K4 (Canada)

    2014-11-01

    Discharges for magnetron sputter thin film deposition systems involve complex plasmas that are sensitively dependent on magnetic field configuration and strength, working gas species and pressure, chamber geometry, and discharge power. The authors present a numerical formulation for the general solution of these plasmas as a component of a comprehensive simulation capability for planar magnetron sputtering. This is an extensible, fully three-dimensional model supporting realistic magnetic fields and is self-consistently solvable on a desktop computer. The plasma model features a hybrid approach involving a Monte Carlo treatment of energetic electrons and ions, along with a coupled fluid model for thermalized particles. Validation against a well-known one-dimensional system is presented. Various strategies for improving numerical stability are investigated as is the sensitivity of the solution to various model and process parameters. In particular, the effect of magnetic field, argon gas pressure, and discharge power are studied.
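    The hybrid plasma model's Monte Carlo treatment of energetic particles rests on free-flight sampling between collisions. A minimal illustration of that one step, assuming a constant mean free path and no fields (both simplifications not made by the authors), draws exponentially distributed path lengths:

```python
import math
import random

def sample_free_paths(mean_free_path, n, seed=1):
    """Monte Carlo free-flight step: the distance to the next collision is
    exponentially distributed, s = -lambda * ln(1 - U), with U uniform in [0, 1)."""
    rng = random.Random(seed)
    return [-mean_free_path * math.log(1.0 - rng.random()) for _ in range(n)]

paths = sample_free_paths(0.01, 10000)  # 1 cm mean free path (hypothetical)
avg = sum(paths) / len(paths)           # sample mean should approach 0.01 m
```

    In a real magnetron simulation the mean free path depends on local gas density and energy-dependent cross sections, and particles are pushed through the electric and magnetic fields between collisions.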

  18. Computer Modeling and Simulation of an Active Vision Camera System, Ming-Chin Lu

    E-Print Network [OSTI]

    Subbarao, Murali "Rao"

    Computer Modeling and Simulation of an Active Vision Camera System, Ming-Chin Lu ... Computer simulation avoids the necessity of building actual camera systems. Based on the proposed model and algorithms, a computer simulation system called Active Vision Simulator ...

  19. EconoGrid: A detailed Simulation Model of a Standards-based Grid Compute Economy

    E-Print Network [OSTI]

    EconoGrid is a detailed simulation model, implemented in SLX, of a grid compute economy that implements selected ... of users. In a grid compute economy, computing resources are sold to users in a market where price ...

  20. A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming

    E-Print Network [OSTI]

    Maret, Susan

    2011-01-01

    Review: A Vast Machine: Computer Models, Climate Data, and ... Edwards, Paul N. A Vast Machine: Computer Models, Climate ... Edwards, Paul N. 2004. "A vast machine: standards as social ..."

  1. Models and People: an alternative view of the emergent properties of computational models

    E-Print Network [OSTI]

    Boschetti, Fabio

    ... be used to address scientific questions or support real-world decision making. At these higher levels, an ecological model may be used i) to address a scientific question and ii) ... how the scientific insight so gained ... management problem (Figure 1, top) [11-16]. 2. Computational models and scientific questions: Within ...

  2. Computer Modeling Illuminates Degradation Pathways of Cations in Alkaline Membrane Fuel Cells (Fact Sheet)

    SciTech Connect (OSTI)

    Not Available

    2012-08-01

    Cation degradation insights obtained by computational modeling could result in better performance and longer lifetime for alkaline membrane fuel cells.

  3. Baseline Graphite Characterization: First Billet

    SciTech Connect (OSTI)

    Mark C. Carroll; Joe Lords; David Rohrbaugh

    2010-09-01

    The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments that are designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, a limited amount of data can be generated based upon the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that will characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. The thorough understanding of this variability will provide added detail to the irradiated property data, and provide a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing.
This report covers the development of the Baseline Graphite Characterization program from a testing and data collection standpoint through the completion of characterization on the first billet of nuclear-grade graphite. This data set is the starting point for all future evaluations and comparisons of material properties.

  4. FINITE VOLUME METHODS APPLIED TO THE COMPUTATIONAL MODELLING OF WELDING PHENOMENA

    E-Print Network [OSTI]

    Taylor, Gary

    FINITE VOLUME METHODS APPLIED TO THE COMPUTATIONAL MODELLING OF WELDING PHENOMENA. Gareth A. Taylor (A.Taylor@brunel.ac.uk). ABSTRACT: This paper presents the computational modelling of welding phenomena within a versatile numerical framework involving Computational Fluid Dynamics (CFD) and Computational Solid Mechanics (CSM). With regard to the CFD modelling of the weld pool fluid dynamics, heat ...

  5. How Computational Models Predict the Behavior of Complex Systems

    E-Print Network [OSTI]

    Boschetti, Fabio

    How Computational Models Predict the Behavior of Complex Systems. John Symons and Fabio Boschetti. ... of prediction in the use of computational models in science. We focus on the consequences of the irreversibility of computational models and on the conditional, or ceteris paribus, nature of the kinds of their predictions ...

  6. Imaging and modeling of flow in porous media using clinical nuclear emission tomography systems and computational fluid dynamics

    E-Print Network [OSTI]

    Boutchko, R.

    2014-01-01

    ... emission tomography systems and computational fluid dynamics ... a computational fluid dynamics (CFD) model of the system ... the computational domain. A Cartesian coordinate system was ...

  7. Energy Intensity Baselining and Tracking Guidance

    Broader source: Energy.gov (indexed) [DOE]

    Energy Intensity Baselining and Tracking Guidance (betterbuildings.energy.gov). Preface: The U.S. Department of Energy's (DOE's) Better Buildings, Better Plants Program (Better...

  8. UCSF Sustainability Baseline Assessment: Carbon Footprint Analysis

    E-Print Network [OSTI]

    Yamamoto, Keith

    UCSF Sustainability Baseline Assessment: Carbon Footprint Analysis. Final Issue Date: March 21, 2010. Carbon Footprint Analysis Background: This chapter of the Sustainability Assessment focuses on UCSF

  9. Computing Limb Darkening Coefficients from Stellar Atmosphere Models

    E-Print Network [OSTI]

    David Heyrovsky

    2006-10-24

    We explore the sensitivity of limb darkening coefficients computed from stellar atmosphere models to different least-squares fitting methods. We demonstrate that conventional methods are strongly biased to fitting the stellar limb. Our suggested method of fitting by minimizing the radially integrated squared residual yields improved fits with better flux conservation. The differences of the obtained coefficients from commonly used values are observationally significant. We show that the new values are in better agreement with solar limb darkening measurements as well as with coefficients reported from analyses of eclipsing binary light curves.
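    The fitting criterion described above, minimizing the radially integrated squared residual, can be sketched with a toy weighted least-squares fit. Assuming (purely for illustration) a linear limb-darkening law I(mu) = 1 - u*(1 - mu), integrating the residual in disk radius r = sqrt(1 - mu^2) amounts to weighting each mu sample by |dr/dmu| = mu / sqrt(1 - mu^2); the grid and names below are not taken from the paper:

```python
import math

def fit_linear_limb_darkening(mus, intensities):
    """Fit I(mu) = 1 - u*(1 - mu) by minimizing the radially integrated
    squared residual: weight each mu sample by |dr/dmu| = mu/sqrt(1 - mu^2),
    since r = sqrt(1 - mu^2) on the stellar disk. Closed form for one coefficient."""
    num = den = 0.0
    for mu, inten in zip(mus, intensities):
        w = mu / math.sqrt(1.0 - mu * mu)  # radial weight (diverges at mu = 1)
        num += w * (1.0 - inten) * (1.0 - mu)
        den += w * (1.0 - mu) ** 2
    return num / den

mus = [0.01 * k for k in range(1, 100)]        # avoid the endpoints mu = 0, 1
data = [1.0 - 0.6 * (1.0 - mu) for mu in mus]  # exactly linear, u = 0.6
u = fit_linear_limb_darkening(mus, data)
```

    Because the weight vanishes toward the limb (mu -> 0), this criterion counteracts the limb bias of conventional uniform-in-mu fits noted in the abstract.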

  10. Computing model independent perturbations in dark energy and modified gravity

    SciTech Connect (OSTI)

    Battye, Richard A. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, The University of Manchester, Manchester M13 9PL (United Kingdom); Pearson, Jonathan A., E-mail: richard.battye@manchester.ac.uk, E-mail: jonathan.pearson@durham.ac.uk [Department of Mathematical Sciences, Durham University, South Road, Durham, DH1 3LE (United Kingdom)

    2014-03-01

    We present a methodology for computing model independent perturbations in dark energy and modified gravity. This is done from the Lagrangian for perturbations, by showing how field content, symmetries, and physical principles are often sufficient ingredients for closing the set of perturbed fluid equations. The fluid equations close once "equations of state for perturbations" are identified: these are linear combinations of fluid and metric perturbations which construct gauge-invariant entropy and anisotropic stress perturbations for broad classes of theories. Our main results are the proof of the equation of state for perturbations presented in a previous paper, and the development of the required calculational tools.
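    For orientation, the simplest example of such an "equation of state for perturbations" is the standard gauge-invariant entropy (non-adiabatic pressure) perturbation; in conventional notation (a textbook identity, not the paper's full parameterization):

```latex
% Entropy perturbation for a fluid with pressure p and density \rho;
% c_a^2 is the adiabatic sound speed of the background.
p\,\Gamma \;\equiv\; \delta p - c_a^2\,\delta\rho,
\qquad
c_a^2 \;\equiv\; \frac{\dot{p}}{\dot{\rho}}\,.
% Adiabatic perturbations have \Gamma = 0; specifying \Gamma (and the
% anisotropic stress) as linear combinations of fluid and metric
% perturbations is what closes the perturbed fluid equations.
```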

  11. Degree-1 Earth deformation from very long baseline interferometry measurements

    E-Print Network [OSTI]

    Faulds, James E.

    Degree-1 Earth deformation from very long baseline interferometry measurements. D. Lavallée and G. ... of the center of mass of the Earth system through satellite orbit models, and the former purely on observing ... Earth and the center of mass of the entire Earth system (Earth, oceans and atmosphere). The load moment ...

  12. Modeling the Fracture of Ice Sheets on Parallel Computers

    SciTech Connect (OSTI)

    Waisman, Haim; Tuminaro, Ray

    2013-10-10

    The objective of this project was to investigate the complex fracture of ice and understand its role within larger ice sheet simulations and global climate change. This objective was achieved by developing novel physics-based models for ice, novel numerical tools to enable the modeling of the physics, and by collaboration with the ice community experts. At the present time, ice fracture is not explicitly considered within ice sheet models due in part to large computational costs associated with the accurate modeling of this complex phenomenon. However, fracture not only plays an extremely important role in regional behavior but also influences ice dynamics over much larger zones in ways that are currently not well understood. To this end, our research findings through this project offer significant advancement to the field and close a large gap of knowledge in understanding and modeling the fracture of ice sheets in the polar regions. Thus, we believe that our objective has been achieved and our research accomplishments are significant. This is corroborated through a set of published papers, posters and presentations at technical conferences in the field. In particular, significant progress has been made in the mechanics of ice, fracture of ice sheets and ice shelves in polar regions, and sophisticated numerical methods that enable the solution of the physics in an efficient way.

  13. A Unified Computational Model for Solar and Stellar Flares

    E-Print Network [OSTI]

    Allred, Joel C; Carlsson, Mats

    2015-01-01

    We present a unified computational framework which can be used to describe impulsive flares on the Sun and on dMe stars. The models assume that the flare impulsive phase is caused by a beam of charged particles that is accelerated in the corona and propagates downward depositing energy and momentum along the way. This rapidly heats the lower stellar atmosphere causing it to explosively expand and dramatically brighten. Our models consist of flux tubes that extend from the sub-photosphere into the corona. We simulate how flare-accelerated charged particles propagate down one-dimensional flux tubes and heat the stellar atmosphere using the Fokker-Planck kinetic theory. Detailed radiative transfer is included so that model predictions can be directly compared with observations. The flux of flare-accelerated particles drives return currents which additionally heat the stellar atmosphere. These effects are also included in our models. We examine the impact of the flare-accelerated particle beams on model solar and...

  14. Center for Programming Models for Scalable Parallel Computing

    SciTech Connect (OSTI)

    John Mellor-Crummey

    2008-02-29

    Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  15. Computational fluid dynamic modeling of fluidized-bed polymerization reactors

    SciTech Connect (OSTI)

    Rokkam, Ram

    2012-11-02

    Polyethylene is one of the most widely used plastics, and over 60 million tons are produced worldwide every year. Polyethylene is obtained by the catalytic polymerization of ethylene in gas- and liquid-phase reactors. The gas-phase processes are more advantageous and use fluidized-bed reactors for production of polyethylene. Since they operate so close to the melting point of the polymer, agglomeration is an operational concern in all slurry and gas polymerization processes. Electrostatics and hot-spot formation are the main factors that contribute to agglomeration in gas-phase processes. Electrostatic charges in gas-phase polymerization fluidized-bed reactors are known to influence the bed hydrodynamics, particle elutriation, bubble size, bubble shape, etc. Accumulation of electrostatic charges in the fluidized bed can lead to operational issues. In this work, a first-principles electrostatic model is developed and coupled with a multi-fluid computational fluid dynamic (CFD) model to understand the effect of electrostatics on the dynamics of a fluidized bed. The multi-fluid CFD model for gas-particle flow is based on closures from the kinetic theory of granular flows. The electrostatic model is developed based on a fixed, size-dependent charge for each type of particle phase (catalyst, polymer, polymer fines). The combined CFD model is first verified using simple test cases, validated with experiments, and applied to a pilot-scale polymerization fluidized-bed reactor. The CFD model reproduced qualitative trends in particle segregation and entrainment due to electrostatic charges observed in experiments. For the scale-up of the fluidized-bed reactor, filtered models are developed and implemented on the pilot-scale reactor.
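    The fixed-charge closure described above can be caricatured in a few lines: each particle phase (catalyst, polymer, fines) carries a prescribed charge per particle, and the local charge density that sources the electrostatic field is the sum of charge times number density over phases. All values and the function name below are hypothetical, and the actual model couples this to the multi-fluid equations:

```python
import math

def charge_density(phases):
    """Mixture charge density from a fixed, size-dependent charge per phase.
    phases: list of (volume_fraction, particle_diameter_m, charge_per_particle_C)."""
    rho_q = 0.0
    for alpha, d, q in phases:
        v_p = math.pi * d ** 3 / 6.0   # volume of one spherical particle
        rho_q += q * alpha / v_p       # number density = alpha / v_p
    return rho_q                       # C per m^3 of mixture

# Hypothetical catalyst, polymer, and polymer-fines phases
bed = [(0.01, 5e-5, -1e-16), (0.40, 1e-3, 2e-15), (0.02, 1e-4, -5e-16)]
rho = charge_density(bed)
```

    In the full model this charge density would feed a Poisson solve for the electrostatic field acting back on each particle phase.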

  16. Modeling and Synthesizing Task Placement Constraints in Google Compute Clusters

    E-Print Network [OSTI]

    Cortes, Corinna

    Existing workload characterizations for high performance computing and grids focus on task resource requirements for CPU and memory ... of compute clusters.

  17. Computational model for simulation small testing launcher, technical solution

    SciTech Connect (OSTI)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists of numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system that will be able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems that will be integrated later in the satellite launcher as a part of the European space program.
This is why the technical content of the project must be carried out beyond the range of existing suborbital vehicle programs, towards the current technological necessities in the space field, especially the European one.
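    A full 6DOF, variable-mass simulation is beyond a short sketch, but the core integration loop can be illustrated by a one-dimensional point-mass vertical ascent with thrust, gravity, drag, and decreasing mass. Every parameter value below is invented for illustration and unrelated to the SLT design:

```python
def simulate_ascent(m0, m_dry, thrust, isp, cd_area_rho, dt=0.01):
    """Point-mass vertical ascent with variable mass: burn propellant at
    mdot = thrust / (isp * g0) until the dry mass is reached, then coast
    to apogee. Returns (max altitude in m, final mass in kg)."""
    g0 = 9.81
    mdot = thrust / (isp * g0)
    m, v, h, t, h_max = m0, 0.0, 0.0, 0.0, 0.0
    while v >= 0.0 or m > m_dry:       # stop once coasting and descending
        f_thrust = thrust if m > m_dry else 0.0
        drag = 0.5 * cd_area_rho * v * abs(v)
        v += ((f_thrust - drag) / m - g0) * dt
        h = max(0.0, h + v * dt)
        h_max = max(h_max, h)
        if m > m_dry:
            m = max(m_dry, m - mdot * dt)
        t += dt
        if t > 600.0:                  # safety cutoff
            break
    return h_max, m

# Hypothetical small sounding rocket: 500 kg wet, 200 kg dry, 15 kN thrust
h_max, m_final = simulate_ascent(500.0, 200.0, 15000.0, 220.0, 0.3)
```

    A 6DOF model replaces the scalar state with position, velocity, attitude, and angular rates, but the integrate-forces-then-update-mass loop is the same.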

  18. Computational Biology and Bioinformatics 10.10 Models of substitution I: Basic Models A

    E-Print Network [OSTI]

    Goldschmidt, Christina

    ... & stochastic grammars; 7.11 RNA structures; 9.11 Finding signals in sequences; 14.11 Challenges in genome ... of structure & movements & shapes & grammars; 28.11 Integrative genomics: the omics (DNA, mRNA, Protein, Metabolite, Phenotype); 30.11 Integrative genomics: mapping

  19. A computational model of the motivation-learning interface Manish Saggar (mishu@cs.utexas.edu)

    E-Print Network [OSTI]

    Maddox, W. Todd

    A computational model of the motivation-learning interface, Manish Saggar (mishu@cs.utexas.edu) ... the influence of motivation on learning observed by Markman, Baldwin and Maddox (2005). They showed ... was confirmed. These results constitute a first computational step towards understanding how motivation ...

  20. A Polarizable QM/MM Explicit Solvent Model for Computational Electrochemistry in Water

    E-Print Network [OSTI]

    Wang, Lee-Ping

    We present a quantum mechanical/molecular mechanical (QM/MM) explicit solvent model for the computation of standard reduction potentials E[subscript 0]. The QM/MM model uses density functional theory (DFT) to model the ...

  1. Computational Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultra-supercritical (USC) applications to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 °C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems that produce stable nanocrystalline coatings forming a protective, continuous scale of either Al₂O₃ or Cr₂O₃. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of a continuous Al₂O₃ scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed; among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best-quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al₂O₃ scale, than widely used MCrAlY coatings.
However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al diffusion from the coating into the substrate. An effective diffusion barrier interlayer coating was developed to prevent inward Al diffusion. The fire-side corrosion test results showed that nanocrystalline coatings with a minimum number of defects have great potential for providing corrosion protection. The coating tested in the most aggressive environment showed no evidence of coating spallation and/or corrosion attack after 1050 hours of exposure. In contrast, coating spallation in isolated areas and corrosion attack of the base metal in the spalled areas were observed after 500 hours. These contrasting results suggest that the premature coating spallation in isolated areas may be related to sample-to-sample variation of defects in the coating; cauliflower-type defects in the coating were presumably responsible. A defect-free, good-quality coating is therefore the key to the long-term durability of nanocrystalline coatings in corrosive environments, and additional process optimization work is required to produce defect-free coatings before a coating application method for production parts is developed.

  2. MOMDIS: a Glauber model computer code for knockout reactions

    E-Print Network [OSTI]

    C. A. Bertulani; A. Gade

    2006-04-12

    A computer program is described to calculate momentum distributions in stripping and diffraction dissociation reactions. A Glauber model is used, with the scattering wavefunctions calculated in the eikonal approximation. The program is appropriate for knockout reactions in intermediate-energy collisions (30 MeV ≤ E_lab/nucleon ≤ 2000 MeV). It is particularly useful for reactions involving unstable nuclear beams or exotic nuclei (e.g. neutron-rich nuclei), and for studies of single-particle occupancy probabilities (spectroscopic factors) and other related physical observables. Such studies are an essential part of the scientific program of radioactive beam facilities, such as the proposed RIA (Rare Isotope Accelerator) facility in the US.
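As a rough illustration of the eikonal ingredients such a Glauber-model code rests on, the sketch below computes a nucleon survival probability |S(b)|² from a thickness integral of a Gaussian nuclear density; the density parameters and effective cross section are illustrative values, not MOMDIS inputs:

```python
# Sketch of an eikonal/Glauber building block: the survival probability
# |S(b)|^2 of a nucleon at impact parameter b, from a thickness integral
# of a (Gaussian, hypothetical) nuclear density times an effective
# nucleon-nucleon cross section. Not the MOMDIS implementation.
import math

SIGMA_NN = 4.0   # effective NN cross section, fm^2 (illustrative)
RHO0 = 0.17      # central density, fm^-3
A_WIDTH = 2.0    # Gaussian density width, fm (illustrative)

def thickness(b, zmax=20.0, dz=0.01):
    """T(b): integral of rho(sqrt(b^2 + z^2)) dz along a straight-line path."""
    total, z = 0.0, -zmax
    while z < zmax:
        r2 = b * b + z * z
        total += RHO0 * math.exp(-r2 / (2 * A_WIDTH**2)) * dz
        z += dz
    return total

def survival(b):
    """Eikonal survival probability |S(b)|^2 = exp(-sigma_NN * T(b))."""
    return math.exp(-SIGMA_NN * thickness(b))

for b in (0.0, 2.0, 5.0, 10.0):
    print(f"b = {b:4.1f} fm  |S|^2 = {survival(b):.3f}")
```

Central trajectories are strongly absorbed while peripheral ones survive, which is the mechanism that localizes knockout reactions at the nuclear surface.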

  3. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  4. SSRD+: A Privacy-aware Trust and Security Model for Resource Discovery in Pervasive Computing Environment

    E-Print Network [OSTI]

    Madiraju, Praveen

    ... makes the discovery process lightweight and secure. In this paper we present details of the trust and risk models ... of devices running in a pervasive computing environment [10]. The resource discovery process demands models ...

  5. A Risk-aware Trust Based Secure Resource Discovery (RTSRD) Model for Pervasive Computing

    E-Print Network [OSTI]

    Madiraju, Praveen

    ... a security threat to them. Thus, the resource discovery process demands models that ensure privacy. ... In the ad-hoc network of pervasive computing, a resource discovery model is needed that can resolve security and privacy ...

  6. Using Parallel MCMC Sampling to Calibrate a Computer Model of a Geothermal Reservoir

    E-Print Network [OSTI]

    Fox, Colin

    Using Parallel MCMC Sampling to Calibrate a Computer Model of a Geothermal Reservoir, by T. Cui, C. Fox ... We use parallel MCMC sampling to calibrate a computer model of a geothermal field, achieving model 'calibration' from measured well-test data. We explore three scenarios ...
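As a toy illustration of calibration by MCMC sampling (the forward model, data, and single parameter below are synthetic stand-ins, not the paper's geothermal simulator; a real run would distribute the chains across processors):

```python
# Sketch of model calibration by MCMC: several independent Metropolis
# chains (run in parallel in practice) sample the posterior of a model
# parameter given noisy well-test-style data. The forward model and data
# are synthetic stand-ins, not the paper's geothermal simulator.
import math, random

def forward(theta, t):
    return theta * math.log1p(t)  # toy drawdown-like response

TIMES = [1.0, 2.0, 4.0, 8.0, 16.0]
TRUE_THETA, SIGMA = 3.0, 0.1
random.seed(0)
DATA = [forward(TRUE_THETA, t) + random.gauss(0, SIGMA) for t in TIMES]

def log_post(theta):
    # Flat prior on theta > 0; Gaussian likelihood.
    if theta <= 0:
        return -math.inf
    return -sum((d - forward(theta, t)) ** 2
                for d, t in zip(DATA, TIMES)) / (2 * SIGMA**2)

def metropolis(n_steps, start, step=0.1, seed=1):
    rng = random.Random(seed)
    theta, lp = start, log_post(start)
    samples = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0, step)
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

# Independent chains from dispersed starts; with multiprocessing these
# would run concurrently on separate cores.
chains = [metropolis(2000, start=s, seed=i)
          for i, s in enumerate([1.0, 5.0])]
est = sum(sum(c[500:]) for c in chains) / sum(len(c[500:]) for c in chains)
print(f"posterior mean theta ~ {est:.2f}")
```

Running several chains from dispersed starting points also gives a convergence check: if the chains disagree after burn-in, the sampler has not mixed.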

  7. Ray tracing computations in the smoothed SEG/EAGE Salt Model

    E-Print Network [OSTI]

    Cerveny, Vlastislav

    Ray tracing computations in the smoothed SEG/EAGE Salt Model, Václav Bucha. We compute rays and synthetic seismograms of refracted and reflected P-waves in the smoothed SEG/EAGE Salt Model. The original 3-D SEG/EAGE Salt Model (Aminzadeh et al. 1997) is a very complex model and cannot be used for ray tracing ...

  8. A Three-Dimensional Computational Model of PEM Fuel Cell with Serpentine Gas Channels

    E-Print Network [OSTI]

    Victoria, University of

    A Three-Dimensional Computational Model of PEM Fuel Cell with Serpentine Gas Channels, by Phong ... ABSTRACT: A three-dimensional computational fluid dynamics model of a Polymer Electrolyte Membrane (PEM) fuel cell with serpentine gas flow channels is presented in this thesis. This comprehensive model ...

  9. New Physics Effects in Long Baseline Experiments

    E-Print Network [OSTI]

    Osamu Yasuda

    2007-10-13

    We discuss the implications of new physics that modifies the matter effect in neutrino oscillations for long baseline experiments, particularly the MINOS experiment. An analytic formula for the oscillation probability in the presence of such a new physics interaction is derived.

  10. A network based model for heterogeneous parallel computation 

    E-Print Network [OSTI]

    Sathye, Adwait B.

    1993-01-01

    The computational requirements of science and engineering demand computational resources orders of magnitude beyond those of current-day sequential machines. Most of the research effort has been concentrated upon the creation of parallel algorithms...

  11. A New Model for Image-Based Humanities Computing 

    E-Print Network [OSTI]

    Brown, Jacob Hohmann

    2009-05-15

    Image-based humanities computing, the computer-assisted study of digitally represented “objects or artifacts of cultural heritage,” is an increasingly popular yet “established practice” located at the most recent intersections ...

  12. AIR INGRESS ANALYSIS: PART 2 – COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect (OSTI)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2011-01-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development focused on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air-ingress-related models and verification and validation data is a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics model developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  13. A model for computer frustration: the role of instrumental and dispositional factors

    E-Print Network [OSTI]

    Shneiderman, Ben

    A model for computer frustration: the role of instrumental and dispositional factors on incident ... 20742, USA. Available online 16 April 2004. Abstract: Frustration is almost universally accepted ...

  14. Designing Computing System Architecture and Models for the HL-LHC era

    E-Print Network [OSTI]

    Lothar Bauerdick; Brian Bockelman; Peter Elmer; Stephen Gowdy; Matevz Tadel; Frank Wuerthwein

    2015-07-20

    This paper describes a programme to study the computing model in CMS after the next long shutdown near the end of the decade.

  15. Designing Computing System Architecture and Models for the HL-LHC era

    E-Print Network [OSTI]

    Bauerdick, Lothar; Elmer, Peter; Gowdy, Stephen; Tadel, Matevz; Wuerthwein, Frank

    2015-01-01

    This paper describes a programme to study the computing model in CMS after the next long shutdown near the end of the decade.

  16. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect (OSTI)

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants is most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burners systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce/eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project-jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096)-have focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that about 40 wt.% is required in Fe based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long term thermal exposure test results further showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such TiN and/or AlN. The third task 'Process Advanced MCrAl Nanocoating Systems' of the six-task project jointly sponsored by the Electric Power Research Institute, EPRI and the U.S. 
Department of Energy (DE-FC26-07NT43096)- has focused on processing of advanced nanocrystalline coating systems and development of diffusion barrier interlayer coatings. Among the diffusion interlayer coatings evaluated, the TiN interlayer coating was found to be the optimum one. This report describes the research conducted under the Task 3 workscope.

  17. A Mathematical Model for Virus Infection in a System of Interacting Computers

    E-Print Network [OSTI]

    Cipolatti, Rolci

    A Mathematical Model for Virus Infection in a System of Interacting Computers, J. López Gondar & R. ... are explored and enlightened in this paper. 1. Introduction: The infection of computers by virtual viruses in a system of interacting computers could be compared with a disease transmitted ...

  18. Modeling Computational Security in Long-Lived Systems, Version 2 Ran Canetti1,2

    E-Print Network [OSTI]

    International Association for Cryptologic Research (IACR)

    Modeling Computational Security in Long-Lived Systems, Version 2. Ran Canetti, Ling Cheung ... Introduction: Computational security in long-lived systems. Security properties of cryptographic protocols ... computational power. This type of security degrades progressively over the lifetime of a protocol. However, some ...

  19. Modeling Computational Security in Long-Lived Systems Ran Canetti1,2

    E-Print Network [OSTI]

    International Association for Cryptologic Research (IACR)

    Modeling Computational Security in Long-Lived Systems. Ran Canetti, Ling Cheung, Dilsun Kaynar ... Introduction: Computational security in long-lived systems. Security properties of cryptographic protocols ... protocols, security relies on the assumption that adversarial entities have limited computational power.

  20. Novel properties generated by interacting computational systems: A minimal model Fabio Boschetti1,2

    E-Print Network [OSTI]

    Boschetti, Fabio

    Novel properties generated by interacting computational systems: A minimal model. Fabio Boschetti ... questions: first, what is the smallest number of components a computational system needs in order ... Concepts such as self-organisation and emergence have been discussed in computational terms within Complex System Science.

  1. Trace-Based Analysis and Prediction of Cloud Computing User Behavior Using the Fractal Modeling Technique

    E-Print Network [OSTI]

    Pedram, Massoud

    Trace-Based Analysis and Prediction of Cloud Computing User Behavior Using the Fractal Modeling Technique ... In this paper, we investigate the characteristics of the cloud computing requests received ... the alpha-stable distribution. Keywords: cloud computing; alpha-stable distribution; fractional order ...

  2. CPT: An Energy-Efficiency Model for Multi-core Computer Systems

    E-Print Network [OSTI]

    Shi, Weisong

    CPT: An Energy-Efficiency Model for Multi-core Computer Systems. Weisong Shi, Shinan Wang and Bing ... energy efficiency of computer systems. These techniques affect the energy efficiency across different layers ... a metric that represents the energy efficiency of a computer system, for a specific configuration, given ...

  3. Optimizing Computations in Weather and Climate Prediction Models* F. BAER, BANGLIN ZHANG, AND BING ZHANG

    E-Print Network [OSTI]

    Baer, Ferdinand

    Optimizing Computations in Weather and Climate Prediction Models. F. Baer, Banglin Zhang, and Bing Zhang ... scenarios for many time scales, more computer power than is currently available will be needed. ... sometimes with a biosphere included, are very complex and require so much computing power ...

  4. A Calibrated Computer Model for the Thermal Simulation of Courtyard Microclimates 

    E-Print Network [OSTI]

    Bagneid, A.; Haberl, J.

    2006-01-01

    This paper describes a calibrated stand-alone courtyard microclimate model. This model is considered to be the first calibrated computer program for the simulation of courtyard microclimates. In order to accomplish this, a calibrated simplif...

  5. Cielo Computational Environment Usage Model With Mappings to...

    Office of Scientific and Technical Information (OSTI)

    Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment...

  6. Mathematical and Computer Modelling 53 (2011) 716730 Contents lists available at ScienceDirect

    E-Print Network [OSTI]

    Berges, John A.

    2011-01-01

    Mathematical and Computer Modelling 53 (2011) 716–730. Contents lists available at ScienceDirect. Journal homepage: www.elsevier.com/locate/mcm. Dynamics of a virus–host model ... Received 19 January 2010; accepted 17 October 2010. Keywords: virus–host dynamics; quota; bacteriophage.

  7. 8/30/2001 Parallel Programming -Fall 2001 1 Models of Parallel Computation

    E-Print Network [OSTI]

    Browne, James C.

    8/30/2001 Parallel Programming - Fall 2001. Models of Parallel Computation. Philosophy ... of parallel programming. ... We will discuss parallelism from the viewpoint of programming but with connections to other domains.

  8. Computing Estimates in the Proportional Odds Model David R. Hunter1

    E-Print Network [OSTI]

    Hunter, David

    Computing Estimates in the Proportional Odds Model. David R. Hunter, Kenneth Lange. Running head: Computing MLE for proportional odds. Submitted to Annals of the Institute of Statistical ... The semiparametric proportional odds model for survival data is useful when mortality rates of different groups ...

  9. A Hierarchical Task Model for Dispatching in Computer-Assisted Demand-Responsive Paratransit Operation

    E-Print Network [OSTI]

    Dessouky, Maged

    A Hierarchical Task Model for Dispatching in Computer-Assisted Demand-Responsive Paratransit Operation. ABSTRACT ... Keywords: Dispatch Training. 1 INTRODUCTION: Demand-responsive paratransit service is on the rise. For example ...

  10. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

    Open Energy Info (EERE)

    Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials from Buildings

  11. South Africa - Greenhouse Gas Emission Baselines and Reduction...

    Open Energy Info (EERE)

    South Africa - Greenhouse Gas Emission Baselines and Reduction Potentials from Buildings

  12. Tarragon : a programming model for latency-hiding scientific computations

    E-Print Network [OSTI]

    Cicotti, Pietro

    2011-01-01

    Table of contents excerpt: Chapter 2, Programming Model; Chapter 6, Dynamic programming; ... related programming model implementations.

  13. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    SciTech Connect (OSTI)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the greatest losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated by these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by performing computations only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool.
Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al. 2000).
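The domain-tracking idea, computing only on wet cells and letting the active set grow along the flood front, can be sketched as follows; this is a Python toy with a diffusion-like stand-in for the shallow water solver, whereas the paper's implementation uses Java multithreading:

```python
# Sketch of the paper's "domain tracking" idea: advance a flood simulation
# only on cells that are wet or border a wet cell, instead of the whole
# grid. The relaxation update rule is a toy stand-in for the shallow water
# equations; the thread pool mirrors the desktop-parallel idea.
from concurrent.futures import ThreadPoolExecutor

N = 64                          # grid is N x N
depth = [[0.0] * N for _ in range(N)]
depth[N // 2][N // 2] = 100.0   # initial flood volume at one cell

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < N and 0 <= j + dj < N:
            yield i + di, j + dj

def new_depth(cell):
    # Toy relaxation: move toward the average of the 4-neighborhood.
    i, j = cell
    nbrs = list(neighbors(i, j))
    avg = sum(depth[a][b] for a, b in nbrs) / len(nbrs)
    return cell, 0.5 * depth[i][j] + 0.5 * avg

updates_done = 0
active = {(N // 2, N // 2)}
with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(20):  # time steps
        results = list(pool.map(new_depth, active))  # reads only, no race
        for (i, j), d in results:
            depth[i][j] = d
        updates_done += len(results)
        # Domain tracking: the active set grows only along the wet front.
        for (i, j), d in results:
            if d > 1e-6:
                active |= {(i, j), *neighbors(i, j)}

print(f"cell updates: {updates_done} vs full-grid {20 * N * N}")
```

Early in an event the wet front covers a tiny fraction of the grid, so the active-set count stays far below the full-grid figure, which is where the reported speedup comes from.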

  14. Towards a Model for Computing in European Astroparticle Physics

    E-Print Network [OSTI]

    T. Berghöfer; I. Agrafioti; B. Allen; V. Beckmann; T. Chiarusi; M. Delfino; S. Hesping; J. Chudoba; L. Dell'Agnello; S. Katsanevas; G. Lamanna; R. Lemrani; A. Margiotta; G. Maron; C. Palomba; G. Russo; P. Wegner

    2015-12-03

    Current and future astroparticle physics experiments are operated or are being built to observe highly energetic particles, high-energy electromagnetic radiation, and gravitational waves originating from all kinds of cosmic sources. The data volumes taken by the experiments are large and expected to grow significantly during the coming years, a result of advanced research possibilities and improved detector technology. To cope with the substantially increasing data volumes of astroparticle physics projects, it is important to understand the future needs for computing resources in this field, since providing these resources constitutes a larger fraction of the overall running costs of future infrastructures. This document presents the results of a survey made by APPEC with the help of computing experts of major projects and future initiatives in astroparticle physics, representatives of current Tier-1 and Tier-2 LHC computing centers, as well as dedicated astroparticle physics computing centers, e.g. the Albert Einstein Institute for gravitational wave analysis in Hanover. In summary, the overall CPU usage and short-term disk and long-term (tape) storage space currently available for astroparticle physics projects' computing services is of the order of one third of the central computing available for LHC data at the Tier-0 center at CERN. By the end of the decade, the requirements for computing resources are estimated to increase by a factor of 10. Furthermore, this document describes the diversity of astroparticle physics data handling and serves as a basis to estimate a distribution of computing and storage tasks among the major computing centers. (Abridged)

  15. Final Report. DOE Computational Nanoscience Project DE-FG02-03ER46096: Integrated Multiscale Modeling of Molecular Computing Devices

    SciTech Connect (OSTI)

    Cummings, Peter [Vanderbilt University]

    2009-11-15

    The document is the final report of the DOE Computational Nanoscience Project DE-FG02-03ER46096: Integrated Multiscale Modeling of Molecular Computing Devices. It included references to 62 publications that were supported by the grant.

  16. Math 484: Mathematical & Computational Modeling Course Information and Syllabus Spring 2013

    E-Print Network [OSTI]

    Math 484: Mathematical & Computational Modeling. Course Information and Syllabus, Spring 2013. This course teaches students how to build and analyze mathematical models, with most concepts presented in the context of physical problems. Objectives: 1. Teach students how to build mathematical models of physical problems. 2. Be able to solve modeling problems ...

  17. Math 484: Mathematical & Computational Modeling Course Information and Syllabus Spring 2012

    E-Print Network [OSTI]

    Math 484: Mathematical & Computational Modeling. Course Information and Syllabus, Spring 2012. This course teaches students how to build and analyze mathematical models, with most concepts presented in the context of physical problems. Objectives: 1. Teach students how to build mathematical models of physical problems. 2. Be able to solve modeling problems ...

  18. COMPUTATIONAL FLUID DYNAMICS MODELING OF SCALED HANFORD DOUBLE SHELL TANK MIXING - CFD MODELING SENSITIVITY STUDY RESULTS

    SciTech Connect (OSTI)

    JACKSON VL

    2011-08-31

    The primary purpose of the tank mixing and sampling demonstration program is to mitigate the technical risks associated with the ability of the Hanford tank farm delivery and certification systems to measure and deliver a uniformly mixed high-level waste (HLW) feed to the Waste Treatment and Immobilization Plant (WTP). Uniform feed to the WTP is a requirement of 24590-WTP-ICD-MG-01-019, ICD-19 - Interface Control Document for Waste Feed, although the exact definition of uniform is evolving in this context. Computational Fluid Dynamics (CFD) modeling has been used to assist in evaluating scale-up issues, study operational parameters, and predict mixing performance at full scale.

  19. A Fire Model for 2-D Computer Animation

    E-Print Network [OSTI]

    Yu, J-H.; Patterson, J.W.

    Yu, J-H.; Patterson, J.W. Proceedings of the EUROGRAPHICS Workshop on Computer Animation '96, Poitiers, France. Published in Eurographics Series (Boulic, R. and Hegron, G., Eds.), pp. 49-60, Springer.

  20. Edinburgh Research Explorer Computational models in systems biology

    E-Print Network [OSTI]

    Millar, Andrew J.

    ... called process algebras and Petri nets offer alternative ways of constructing computational models of systems ... networks in an abstract way that is independent of particular mathematical techniques of analysis.

  1. The Need for Biological Computation System Models | GE Global...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2012.10.09 Hello everyone, I'm Maria Zavodszky and I work in the Computational Biology and Biostatistics Lab at GE Global Research in Niskayuna, New York. This being our...

  2. Computational Approaches for Modelling Elastohydrodynamic Lubrication Using Multiphysics Software

    E-Print Network [OSTI]

    Jimack, Peter

    ... the Integral Approach (IA). The computational cost of performing this calculation is ..., for N points ... The benefits over developing highly specialised, bespoke software are highlighted. ... the complex nonlinear partial differential equation system representing the full problem. This includes ...

  3. Multiscale Computational Modeling of Multiphase Composites with Damage 

    E-Print Network [OSTI]

    Cheng, Feifei

    2013-11-01

    A multiscale computational framework for multiphase composites considering damage is developed in this research. In micro-scale, micromechanics based homogenization methods are used to estimate effective elastic moduli of graded Ti_(2)Al...

  4. Automatic Symmetry Detection for Model Checking Using Computational Group Theory

    E-Print Network [OSTI]

    Donaldson, A.F.; Miller, A.

    Donaldson, A.F.; Miller, A. Proceedings of the 13th International Symposium on Formal Methods Europe (FME 2005). Lecture Notes in Computer Science, volume 3582, pp. 481-496, Springer.

  5. Applications of Computer Modelling to Fire Safety Design 

    E-Print Network [OSTI]

    Torero, Jose L; Steinhaus, Thomas

    Tools in support of fire safety engineering design have proliferated in the last few years due to the increased performance of computers. These tools are currently being used in a generalized manner in areas such as egress, ...

  6. Cloud computing adoption model for governments and large enterprises

    E-Print Network [OSTI]

    Trivedi, Hrishikesh

    2013-01-01

    Cloud Computing has held organizations across the globe spellbound with its promise. As it moves from being a buzzword and hype into adoption, organizations are faced with the question of how best to adopt cloud. Existing ...

  7. Waste management project technical baseline description

    SciTech Connect (OSTI)

    Sederburg, J.P.

    1997-08-13

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

  8. Effective Design And Use Of Computer Decision Models.

    E-Print Network [OSTI]

    Fuerst, William L.

    1984-03-01

    of decision models such as simulation have not been demonstrated. This paper looks at recent literature regarding decision model deficiencies, evaluates selected financial simulation model packages, and suggests design needs for expanding the use of decision...

  9. Modeling-Computer Simulations At Dixie Valley Geothermal Area...

    Open Energy Info (EERE)

    W. Wisian, David D. Blackwell (2004) Numerical Modeling Of Basin And Range Geothermal Systems. Additional References. Retrieved from "http://en.openei.org/w/index.php?title=Model...

  10. Compensator models for fluence field modulated computed tomography

    SciTech Connect (OSTI)

    Bartolac, Steven; Jaffray, David; Radiation Medicine Program, Princess Margaret Hospital Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9

    2013-12-15

    Purpose: Fluence field modulated computed tomography (FFMCT) presents a novel approach for acquiring CT images, whereby a patient model guides dynamically changing fluence patterns in an attempt to achieve task-based, user-prescribed, regional variations in image quality, while also controlling dose to the patient. This work aims to compare the relative effectiveness of FFMCT applied to different thoracic imaging tasks (routine diagnostic CT, lung cancer screening, and cardiac CT) when the modulator is subject to limiting constraints, such as might be present in realistic implementations.

    Methods: An image quality plan was defined for a simulated anthropomorphic chest slice, including regions of high and low image quality, for each of the thoracic imaging tasks. Modulated fluence patterns were generated using a simulated annealing optimization script, which attempts to achieve the image quality plan under a global dosimetric constraint. Optimization was repeated under different types of modulation constraints (e.g., fixed or gantry-angle-dependent patterns, continuous or comprised of discrete apertures), with the most limiting case being a fixed conventional bowtie filter. For each thoracic imaging task, an image quality map (IQM_sd) representing the regionally varying standard deviation is predicted for each modulation method and compared to the prescribed image quality plan as well as against results from uniform fluence fields. Relative integral dose measures were also compared.

    Results: Each IQM_sd resulting from FFMCT showed improved agreement with planned objectives compared to those from uniform fluence fields for all cases. Dynamically changing modulation patterns yielded better uniformity, improved image quality, and lower dose compared to fixed filter patterns with optimized tube current. For the latter fixed-filter cases, the optimal choice of tube current modulation was found to depend heavily on the task. Average integral dose reduction compared to a uniform fluence field ranged from 10% using a bowtie filter to 40% or greater using an idealized modulator.

    Conclusions: The results support that FFMCT may achieve regionally varying image quality distributions in good agreement with user-prescribed values, while limiting dose. The imposition of constraints inhibits dose reduction capacity and agreement with image quality plans but still yields significant improvement over what is afforded by conventional dose minimization techniques. These results suggest that FFMCT can be implemented effectively even when the modulator has limited modulation capabilities.
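    The simulated-annealing optimization described in the abstract can be sketched in miniature. The following toy example (not the authors' code; the three-region geometry, the 1/sqrt(fluence) noise scaling, and all numbers are illustrative assumptions) anneals a fluence pattern toward a prescribed per-region noise plan under a global dose budget:

```python
import math
import random

def simulated_annealing(cost, x0, step, n_iter=20000, t0=1e-3, seed=0):
    """Generic simulated-annealing minimizer with a linear cooling schedule."""
    rng = random.Random(seed)
    x = list(x0)
    c = cost(x)
    best_x, best_c = list(x), c
    for i in range(n_iter):
        t = t0 * (1.0 - i / n_iter) + 1e-12      # temperature cools to ~0
        # Propose a perturbed, non-negative fluence pattern
        cand = [max(0.0, xi + rng.uniform(-step, step)) for xi in x]
        cc = cost(cand)
        # Accept improvements always; accept worse moves with Boltzmann prob.
        if cc < c or rng.random() < math.exp((c - cc) / t):
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = list(x), c
    return best_x, best_c

# Toy "image quality plan": prescribed noise (std. dev.) per region.
# Regions 0 and 2 tolerate high noise; region 1 demands low noise.
plan = [0.5, 0.2, 0.5]
dose_budget = 40.0

def cost(fluence):
    # Quantum-noise model: std. dev. ~ 1/sqrt(fluence) (illustrative)
    noise = [1.0 / math.sqrt(f + 1e-9) for f in fluence]
    mismatch = sum((n - p) ** 2 for n, p in zip(noise, plan))
    over = max(0.0, sum(fluence) - dose_budget)   # global dose constraint
    return mismatch + 10.0 * over ** 2

best, best_cost = simulated_annealing(cost, [5.0, 5.0, 5.0], step=1.0)
```

    The optimizer drives far more fluence into the high-quality region (toward 1/0.2² = 25 units) than into the low-quality regions (toward 4 units each), staying inside the dose budget.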

  11. Center for Programming Models for Scalable Parallel Computing: Future Programming Models

    SciTech Connect (OSTI)

    Gao, Guang, R.

    2008-07-24

    The mission of the pmodel center project is to develop software technology to support scalable parallel programming models for terascale systems. The goal of the specific UD subproject is to develop an efficient and robust methodology and tools for HPC programming. More specifically, the focus is on developing new programming models that help programmers port their applications onto parallel high-performance computing systems. During the course of the research over the past 5 years, the landscape of microprocessor chip architecture has witnessed a fundamental change: multi-core/many-core chip architectures have emerged as the mainstream technology and will have a major impact on future generations of parallel machines. The programming model for shared-address-space machines is becoming critical to such multi-core architectures. A highlight of our research is the in-depth study of proposed fine-grain parallelism/multithreading support on such future-generation multi-core architectures. Our research has demonstrated the significant impact such a fine-grain multithreading model can have on the productivity of parallel programming models and their efficient implementation.

  12. Agent-Based Computational Models: Generative Social Science

    E-Print Network [OSTI]

    Tesfatsion, Leigh

    · Bounded computing capacity · Explicit space · Local interactions · Non-explanatory notion. Sugarscape: events unfold on a landscape of renewable ...

  13. A.24 ENHANCING THE CAPABILITY OF COMPUTATIONAL EARTH SYSTEM MODELS AND NASA DATA FOR OPERATION AND ASSESSMENT

    E-Print Network [OSTI]

    A.24 ENHANCING THE CAPABILITY OF COMPUTATIONAL EARTH SYSTEM MODELS AND NASA DATA ... computational support of Earth system modeling. 2.1 Acceleration of Operational Use of Research Data

  14. Computer modeling reveals how surprisingly potent hepatitis C drug works

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  15. Computer-Aided Construction of Combustion Chemistry Models

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  16. The U-Machine: A Model of Generalized Computation

    E-Print Network [OSTI]

    MacLennan, Bruce

    classification: 68Q05. This report may be used for any non-profit purpose provided that the source is credited. ... in the broadest sense, including analog and digital, and provides a framework for computation using novel ... known, the end of Moore's Law is in sight, and we are approaching the limits of digital electronics

  17. WUFI COMPUTER MODELING WORKSHOP FOR WALL DESIGN AND PERFORMANCE

    E-Print Network [OSTI]

    Oak Ridge National Laboratory

    from rain, solar radiation and other crucial weather events on an hourly basis. Both vapor and liquid ... which can easily be selected from a map. The temporal behavior of the computed quantities (temperatures ... · performance prediction · Mold growth predictions (new post-processing modules) · Development

  18. Modeling Living Systems for Computer Vision Demetri Terzopoulos

    E-Print Network [OSTI]

    Terzopoulos, Demetri

    , for computer vision. First, I present a new breed of artificial animals in a physics-based virtual marine world ... vision. These advances center around the idea of artificial animals, or "animats", a term coined ... present our work on artificial fishes and animat vision. The basic idea in a nutshell

  19. Modelling Photochemical Pollution using Parallel and Distributed Computing Platforms

    E-Print Network [OSTI]

    Abramson, David

    of photochemical air pollution (smog) in industrialised cities. However, computational hardware demands can that have been used as part of an air pollution study being conducted in Melbourne, Australia. We also necessary to perform real air pollution studies. The system is used as part of the Melbourne Airshed study

  20. Modeling-Computer Simulations At Kilauea East Rift Geothermal...

    Open Energy Info (EERE)

    East Rift Zone Notes Three models were made from data collected in the exploratory well HGP-A. The models simulated constant heat sources from a vertical dike, a magma chamber,...

  1. An Axiomatisation of Computationally Adequate Domain Theoretic Models of FPC 

    E-Print Network [OSTI]

    Fiore, Marcelo P; Plotkin, Gordon

    1994-01-01

    Categorical models of the metalanguage FPC (a type theory with sums, products, exponentials and recursive types) are defined. Then, domain-theoretic models of FPC are axiomatised and a wide subclass of them, the ...

  2. Experimental Evaluations of Expert and Non-expert Computer Users' Mental Models of Security Risks

    E-Print Network [OSTI]

    Camp, L. Jean

    Experimental Evaluations of Expert and Non-expert Computer Users' Mental Models of Security Risks risks and thereby enable informed decisions by naive users. Yet computer security has not been en- gaged with the scholarship of risk communication. While the existence of malicious actors may appear at first to distinguish

  3. Creative Commons Copyright 2013 Some Rights Reserved CMC: A Model Computer Science Curriculum

    E-Print Network [OSTI]

    Iyer, Sridhar

    for K-12 Schools 3rd Edition, Released June 2013 Technical Report: TR-CSE-2013-52 Department of Computer-12 Schools Creative Commons Copyright © 2013 Some Rights Reserved 2 CMC: A Model Computer Science Curriculum for K-12 Schools 3rd Edition June, 2013 Authors Sridhar Iyer*, Farida Khan, Sahana Murthy

  4. An Exact Modeling of Signal Statistics in Energy-integrating X-ray Computed Tomography

    E-Print Network [OSTI]

    used by modern computed tomography (CT) scanners and has been an interesting research topic ... (i.i.d.), such as Gamma, Gaussian, etc., would be valid. A comparison study was performed to estimate the introduced errors

  5. Internship Parallel Computer Evaluation Parallelization of a Lagrangian Particle Diffusion Model

    E-Print Network [OSTI]

    A use case is a nuclear accident, such as a core meltdown at a nuclear power plant, in which radioactive material is released into the air. The Lagrangian model can predict how the radioactive cloud spreads under different ... that will be computed. Particle: one single molecule floating in the wind field. Compute unit: one unit that runs

  6. A computational model for nanoscale adhesion between deformable solids and its application to gecko adhesion

    E-Print Network [OSTI]

    A computational model for nanoscale adhesion between deformable solids and its application to gecko adhesion. Roger A. Sauer, Aachen Institute for Advanced Study in Computational Engineering Science (AICES), RWTH Aachen University, Templergraben 55, 52056 Aachen, Germany. Published in the Journal of Adhesion

  7. Author's personal copy Calibration procedures for a computational model of ductile fracture

    E-Print Network [OSTI]

    Hutchinson, John W.

    Author's personal copy Calibration procedures for a computational model of ductile fracture Z. Xue fracture Computational fracture Shear fracture Damage parameters a b s t r a c t A recent extension of the cup-cone fracture mode in the neck of a round tensile bar. Ductility of a notched round bar provides

  8. A Vast Machine Computer Models, Climate Data, and the Politics of Global Warming

    E-Print Network [OSTI]

    Edwards, Paul N.

    A Vast Machine Computer Models, Climate Data, and the Politics of Global Warming Paul N. Edwards models, climate data, and the politics of global warming / Paul N. Edwards. p. cm. Includes this: Global warming is a myth. It's all model predictions, nothing but simulations. Before you believe

  9. A Computational Model of Knowledge-Intensive Learning and Problem Solving1

    E-Print Network [OSTI]

    Aamodt, Agnar

    1 A Computational Model of Knowledge-Intensive Learning and Problem Solving1 Agnar Aamodt Knowledge. If knowledge-based systems are to become more competent and robust in solving real world problems, they need model - a framework -for knowledge-intensive problem solving and learning from experience. The model has

  10. COMPUTERS AND BIOMEDICAL RESEARCH 17, 580-589 ( 1984) Multicompartment Model of Lung Dynamics*

    E-Print Network [OSTI]

    Longtin, André

    COMPUTERS AND BIOMEDICAL RESEARCH 17, 580-589 (1984). Multicompartment Model of Lung Dynamics ... of the lungs. The lungs are represented by 24 compartments, each corresponding to a generation of the Weibel model A. In the model it is assumed that gases are transported in the lungs by convection and diffusion

  11. Lecture Notes in Computer Science 1 Data Reduction Using Multiple Models Integration

    E-Print Network [OSTI]

    Obradovic, Zoran

    Lecture Notes in Computer Science 1 Data Reduction Using Multiple Models Integration Aleksandar the models constructed on previously considered data samples. In addition to random sampling, controllable sampling based on the boosting algorithm is proposed, where the models are combined using a weighted voting

  12. Computational Fluid Dynamics Modeling of a Lithium/Thionyl Chloride Battery with Electrolyte Flow

    E-Print Network [OSTI]

    Wang, Chao-Yang

    Computational Fluid Dynamics Modeling of a Lithium/Thionyl Chloride Battery with Electrolyte Flow ... A ...-dimensional model is developed to simulate discharge of a primary lithium/thionyl chloride battery. The model ... to the first task with important examples of lead-acid, nickel-metal hydride, and lithium-based batteries

  13. DEVELOPMENT OF A COMPUTER SIMULATION MODEL FOR BLOWING GLASS CONTAINERS C. G. Giannopapa

    E-Print Network [OSTI]

    Eindhoven, Technische Universiteit

    DEVELOPMENT OF A COMPUTER SIMULATION MODEL FOR BLOWING GLASS CONTAINERS. C. G. Giannopapa, Dept. ... to be used for industrial purposes that accurately captures the blowing step of glass containers. The model ... This paper concentrates on modeling the blowing stage of the forming process of glass containers

  14. Computational modeling of damage evolution in unidirectional fiber reinforced ceramic matrix composites

    E-Print Network [OSTI]

    Ortiz, Michael

    The mechanical response of a ceramic matrix composite is simulated by a numerical model for a fiber-matrix unit ... damage evolution in brittle matrix composites was developed. This modeling is based on an axisymmetric unit cell

  15. April 30, 2013 Mathematical and Computer Modelling of Dynamical Systems criticalTransitions Mathematical and Computer Modelling of Dynamical Systems

    E-Print Network [OSTI]

    Gedeon, Tomas

    , from those appearing in physiology and ecology to Earth systems modeling, often experience critical

  16. Baseline Microstructural Characterization of Outer 3013 Containers

    SciTech Connect (OSTI)

    Zapp, Phillip E.; Dunn, Kerry A

    2005-07-31

    Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded, yielding production-quality outer 3013 containers; the third container examined was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and the flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

  17. Computational Modeling of Conventionally Reinforced Concrete Coupling Beams 

    E-Print Network [OSTI]

    Shastri, Ajay Seshadri

    2012-02-14

    The model is developed in the finite element analysis software ABAQUS. The concrete damaged plasticity model was used to simulate the behavior of concrete. A calibration model using a cantilever beam was produced to generate key parameters in the model ... Stress (ABAQUS 2008) ... 61; Fig. 3.9. CPS8 Element Used for Modeling Concrete (ABAQUS 2008) ... 64; Fig. 4.1. Elevation and Cross-Section of the Cantilever Beam ... 66; Fig. 4.2. Compressive Stress-Strain Behavior of Concrete...

  18. A scalable computational approach for modeling dynamic fracture of brittle solids in three dimensions

    E-Print Network [OSTI]

    Seagraves, Andrew Nathan

    2010-01-01

    In this thesis a new parallel computational method is proposed for modeling threedimensional dynamic fracture of brittle solids. The method is based on a combination of the discontinuous Galerkin (DG) formulation of the ...

  19. Towards a Computational Model of Musical Accompaniment: Disambiguation of Musical Analyses by Reference to Performance Data 

    E-Print Network [OSTI]

    Curry, Benjamin David

    A goal of Artificial Intelligence is to develop computational models of what would be considered intelligent behaviour in a human. One such task is that of musical performance. This research specifically focuses on aspects ...

  20. A Computational Market Model for Distributed Configuration Design Michael P. Wellman

    E-Print Network [OSTI]

    Wellman, Michael P.

    economies" constitutes the market solution to the original problem. After defining the configuration design ... Consider a hyper-simplified scenario in aircraft design. (We choose this not as a serious exemplar ...) A Computational Market Model for Distributed Configuration

  1. THEORETICAL MODELING AND COMPUTATIONAL SIMULATION OF ROBUST CONTROL FOR MARS AIRCRAFT

    E-Print Network [OSTI]

    Oh, Seyool

    2014-05-31

    The focus of this dissertation is the development of control system design algorithms for autonomous operation of an aircraft in the Martian atmosphere. This research will show theoretical modeling and computational ...

  2. Development of mathematical models and mathematical, computational framework for multi-media interaction processes

    E-Print Network [OSTI]

    Ma, Yongting

    2011-01-11

    This thesis presents development of mathematical models for multi-media interaction process using Eulerian description and associated computational infrastructure to obtain numerical solution of the initial value problems ...

  3. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  4. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  5. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  6. Formal computational models and non-standard finiteness

    E-Print Network [OSTI]

    Ayala-Rincón, Mauricio

    Finiteness in computation without NNO: f : P(Q × Pos × Tfin) → P(Q × Posfin × Q), such that Tfin = {h : fin({p ∈ Pos : h(p) ≠ b})}; X = P(Q × Pos × Tfin), St(X) = {S ⊆ X : f(S) ⊆ S}; i : St(X) → P(X) ...

  7. WIDE-BASELINE IMAGE CHANGE DETECTION Ziggy Jones Mike Brookes Pier Luigi Dragotti David Benton

    E-Print Network [OSTI]

    Dragotti, Pier Luigi

    the problem of wide-baseline image change detection and presents a method for identifying areas that have changed ... such as leaves rustling in the wind. The appearance of any planar region of the scene in two different images ... to produce fewer matches and require a larger computational effort. This paper utilises a novel

  8. Final Report for Integrated Multiscale Modeling of Molecular Computing Devices

    SciTech Connect (OSTI)

    Glotzer, Sharon C.

    2013-08-28

    In collaboration with researchers at Vanderbilt University, North Carolina State University, Princeton, and Oak Ridge National Laboratory, we developed multiscale modeling and simulation methods capable of modeling the synthesis, assembly, and operation of molecular electronics devices. Our role in this project included the development of coarse-grained molecular and mesoscale models and simulation methods capable of simulating the assembly of millions of organic conducting molecules and other molecular components into nanowires, crossbars, and other organized patterns.

  9. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    surrounding a vertically dipping prolate spheroid source during an active period of time-dependent deformation between 1995 and 2000 at Long Valley caldera. We model a rapid...

  10. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    and seismic data was conducted in 2003 to investigate the cause of recent uplift of the resurgent dome. Notes Modeling of deformation and microgravity data suggests...

  11. Advanced Computing Tools and Models for Accelerator Physics

    E-Print Network [OSTI]

    Ryne, Robert D.

    2008-01-01

    MODELS FOR ACCELERATOR PHYSICS. Robert D. Ryne, Lawrence ... tools for accelerator physics. Following an introduction I ... computing in accelerator physics. INTRODUCTION: To begin I

  12. Mathematical modeling and computer simulation of processes in energy systems

    SciTech Connect (OSTI)

    Hanjalic, K.C. )

    1990-01-01

    This book is divided into the following chapters: 1. Modeling techniques and tools (fundamental concepts of modeling); 2. Fluid flow, heat and mass transfer, chemical reactions, and combustion; 3. Processes in energy equipment and plant components (boilers, steam and gas turbines, IC engines, heat exchangers, pumps and compressors, nuclear reactors, steam generators and separators, energy transport equipment, energy convertors, etc.); 4. New thermal energy conversion technologies (MHD, coal gasification and liquefaction, fluidized-bed combustion, pulse-combustors, multistage combustion, etc.); 5. Combined cycles and plants, cogeneration; 6. Dynamics of energy systems and their components; 7. Integrated approach to energy systems modeling; and 8. Application of modeling in energy expert systems.

  13. Modeling-Computer Simulations At Long Valley Caldera Geothermal...

    Open Energy Info (EERE)

    LVEW are best matched using modeled solutions for a flow system consisting of a rock matrix with finite hydraulic conductivity cut by a steeply dipping fracture with infinite...

  14. Radial pulsations of neutron stars: computing alternative polytropic models regarding density and adiabatic index

    E-Print Network [OSTI]

    Vassilis Geroyannis; Georgios Kleftogiannis

    2014-06-14

    We revisit the problem of radial pulsations of neutron stars by computing four general-relativistic polytropic models, in which "density" and "adiabatic index" are involved with their discrete meanings: (i) "rest-mass density" or (ii) "mass-energy density" regarding the density, and (i) "constant" or (ii) "variable" regarding the adiabatic index. Considering the resulting four discrete combinations, we construct corresponding models and compute for each model the frequencies of the lowest three radial modes. Comparisons with previous results are made. The deviations of respective frequencies of the resolved models seem to exhibit a systematic behavior, an issue discussed here in detail.
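    As a much-simplified illustration of the stellar-structure side of such calculations, the sketch below integrates the classical Newtonian Lane-Emden equation for a polytrope rather than the general-relativistic equations used in the paper (the function name and step size are this sketch's own choices). The n = 1 polytrope has the analytic solution theta = sin(xi)/xi with first zero at pi, which serves as a built-in check:

```python
import math

def lane_emden_radius(n, h=1e-3):
    """Integrate the Lane-Emden equation
        theta'' + (2/xi) theta' + theta^n = 0,  theta(0)=1, theta'(0)=0,
    with classical RK4, returning the first zero xi_1 (the dimensionless
    stellar radius). The series expansion theta ~ 1 - xi^2/6 starts the
    integration just off the singular point xi = 0."""
    xi = 1e-4
    theta = 1.0 - xi * xi / 6.0
    dtheta = -xi / 3.0

    def f(xi, theta, dtheta):
        # max(theta, 0) guards against tiny negative values near the surface
        return dtheta, -(max(theta, 0.0) ** n) - 2.0 * dtheta / xi

    while theta > 0.0:
        k1 = f(xi, theta, dtheta)
        k2 = f(xi + h / 2, theta + h / 2 * k1[0], dtheta + h / 2 * k1[1])
        k3 = f(xi + h / 2, theta + h / 2 * k2[0], dtheta + h / 2 * k2[1])
        k4 = f(xi + h, theta + h * k3[0], dtheta + h * k3[1])
        theta += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        dtheta += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        xi += h
    return xi

xi1_n1 = lane_emden_radius(1.0)   # analytic value: pi
```

    The n = 0 case (uniform density) likewise recovers its analytic radius sqrt(6); a relativistic treatment would replace this ODE with the TOV equations and a pulsation analysis on top of the equilibrium model.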

  15. Gaussian Process Modeling and Computation in Engineering Applications 

    E-Print Network [OSTI]

    Pourhabib, Arash

    2014-07-08

    ... and predictive modeling for large datasets. First, we develop a spatial-temporal model for local wind fields in a wind farm with more than 200 wind turbines. Our framework utilizes the correlation among the derivatives of wind speeds to find a neighborhood...
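    The core computation behind Gaussian-process (kriging) models like these can be shown in a few lines. This is a minimal stdlib-only sketch, assuming a squared-exponential kernel and a noise-free interpolation setting; the helper names and toy data are hypothetical, not from the thesis. The posterior mean at a query point is k*ᵀ K⁻¹ y, which reproduces the training values exactly:

```python
import math

def rbf(x1, x2, ls=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-0.5 * ((x1 - x2) / ls) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def krige(xs, ys, xq, ls=1.0, nugget=1e-10):
    """Noise-free kriging predictor: posterior mean k*^T K^-1 y at xq."""
    K = [[rbf(xi, xj, ls) + (nugget if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, ys)              # alpha = K^-1 y
    return sum(rbf(xq, xi, ls) * ai for xi, ai in zip(xs, alpha))

xs, ys = [0.0, 1.0, 2.5], [0.0, 1.0, 0.5]
yq = krige(xs, ys, 1.0)   # reproduces the training value at x = 1.0
```

    Real engineering applications, as in the thesis, scale this idea to many dimensions and large datasets, where the O(n³) solve becomes the bottleneck that sparse or local approximations address.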

  16. Modeling civil violence: An agent-based computational approach

    E-Print Network [OSTI]

    Tesfatsion, Leigh

    ... I do so advisedly, recognizing that no political or social order is represented in the model ... of revolutions properly speaking. The dynamics of decentralized upheaval, rather than its political substance ... Against Central Authority: this model involves two categories of actors. "Agents" are members

  17. Statistical meta modeling of computer experiments Kriging as the main tool: two examples of computer experiments

    E-Print Network [OSTI]

    Huang, Su-Yun

    factors · Simulation codes with calibration parameters. Example: Designing Cellular Heat Exchangers, in Qian et al. (2006, ASME). Related to the autoregressive model in Kennedy and O'Hagan (2000). · x = (x1

  18. Didactyl: Toward a Useful Computational Model of Piano Fingering

    E-Print Network [OSTI]

    , ergonomics, cognition, and habit. Theoretically, the search space for an ideal ... to define evaluation corpora to allow comparisons between the models, and (3) ... evaluation corpora to reduce opportunity costs for future researchers working

  19. Computational tools for modeling and measuring chromosome structure

    E-Print Network [OSTI]

    Ross, Brian Christopher

    2012-01-01

    DNA conformation within cells has many important biological implications, but there are challenges both in modeling DNA due to the need for specialized techniques, and experimentally since tracing out in vivo conformations ...

  20. When Model Checking Met Deduction Computer Science Laboratory

    E-Print Network [OSTI]

    Clarke, Edmund M.

    Park, CA, Sep 19, 2014. "It is of course important that some efforts be made to verify ... hold in each case." Alan Turing (quoted by D. MacKenzie in Risk and Reason). N. Shankar, Model checking

  1. Scalable computational architecture for integrating biological pathway models

    E-Print Network [OSTI]

    Shiva, V. A

    2007-01-01

    A grand challenge of systems biology is to model the cell. The cell is an integrated network of cellular functions. Each cellular function, such as immune response, cell division, metabolism or apoptosis, is defined by an ...

  2. A Computational Model of How the Basal Ganglia Produce Sequences

    E-Print Network [OSTI]

    Berns, Gregory S.

    closely on known anatomy and physiology. First, we assume that the thalamic targets, which relay ascending ... the external globus pallidus (GPe) and the subthalamic nucleus (STN). As a test of the model, the system

  3. Computational Modeling of Combined Steam Pyrolysis and Hydrogasification of Ethanol

    E-Print Network [OSTI]

    Singh, S; Park, C S; Norbeck, J N

    2005-01-01

    Approximate modelling of coal pyrolysis. Fuel, 78(7), 825-829. Gonzalez, J.F. (2003). Pyrolysis of cherry stones: energy ... Journal of Analytical and Applied Pyrolysis, 67(1), 165-190.

  4. Computer support to run models of the atmosphere. Final report

    SciTech Connect (OSTI)

    Fung, I.

    1996-08-30

    This research is focused on a better quantification of the variations in CO2 exchanges between the atmosphere and biosphere and the factors responsible for these exchanges. The principal approach is to infer the variations in the exchanges from variations in the atmospheric CO2 distribution. The principal tool is a global three-dimensional tracer transport model that advects and convects CO2 in the atmosphere. The tracer model the authors used was developed at the Goddard Institute for Space Studies (GISS) and is derived from the GISS atmospheric general circulation model. A special run of the GCM is made to save high-frequency winds and mixing statistics for the tracer model.
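    The core of any tracer transport model is the advection of a concentration field by winds. A deliberately tiny stand-in (a first-order upwind scheme on a 1-D periodic ring, far simpler than the GISS model's 3-D transport; all names and numbers are this sketch's own) illustrates the conservative update:

```python
def advect_upwind(c, u, dx, dt, steps):
    """First-order upwind advection of tracer amounts c on a periodic
    1-D ring with constant wind u > 0. Stable when the CFL number
    u*dt/dx is at most 1; total tracer mass is conserved exactly."""
    n = len(c)
    cfl = u * dt / dx
    assert 0 < cfl <= 1, "CFL condition violated"
    c = list(c)
    for _ in range(steps):
        # Each cell keeps (1 - cfl) of its tracer and receives cfl
        # of the upwind neighbor's; index -1 wraps around the ring.
        c = [c[i] - cfl * (c[i] - c[i - 1]) for i in range(n)]
    return c

# A unit pulse of tracer in cell 5, advected 5 cells downstream
c0 = [0.0] * 20
c0[5] = 1.0
c1 = advect_upwind(c0, u=1.0, dx=1.0, dt=0.5, steps=10)
```

    With CFL = 0.5 the pulse's center of mass moves exactly 0.5 cells per step (to cell 10 after 10 steps) while the scheme's numerical diffusion spreads it binomially; higher-order schemes in production models reduce that spreading.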

  5. Modeling-Computer Simulations At White Mountains Area (Goff ...

    Open Energy Info (EERE)

    ... useful. DOE funding: Unknown. Notes: Review and identification of 24 potential sites for EGS development across the U.S., as well as modeling of the representative geologic systems...

  6. Modeling-Computer Simulations At Fenton Hill HDR Geothermal Area...

    Open Energy Info (EERE)

    ... useful. DOE funding: Unknown. Notes: Review and identification of 24 potential sites for EGS development across the U.S., as well as modeling of the representative geologic systems...

  7. Continuum-based Multiscale Computational Damage Modeling of Cementitous Composites 

    E-Print Network [OSTI]

    Kim, Sun-Myung

    2011-08-08

    , aggregates, and interfacial transition zone (ITZ) and interaction among components at meso-scale, and the interaction between reinforcements, such as fiber and carbon nanotubes (CNTs), and mortar matrix or the ITZ at nano-scale, in order to predict more ... of Advisory Committee: Dr. Rashid K. Abu Al-Rub. Based on continuum damage mechanics (CDM), an isotropic and anisotropic damage model coupled with a novel plasticity model for plain concrete is proposed in this research. Two different damage evolution laws...

  8. Computer

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  9. SRP Baseline Hydrogeologic Investigation, Phase 3

    SciTech Connect (OSTI)

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  10. SRP baseline hydrogeologic investigation, Phase 2

    SciTech Connect (OSTI)

    Bledsoe, H.W.

    1987-11-01

    As discussed in the program plan for the Savannah River Plant (SRP) Baseline Hydrogeologic Investigation, this program has been implemented for the purpose of updating and improving the current state of knowledge and understanding of the hydrogeologic systems underlying the plant site. The objective of the program is to install a series of observation well clusters (wells installed in each major water-bearing formation at the same site) at key locations across the plant site in order to: (1) provide detailed information on the lithology, stratigraphy, and groundwater hydrology, and (2) provide observation wells to monitor the groundwater quality, head relationships, gradients, and flow paths.

  11. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    SciTech Connect (OSTI)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. 
This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high consequence decision-making.

  12. Theoretical and computer models of detonation in solid explosives

    SciTech Connect (OSTI)

    Tarver, C.M.; Urtiew, P.A.

    1997-10-01

    Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Doring (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model, based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state, has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two-, and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.
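    The JWL product equation of state mentioned in this abstract has a standard closed form, P(V, E) = A(1 − ω/(R1·V))·exp(−R1·V) + B(1 − ω/(R2·V))·exp(−R2·V) + ωE/V, with V the relative volume. A minimal Python sketch follows; the coefficient values are of the rough magnitude of published TNT fits and serve only as an illustration, not as parameters from this paper:

```python
import math

def jwl_pressure(v, e, A, B, R1, R2, omega):
    """Jones-Wilkins-Lee product equation of state: pressure (GPa) as a
    function of relative volume v = V/V0 and detonation energy density e
    (GPa, i.e. energy per unit initial volume)."""
    return (A * (1.0 - omega / (R1 * v)) * math.exp(-R1 * v)
            + B * (1.0 - omega / (R2 * v)) * math.exp(-R2 * v)
            + omega * e / v)

# Illustrative coefficients of the rough magnitude of published TNT fits;
# real JWL parameters are calibrated against cylinder-test data.
A, B, R1, R2, OMEGA = 371.2, 3.231, 4.15, 0.95, 0.30
E0 = 7.0  # GPa, energy per unit initial volume (illustrative)

p_unexpanded = jwl_pressure(1.0, E0, A, B, R1, R2, OMEGA)  # products at v = 1
p_expanded = jwl_pressure(4.0, E0, A, B, R1, R2, OMEGA)    # after expansion
```

    Pressure falls monotonically as the products expand, which is the behavior a reactive flow model of this kind couples to the hydrodynamics.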

  13. A Vast Machine Computer Models, Climate Data, and the Politics of Global Warming

    E-Print Network [OSTI]

    Edwards, Paul N.

    A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming / Paul N. Edwards. Subject headings: Climatology--History; Meteorology--History; Climatology--Technological innovation; Global temperature

  14. Computational modeling of thermal conductivity of single walled carbon nanotube polymer composites

    E-Print Network [OSTI]

    Maruyama, Shigeo

    A model was developed to study the thermal conductivity of single walled carbon nanotube (SWNT)-polymer composites... the effects of resistance on the effective conductivity of composites were quantified. The present model is a useful tool

  15. The Application of L-systems and Developmental Models to Computer Art,

    E-Print Network [OSTI]

    McCormack, Jon

    The Application of L-systems and Developmental Models to Computer Art, Animation, and Music Synthesis Jon McCormack, B.Sc.(Hons), Grad Dip. Art (Film & Television) A Thesis Submitted for the Degree.3.1 Generative Modelling Systems ................................................ 8 1.3.2 Critical and Art

  16. Toward Cost-Sensitive Modeling for Intrusion Detection Computer Science Department

    E-Print Network [OSTI]

    Toward Cost-Sensitive Modeling for Intrusion Detection. Wenke Lee, Computer Science Department, North... Abstract: Intrusion detection systems need to maximize security while minimizing costs. In this paper, we study the problem of building cost-sensitive intrusion detection models. We examine the major cost

  17. STATISTICAL MODELING OF THE LUNG NODULES IN LOW DOSE COMPUTED TOMOGRAPHY SCANS OF THE CHEST

    E-Print Network [OSTI]

    Louisville, University of

    STATISTICAL MODELING OF THE LUNG NODULES IN LOW DOSE COMPUTED TOMOGRAPHY SCANS OF THE CHEST Amal in automatic detection of the lung nodules and is compared with respect to parametric nodule models in terms appearing in low dose CT (LDCT) scans of the human chest. Four types of common lung nodules are analyzed

  18. A two-layer granular landslide model for tsunami wave generation: Theory and computation

    E-Print Network [OSTI]

    Kirby, James T.

    A two-layer granular landslide model for tsunami wave generation: Theory and computation Gangfeng for granular landslide motion and tsunami wave generation. The landslide, either submarine or subaerial experiments on impulsive wave generation by subaerial granular landslides. Model results illustrate a complex

  19. Computational Strategies for Large-Scale MILP Transshipment Models for Heat Exchanger Network Synthesis

    E-Print Network [OSTI]

    Grossmann, Ignacio E.

    1 Computational Strategies for Large-Scale MILP Transshipment Models for Heat Exchanger Network Determining the minimum number of units is an important step in heat exchanger network synthesis (HENS Words heat exchanger network synthesis (HENS), transshipment model, mixed-integer linear programming

  20. Dose Reconstruction Using Computational Modeling of Handling a Particular Arsenic-73/Arsenic-74 Source 

    E-Print Network [OSTI]

    Stallard, Alisha M.

    2011-08-08

    . This prompted a reconstruction of the dose to the worker’s hands. The computer code MCNP was chosen to model the tasks that the worker performed to evaluate the potential nonuniform hand dose distribution. A model was constructed similar to the worker’s hands...

  1. A computational model for predicting damage evolution in laminated composite plates 

    E-Print Network [OSTI]

    Phillips, Mark Lane

    1999-01-01

    computationally tenable is shown herein. Due to the complicated nature of the many cracks and their interactions, a multi-scale micro-meso-local-global methodology is employed in order to model damage modes. Interface degradation is first modeled analytically...

  2. Modeling and Design of RF MEMS Structures Using Computationally Efficient Numerical Techniques

    E-Print Network [OSTI]

    Tentzeris, Manos

    Modeling and Design of RF MEMS Structures Using Computationally Efficient Numerical Techniques N. A Abstract The modeling of MEMS structures using MRTD is presented. Many complex RF structures have been communication systems efficiently and accurately. Specifically, micromachined structures such as MEMS

  3. Modeling and Optimization of RF-MEMS Reconfigurable Tuners with Computationally Efficient Time-Domain Techniques

    E-Print Network [OSTI]

    Tentzeris, Manos

    Modeling and Optimization of RF-MEMS Reconfigurable Tuners with Computationally Efficient Time of Technology, Atlanta, GA 30332 2 Raytheon Company, Tucson AZ, 85734 Abstract -- Modern RF-MEMS device design methods in which the FDTD technique can be used to model a reconfigurable RF-MEMS tuner. A new method

  4. An efficient computational model for macroscale simulations of moving contact lines

    E-Print Network [OSTI]

    Boyer, Edmond

    with CO2, for example). A major challenge in numerical simulations of moving contact linesAn efficient computational model for macroscale simulations of moving contact lines Y. Sui1 simulation of moving contact lines. The main purpose is to formulate and test a model wherein the macroscale

  5. Coupling remote sensing with computational fluid dynamics modelling to estimate lake chlorophyll-a concentration

    E-Print Network [OSTI]

    Coupling remote sensing with computational fluid dynamics modelling to estimate lake chlorophyll form 17 October 2000; accepted 1 June 2001 Abstract A remotely sensed image of Loch Leven, a shallow in the remotely sensed image. It is proposed that CFD modelling benefits the interpretation of remotely sensed

  6. GWU Department of Mathematics Topics in Model Theory: Classical and Computable

    E-Print Network [OSTI]

    Harizanov, Valentina S.

    framework for the notions of language, meaning, and truth. A model, a concept used in all of sciences course will be, in some sense, self-contained. We will start by reviewing the fundamental concepts­194 (survey chapter without proofs). (3) V. Harizanov, "Pure computable model theory," in the volume: Handbook

  7. Systems, methods and computer-readable media for modeling cell performance fade of rechargeable electrochemical devices

    DOE Patents [OSTI]

    Gering, Kevin L

    2013-08-27

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine its performance fade characteristics and analyzes the mechanistic level model to estimate performance fade over the aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing a second exchange current density.

  8. HIGH RESOLUTION FORWARD AND INVERSE EARTHQUAKE MODELING ON TERASCALE COMPUTERS

    E-Print Network [OSTI]

    Shewchuk, Jonathan

    highly populated seismic region in the U.S., it has well- characterized geological structures (including in characterizing earthquake source and basin material properties, a critical remaining challenge is to invert basin geology and earthquake sources, and to use this capability to model and forecast strong ground

  9. CASE FOR SUPPORT: Computational Modeling of Salience Sensitive Control

    E-Print Network [OSTI]

    Heinke, Dietmar

    in neural network modeling, machine learning, adaptive systems in general and self-organising systems] and verification of real-time systems [6]. A large amount of this research has been performed using the CADP verification environment, which is one of the most powerful tool suites available, boasting a spectrum

  10. A quantitative model of computation Dan R. Ghica

    E-Print Network [OSTI]

    Ghica, Dan

    languages. We define a Hyland-Ong-style games framework called slot games, which consists of HO games be nevertheless difficult, if not impossible, to handle using known operational techniques. Categories and Subject]: Modeling techniques. General Terms: Languages, Performance, Theory, Verifi- cation Keywords: Game semantics

  11. US Biofuels Baseline and impact of extending the

    E-Print Network [OSTI]

    Noble, James S.

    June 2011 US Biofuels Baseline and impact of extending the $0.45 ethanol blenders baseline projections for agricultural and biofuel markets.1 That baseline assumed current biofuel policy for cellulosic biofuels was assumed to expire at the end of 2012. This report compares a slightly modified

  12. CASTING DEFECT MODELING IN AN INTEGRATED COMPUTATIONAL MATERIALS ENGINEERING APPROACH

    SciTech Connect (OSTI)

    Sabau, Adrian S [ORNL

    2015-01-01

    To accelerate the introduction of new cast alloys, the simultaneous modeling and simulation of multiphysical phenomena needs to be considered in the design and optimization of mechanical properties of cast components. The required models related to casting defects, such as microporosity and hot tears, are reviewed. Three aluminum alloys are considered: A356, 356, and 319. The data on calculated solidification shrinkage is presented and its effects on microporosity levels discussed. Examples are given for predicting microporosity defects and microstructure distribution for a plate casting. Models to predict fatigue life and yield stress are briefly highlighted here for the sake of completion and to illustrate how the length scales of the microstructure features as well as porosity defects are taken into account for modeling the mechanical properties. Thus, the data on casting defects, including microstructure features, is crucial for evaluating the final performance-related properties of the component.

    ACKNOWLEDGEMENTS: This work was performed under a Cooperative Research and Development Agreement (CRADA) with Nemak Inc. and Chrysler Co. for the project "High Performance Cast Aluminum Alloys for Next Generation Passenger Vehicle Engines." The author would also like to thank Amit Shyam for reviewing the paper and Andres Rodriguez of Nemak Inc. Research sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office, as part of the Propulsion Materials Program under contract DE-AC05-00OR22725 with UT-Battelle, LLC. Part of this research was conducted through the Oak Ridge National Laboratory's High Temperature Materials Laboratory User Program, which is sponsored by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program.

  13. Computational models for the berry phase in semiconductor quantum dots

    SciTech Connect (OSTI)

    Prabhakar, S.; Melnik, R. V. N.; Sebetci, A.

    2014-10-06

    By developing a new model and its finite element implementation, we analyze the Berry phase in low-dimensional semiconductor nanostructures, focusing on quantum dots (QDs). In particular, we solve the Schrödinger equation and investigate the evolution of the spin dynamics during the adiabatic transport of the QDs in the 2D plane along a circular trajectory. Based on this study, we reveal that the Berry phase is highly sensitive to the Rashba and Dresselhaus spin-orbit lengths.

  14. A Variable Refrigerant Flow Heat Pump Computer Model in EnergyPlus

    SciTech Connect (OSTI)

    Raustad, Richard A. [Florida Solar Energy Center

    2013-01-01

    This paper provides an overview of the variable refrigerant flow heat pump computer model included with the Department of Energy's EnergyPlus™ whole-building energy simulation software. The mathematical model for a variable refrigerant flow heat pump operating in cooling or heating mode and a detailed model for the variable refrigerant flow direct-expansion (DX) cooling coil are described in detail.

  15. Verification of a VRF Heat Pump Computer Model in EnergyPlus

    SciTech Connect (OSTI)

    Nigusse, Bereket; Raustad, Richard

    2013-06-01

    This paper provides verification results for the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides a quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data, and found that the dual range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
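    The curve forms described in this abstract can be sketched directly; the coefficient values and rated capacity below are invented for illustration and are not drawn from any manufacturer's data:

```python
# Sketch of the bi-quadratic and quadratic performance-curve forms described
# above (all coefficients are illustrative, not from manufacturer's data).

def biquadratic(coeffs, x, y):
    """EnergyPlus-style bi-quadratic curve: a + b*x + c*x^2 + d*y + e*y^2 + f*x*y."""
    a, b, c, d, e, f = coeffs
    return a + b*x + c*x**2 + d*y + e*y**2 + f*x*y

def quadratic(coeffs, plr):
    """Quadratic part-load curve: a + b*PLR + c*PLR^2."""
    a, b, c = coeffs
    return a + b*plr + c*plr**2

# Hypothetical rated cooling capacity (W) and curve coefficients.
RATED_CAP = 10_000.0
CAP_FT = (0.9, 0.01, 0.0002, -0.005, -0.0001, 0.0003)  # capacity vs. temperatures
EIR_PLR = (0.1, 0.6, 0.3)                              # EIR modifier vs. part-load ratio

def cooling_capacity(t_indoor_wb, t_outdoor_db):
    """Available capacity = rated capacity scaled by the temperature curve."""
    return RATED_CAP * biquadratic(CAP_FT, t_indoor_wb, t_outdoor_db)

cap = cooling_capacity(19.4, 35.0)  # indoor wet-bulb / outdoor dry-bulb, deg C
```

    A "dual range" formulation simply keeps two coefficient sets per curve and switches between them at a boundary temperature or part-load ratio.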

  16. Global Nuclear Energy Partnership Waste Treatment Baseline

    SciTech Connect (OSTI)

    Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

    2008-05-01

    The Global Nuclear Energy Partnership (GNEP) program is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and on the performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

  17. Computational Human Performance Modeling For Alarm System Design

    SciTech Connect (OSTI)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations and on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.

  18. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect (OSTI)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  19. Modeling of BWR core meltdown accidents for application in the MELRPI.MOD2 computer code

    SciTech Connect (OSTI)

    Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

    1985-04-01

    This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

  20. Systems, methods and computer-readable media to model kinetic performance of rechargeable electrochemical devices

    DOE Patents [OSTI]

    Gering, Kevin L.

    2013-01-01

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics. The computing system also analyzes the cell information of the electrochemical cell with a Butler-Volmer (BV) expression modified to determine exchange current density of the electrochemical cell by including kinetic performance information related to pulse-time dependence, electrode surface availability, or a combination thereof. A set of sigmoid-based expressions may be included with the modified-BV expression to determine kinetic performance as a function of pulse time. The determined exchange current density may be used with the modified-BV expression, with or without the sigmoid expressions, to analyze other characteristics of the electrochemical cell. Model parameters can be defined in terms of cell aging, making the overall kinetics model amenable to predictive estimates of cell kinetic performance along the aging timeline.
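    The classic Butler-Volmer relation at the core of this patent abstract is standard electrochemistry; the sketch below pairs it with a generic logistic (sigmoid) factor in pulse time as a stand-in for the patent's modification. The logistic form and all parameter values here are assumptions for illustration, not the expressions actually claimed:

```python
import math

F = 96485.33212   # Faraday constant, C/mol
R = 8.314462618   # gas constant, J/(mol*K)

def butler_volmer(i0, eta, T=298.15, alpha_a=0.5, alpha_c=0.5):
    """Classic Butler-Volmer current density (per unit area) for exchange
    current density i0 and overpotential eta (V)."""
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

def sigmoid_pulse_factor(t, t_mid=1.0, tau=0.3):
    """Generic logistic factor in pulse time t (s): a hypothetical stand-in
    for the sigmoid-based pulse-time expressions described in the patent."""
    return 1.0 / (1.0 + math.exp(-(t - t_mid) / tau))

def pulse_current(i0, eta, t, **kw):
    """Effective exchange current density scaled by the pulse-time sigmoid,
    then fed through the Butler-Volmer expression."""
    return butler_volmer(i0 * sigmoid_pulse_factor(t), eta, **kw)
```

    With a factor like this, longer pulses recover more of the nominal exchange current density, which is one way pulse-time dependence can enter the kinetics.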

  1. Energy reconstruction in the Long-Baseline Neutrino Experiment

    E-Print Network [OSTI]

    Ulrich Mosel; Olga Lalakulich; Kai Gallmeister

    2014-04-24

    The Long-Baseline Neutrino Experiment aims at measuring fundamental physical parameters to high precision and exploring physics beyond the standard model. Nuclear targets introduce complications towards that aim. We investigate the uncertainties in the energy reconstruction, based on quasielastic scattering relations, due to nuclear effects. The reconstructed event distributions as a function of energy tend to be smeared out and shifted by several hundred MeV in their oscillatory structure if standard event selection is used. We show that a more restrictive experimental event selection offers the possibility to reach the accuracy needed for a determination of the mass ordering and the $CP$-violating phase. Quasielastic-based energy reconstruction could thus be a viable alternative to the calorimetric reconstruction also at higher energies.
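    The quasielastic-based reconstruction referred to here is conventionally done with the two-body CCQE formula, which assumes scattering off a neutron at rest with a binding-energy correction. A sketch with PDG masses and an illustrative binding energy (a generic textbook form, not necessarily the exact convention used in the paper):

```python
import math

# Particle masses in GeV (PDG values, rounded)
M_N = 0.93957   # neutron
M_P = 0.93827   # proton
M_MU = 0.10566  # muon

def e_nu_qe(e_mu, cos_theta, e_b=0.025):
    """Reconstructed neutrino energy (GeV) from muon energy e_mu (GeV) and
    scattering angle, assuming CCQE on a neutron at rest with binding
    energy e_b (GeV). e_b = 0.025 is an illustrative value."""
    p_mu = math.sqrt(e_mu**2 - M_MU**2)  # muon momentum
    m_eff = M_N - e_b                    # effective (bound) neutron mass
    dm2 = M_N**2 - M_P**2
    num = 2.0 * m_eff * e_mu - (e_b**2 - 2.0 * M_N * e_b + M_MU**2 + dm2)
    den = 2.0 * (m_eff - e_mu + p_mu * cos_theta)
    return num / den
```

    Nuclear effects (Fermi motion, multi-nucleon knockout, pion absorption) violate the at-rest two-body assumption, which is the source of the smearing and shifts the abstract describes.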

  2. Medical Nuclear Supply Chain Design: A Tractable Network Model and Computational

    E-Print Network [OSTI]

    Nagurney, Anna

    importance of considering waste management. Medical Nuclear Supply Chain Design: A Tractable Network Model and Computational Approach. Anna Nagurney (John F. Smith Memorial Professor, Isenberg School of Management) and Ladimer S. Nagurney

  3. Medical Nuclear Supply Chain Design: A Tractable Network Model and Computational Approach

    E-Print Network [OSTI]

    Nagurney, Anna

    operational cost minimization, and the minimization of cost associated with nuclear waste discarding, coupled, the underlying physics of radioactive decay, and the inclusion of waste management. We focus on Molybdenum-99 dueMedical Nuclear Supply Chain Design: A Tractable Network Model and Computational Approach Anna

  4. Many Task Computing for Modeling the Fate of Oil Discharged from the Deep Water Horizon Well

    E-Print Network [OSTI]

    Many Task Computing for Modeling the Fate of Oil Discharged from the Deep Water Horizon Well. Abstract: The Deep Water Horizon well blowout on April 20th 2010 discharged between 40,000 and 1.2 million... O. M. Knio, Dept of Mechanical Engineering, Johns Hopkins University, Baltimore, MD

  5. Computational Modeling of Neural Plasticity for Self-Organization of Neural Networks

    E-Print Network [OSTI]

    Jin, Yaochu

    Computational Modeling of Neural Plasticity for Self-Organization of Neural Networks Joseph Chrol on the learning per- formance of neural networks for accomplishing machine learning tasks such as classication, dynamics and learning per- formance of neural networks remains elusive. The purpose of this article

  6. In-Vehicle Testing and Computer Modeling of Electric Vehicle Batteries

    E-Print Network [OSTI]

    Wang, Chao-Yang

    In-Vehicle Testing and Computer Modeling of Electric Vehicle Batteries B. Thomas, W.B. Gu, J.edu Abstract A combined simulation and testing approach has been developed to evaluate battery packs in real accelerates battery development cycle, and enables innovative battery design and optimization. Several

  7. Computational Modelling of Kinase Signalling Cascades David Gilbert, Monika Heiner, Rainer Breitling, and Richard Orton

    E-Print Network [OSTI]

    Breitling, Rainer

    , Continuous Petri nets Computational modelling of intracellular biochemical networks has become a growth topic and analysis of such networks, as well as an increase in the quality and amount of experimentally determined such as bifurcations, robustness to interference, or oscillations are not obvious from the network topology

  8. Computational Modeling of Electrolyte/Cathode Interfaces in Proton Exchange Membrane Fuel Cells

    E-Print Network [OSTI]

    Bjørnstad, Ottar Nordal

    Computational Modeling of Electrolyte/Cathode Interfaces in Proton Exchange Membrane Fuel Cells Dr Proton exchange membrane fuel cells (PEMFCs) are alternative energy conversion devices that efficiently. The fundamental relationship between operating conditions and device performance will help to optimize the device

  9. A computational strategy for multiscale systems with applications to Lorenz 96 model

    E-Print Network [OSTI]

    Van Den Eijnden, Eric

    A computational strategy for multiscale systems with applications to Lorenz 96 model Ibrahim 2004 Available online Abstract Numerical schemes for systems with multiple spatio-temporal scales are investigated. The multiscale schemes use asymptotic results for this type of systems which guarantee

  10. P&P: a Combined Push-Pull Model for Resource Monitoring in Cloud Computing Environment

    E-Print Network [OSTI]

    Wang, Liqiang

    P&P: a Combined Push-Pull Model for Resource Monitoring in Cloud Computing Environment He Huang, various platforms and software. Resource monitoring involves collecting information of system resources to facilitate decision making by other components in Cloud environ- ment. It is the foundation of many major

  11. COMPUTATIONAL CHALLENGES IN THE NUMERICAL TREATMENT OF LARGE AIR POLLUTION MODELS

    E-Print Network [OSTI]

    Ostromsky, Tzvetan

    COMPUTATIONAL CHALLENGES IN THE NUMERICAL TREATMENT OF LARGE AIR POLLUTION MODELS I. DIMOV , K. GEORGIEVy, TZ. OSTROMSKY , R. J. VAN DER PASz, AND Z. ZLATEVx Abstract. The air pollution, and especially the reduction of the air pollution to some acceptable levels, is an important environmental problem, which

  12. MATHEMATICAL PERGAMON Mathematical and Computer Modelling 35 (2002) 1371-1375

    E-Print Network [OSTI]

    Gorban, Alexander N.

    2002-01-01

    Application to the Efficiency of Free Flow Turbines. A. Gorban', Institute of Computational Modeling, Russian... obstacle is considered. Its application to estimating the efficiency of free flow turbines is discussed... hydraulic turbines, i.e., the turbines that work without dams [1]. For this kind of turbine, the term

  13. A Computational Model of Aging and Calcification in the Aortic Heart Valve

    E-Print Network [OSTI]

    Mofrad, Mohammad R. K.

    A Computational Model of Aging and Calcification in the Aortic Heart Valve Eli J. Weinberg1 of America Abstract The aortic heart valve undergoes geometric and mechanical changes over time. The cusps of a normal, healthy valve thicken and become less extensible over time. In the disease calcific aortic

  14. Computational Modeling and the Experimental Plasma Research Program A White Paper Submitted to the FESAC Subcommittee

    E-Print Network [OSTI]

    Computational Modeling and the Experimental Plasma Research Program A White Paper Submitted of the fusion energy program. The experimental plasma research (EPR) program is well positioned to make major in fusion development and promote scientific discovery. Experimental plasma research projects explore

  15. PhD Position Available: integrative biomechanics, computational modeling, nonlinear dynamics

    E-Print Network [OSTI]

    Clewley, Robert

    PhD Position Available: integrative biomechanics, computational modeling, nonlinear dynamics and mathematical analysis of biomechanical and neural control systems. We are looking for an excellent and highly.edu/~biodhe/#Research). These are being used to study the Crayfish swim escape mechanism as a case study in integrative biomechanical

  16. Computational framework for modeling the dynamic evolution of large-scale multi-agent organizations

    E-Print Network [OSTI]

    Lazar, Alina

    disciplines as well. This is because the emergence of such a social structure can have a profound impact 48202 ABSTRACT The process by which complex social entities such as the state emerged from lower level on the societies' physical and social environment. However, the task of developing realistic computational models

  17. DOE Issues Funding Opportunity for Advanced Computational and Modeling Research for the Electric Power System

    Broader source: Energy.gov [DOE]

    The objective of this Funding Opportunity Announcement (FOA) is to leverage scientific advancements in mathematics and computation for application to power system models and software tools, with the long-term goal of enabling real-time protection and control based on wide-area sensor measurements.

  18. CloudSim: A Novel Framework for Modeling and Simulation of Cloud Computing Infrastructures and Services

    E-Print Network [OSTI]

    Buyya, Rajkumar

    infrastructure (hardware, software, services) for different application and service models under varying load problem to tackle. To simplify this process, in this paper we propose CloudSim: a new generalized Cloud computing infrastructures and management services. The simulation framework has the following

  19. Entrainment to Periodic Initiation and Transition Rates in a Computational Model for Gene Translation

    E-Print Network [OSTI]

    Margaliot, Michael

    1 Entrainment to Periodic Initiation and Transition Rates in a Computational Model for Gene, the biological system must entrain or phase-lock to the periodic excitation. Entrainment is also important in synthetic biology. For example, connecting several artificial biological systems that entrain to a common

  20. Entrainment to Periodic Initiation and Transition Rates in a Computational Model for Gene Translation

    E-Print Network [OSTI]

    Sontag, Eduardo

    Entrainment to Periodic Initiation and Transition Rates in a Computational Model for Gene Translation. ... to the solar day. In the terminology of systems theory, the biological system must entrain or phase-lock to the periodic excitation. Entrainment is also important in synthetic biology; for example, connecting several ...
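
Phase-locking to a periodic excitation can be illustrated with a generic forced phase oscillator (a toy model, not the gene-translation model of these papers): d&theta;/dt = &omega; + K sin(&Omega;t − &theta;). When the coupling K exceeds the detuning |&Omega; − &omega;|, the oscillator's average frequency entrains to the forcing frequency &Omega;.

```python
import math

def locked_frequency(omega, Omega, K, dt=1e-3, steps=100_000):
    """Euler-integrate d(theta)/dt = omega + K*sin(Omega*t - theta) and
    return the oscillator's average frequency over the second half of
    the run (after transients have decayed)."""
    theta, t = 0.0, 0.0
    half = steps // 2
    theta_mid = 0.0
    for i in range(steps):
        theta += dt * (omega + K * math.sin(Omega * t - theta))
        t += dt
        if i == half:
            theta_mid = theta          # phase at the midpoint
    return (theta - theta_mid) / (dt * (steps - half - 1))

# Detuning |1.2 - 1.0| = 0.2 < K = 0.5, so the oscillator entrains
# and runs at the forcing frequency 1.2, not its natural frequency 1.0:
f = locked_frequency(omega=1.0, Omega=1.2, K=0.5)
```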

  1. 3D Bone Microarchitecture Modeling and Fracture Risk Department of Computer

    E-Print Network [OSTI]

    Buffalo, State University of New York

    3D Bone Microarchitecture Modeling and Fracture Risk Prediction. Hui Li, Department of Computer ... will also rise. It calls for innovative research on understanding of osteoporosis and fracture mechanisms ... state-of-the-art probabilistic approach to analyze bone fracture risk factors including demographic attributes and lifestyles.

  2. Conditional Spectrum Computation Incorporating Multiple Causal Earthquakes and Ground-Motion Prediction Models

    E-Print Network [OSTI]

    Baker, Jack W.

    Conditional Spectrum Computation Incorporating Multiple Causal Earthquakes and Ground-Motion Prediction Models, by Ting Lin, Stephen C. Harmsen, Jack W. Baker, and Nicolas Luco. Abstract: The conditional ... uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties ...

  3. Modeling Computational Security in Long-Lived Systems, by Ran Canetti, Ling Cheung, and Dilsun Kaynar

    E-Print Network [OSTI]

    International Association for Cryptologic Research (IACR)

    Modeling Computational Security in Long-Lived Systems, by Ran Canetti, Ling Cheung, and Dilsun Kaynar. Introduction: Computational security in long-lived systems concerns security properties of cryptographic protocols ... computational power. This type of security degrades progressively over the lifetime of a protocol. However, some ...

  4. CloudAnalyst: A CloudSim-based Visual Modeller for Analysing Cloud Computing Environments and Applications

    E-Print Network [OSTI]

    Buyya, Rajkumar

    CloudAnalyst: A CloudSim-based Visual Modeller for Analysing Cloud Computing Environments and Applications Bhathiya Wickremasinghe1 , Rodrigo N. Calheiros2 , and Rajkumar Buyya1 1 The Cloud Computing and Distributed Systems (CLOUDS) Laboratory Department of Computer Science and Software Engineering The University

  5. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

    SciTech Connect (OSTI)

    Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

    2010-12-01

    The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model run in CFD solver software, STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated. The model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

  6. LTC vacuum blasting machine (concrete): Baseline report

    SciTech Connect (OSTI)

    1997-07-31

    The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the testing demonstration took place outdoors, which may make the results inaccurate; it is likely that dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

  7. LTC vacuum blasting machine (metal): Baseline report

    SciTech Connect (OSTI)

    1997-07-31

    The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is likely that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  8. Pentek metal coating removal system: Baseline report

    SciTech Connect (OSTI)

    1997-07-31

    The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is likely that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  9. Gated integrator with signal baseline subtraction

    DOE Patents [OSTI]

    Wang, X.

    1996-12-17

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.
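
The baseline-subtraction principle described in this patent can be mimicked numerically: hold the signal value sampled just before the gate opens, then integrate the difference over the gate window. A sketch (a numerical analogy with made-up values, not the patented circuit):

```python
import numpy as np

def gated_integral(t, v, gate_start, gate_end):
    """Integrate (v - baseline) over the gate window; the baseline is
    the signal value 'sampled and held' just before the gate opens."""
    dt = t[1] - t[0]
    baseline = v[t < gate_start][-1]               # held DC offset
    mask = (t >= gate_start) & (t <= gate_end)     # gate window
    return np.sum(v[mask] - baseline) * dt         # rectangle-rule integral

# Unit-area pulse riding on a 0.5 V DC offset
t = np.linspace(0.0, 10.0, 10001)                  # 1 ms sampling
v = np.where((t >= 4.0) & (t < 5.0), 1.0, 0.0) + 0.5
area = gated_integral(t, v, gate_start=3.0, gate_end=6.0)
```

Without the subtraction, the offset would add 0.5 V x 3 s = 1.5 V·s of error to the 1.0 V·s pulse area; with it, only the pulse contributes.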

  10. Gated integrator with signal baseline subtraction

    DOE Patents [OSTI]

    Wang, Xucheng (Lisle, IL)

    1996-01-01

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

  11. SRS baseline hydrogeologic investigation: Summary report

    SciTech Connect (OSTI)

    Bledsoe, H.W.; Aadland, R.K. (Westinghouse Savannah River Co., Aiken, SC (United States)); Sargent, K.A. (Furman Univ., Greenville, SC (United States). Dept. of Geology)

    1990-11-01

    Work on the Savannah River Site (SRS) Baseline Hydrogeologic Investigation began in 1983 when it was determined that the knowledge of the plant hydrogeologic systems needed to be expanded and improved in response to changing stratigraphic and hydrostratigraphic terminology and increased involvement by regulatory agencies (Bledsoe, 1984). Additionally, site-wide data were needed to determine flow paths, gradients, and velocities associated with the different aquifers underlying the plant site. The program was divided into three phases in order to allow the results of one phase to be evaluated and necessary changes and improvements incorporated into the following phases. This report summarizes the results of all three phases and includes modified graphic logs, lithologic descriptions of the different geologic formations, profiles of each cluster site, hydrostratigraphic cross sections, hydrographs of selected wells within each cluster for the first full year of uninterrupted water level measurements, potentiometric maps developed from data collected from all clusters, completion diagrams for each well, and a summary of laboratory tests. Additionally, the proposed new classification of hydrostratigraphic units at SRS (Aadland and Bledsoe, 1990) has been incorporated.

  12. U.S. Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2008-09-12

    The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

  13. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (DUNE) at Fermilab, Batavia, Illinois, and the Sanford Underground Research Facility, Lead, South Dakota.

  14. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

    Broader source: Energy.gov (indexed) [DOE]

    May 27, 2015. EA-1943: Draft Environmental Assessment, Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois, and the ...

  15. Updates to the International Linear Collider Damping Rings Baseline...

    Office of Scientific and Technical Information (OSTI)

    Updates to the International Linear Collider Damping Rings Baseline Design.

  16. Cost and Performance Comparison Baseline for Fossil Energy Power...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective: To establish baseline performance and cost estimates for today's ...

  17. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

    Open Energy Info (EERE)

    ... inventory. Resource Type: Guide/manual. Website: cdm.unfccc.intpublicinputsmethacm0001index.html. Cost: Free. Language: English. References: UNFCCC-Consolidated baseline and ...

  18. Estimation and Analysis of Life Cycle Costs of Baseline Enhanced...

    Open Energy Info (EERE)

    Estimation and Analysis of Life Cycle Costs of Baseline Enhanced Geothermal Systems Geothermal Project. Last modified on July 22, 2011. Project Title ...

  19. California Baseline Energy Demands to 2050 for Advanced Energy Pathways

    E-Print Network [OSTI]

    McCarthy, Ryan; Yang, Christopher; Ogden, Joan M.

    2008-01-01

    CEC (2005b) Energy demand forecast methods report. ... growth in California energy demands forecast in the baseline ... 2006-2016: Staff energy demand forecast (Revised September ...

  20. Seismic baseline and induction studies- Roosevelt Hot Springs...

    Open Energy Info (EERE)

    Report: Seismic baseline and induction studies - Roosevelt Hot Springs, Utah, and Raft River, Idaho.

  1. ITP Distributed Energy: 2008 Combined Heat and Power Baseline...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2008 Combined Heat and Power Baseline Assessment and Action Plan for the Nevada Market Final Project Report September 30, 2008 Prepared By: Pacific Region Combined Heat and...

  2. Computer modeling of electromagnetic edge containment in twin-roll casting

    SciTech Connect (OSTI)

    Chang, F.C.; Turner, L.R.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

    1998-07-01

    This paper presents modeling studies of magnetohydrodynamics (MHD) analysis in twin-roll casting. Argonne National Laboratory (ANL) and Inland Steel Company have worked together to develop a 3-D computer model that can predict eddy currents, fluid flows, and liquid metal containment for an electromagnetic (EM) edge containment device. This mathematical model can greatly shorten casting research on the use of EM fields for liquid metal containment and control. It can also optimize the existing casting processes and minimize expensive, time-consuming full-scale testing. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in EM edge dams designed at Inland Steel for twin-roll casting. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve Maxwell's equations, Ohm's law, Navier-Stokes equations, and transport equations of turbulence flow in a casting process that uses EM fields. ELEKTRA is able to predict the eddy-current distribution and electromagnetic forces in complex geometry. CaPS-EM is capable of modeling fluid flows with free-surfaces and dynamic rollers. The computed 3-D magnetic fields and induced eddy currents in ELEKTRA are used as input to flow-field computations in CaPS-EM. Results of the numerical simulation compared well with measurements obtained from both static and dynamic tests.

  3. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  4. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect (OSTI)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  5. An efficient method for computing genus expansions and counting numbers in the Hermitian matrix model

    E-Print Network [OSTI]

    Gabriel Álvarez; Luis Martínez Alonso; Elena Medina

    2011-01-14

    We present a method to compute the genus expansion of the free energy of Hermitian matrix models from the large N expansion of the recurrence coefficients of the associated family of orthogonal polynomials. The method is based on the Bleher-Its deformation of the model, on its associated integral representation of the free energy, and on a method for solving the string equation which uses the resolvent of the Lax operator of the underlying Toda hierarchy. As a byproduct we obtain an efficient algorithm to compute generating functions for the enumeration of labeled k-maps which does not require the explicit expressions of the coefficients of the topological expansion. Finally we discuss the regularization of singular one-cut models within this approach.
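
The genus expansion referred to in the abstract is the standard large-N topological expansion of the free energy; as a reminder (standard notation, not taken from this paper):

```latex
F(N) \;=\; \log Z_N \;\sim\; \sum_{g \ge 0} N^{2-2g} F_g , \qquad N \to \infty ,
```

where each coefficient F_g serves as a generating function for the enumeration of maps of genus g, which is why an efficient algorithm for the F_g also counts labeled k-maps.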

  6. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

    DOE Patents [OSTI]

    Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

    2010-05-04

    A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.
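
The hybridization thermodynamics the patent addresses are sequence-dependent; as a much simpler illustration of that dependence (the classic Wallace rule for short primers, far cruder than the nearest-neighbor thermodynamics a system like this would use, and not the patent's method):

```python
def wallace_tm(seq):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C) in degrees C, a rough
    estimate of melting temperature for short (~14-20 nt) primers."""
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    return 2 * at + 4 * gc

tm = wallace_tm("ATGCGCATTA")   # 6 A/T and 4 G/C bases -> 2*6 + 4*4 = 28
```

GC-rich primers melt higher, which is one of the quantities a PCR design tool must balance across a primer set.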

  7. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect (OSTI)

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. 
At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user-base and the experimental validation base was decaying away quickly.

  8. Computational Modeling of Blood Flow in the TrapEase Inferior Vena Cava Filter

    SciTech Connect (OSTI)

    Singer, M A; Henshaw, W D; Wang, S L

    2008-02-04

    To evaluate the flow hemodynamics of the TrapEase vena cava filter using three dimensional computational fluid dynamics, including simulated thrombi of multiple shapes, sizes, and trapping positions. The study was performed to identify potential areas of recirculation and stagnation and areas in which trapped thrombi may influence intrafilter thrombosis. Computer models of the TrapEase filter, thrombi (volumes ranging from 0.25mL to 2mL, 3 different shapes), and a 23mm diameter cava were constructed. The hemodynamics of steady-state flow at Reynolds number 600 was examined for the unoccluded and partially occluded filter. Axial velocity contours and wall shear stresses were computed. Flow in the unoccluded TrapEase filter experienced minimal disruption, except near the superior and inferior tips where low velocity flow was observed. For spherical thrombi in the superior trapping position, stagnant and recirculating flow was observed downstream of the thrombus; the volume of stagnant flow and the peak wall shear stress increased monotonically with thrombus volume. For inferiorly trapped spherical thrombi, marked disruption to the flow was observed along the cava wall ipsilateral to the thrombus and in the interior of the filter. Spherically shaped thrombus produced a lower peak wall shear stress than conically shaped thrombus and a larger peak stress than ellipsoidal thrombus. We have designed and constructed a computer model of the flow hemodynamics of the TrapEase IVC filter with varying shapes, sizes, and positions of thrombi. The computer model offers several advantages over in vitro techniques including: improved resolution, ease of evaluating different thrombus sizes and shapes, and easy adaptation for new filter designs and flow parameters. 
Results from the model also support a previously reported finding from photochromic experiments that suggest the inferior trapping position of the TrapEase IVC filter leads to an intra-filter region of recirculating/stagnant flow with very low shear stress that may be thrombogenic.
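
The simulated condition (Reynolds number 600 in a 23 mm cava) implies a mean inflow velocity via Re = &rho;vD/&mu;. A quick check, assuming typical blood properties (density 1060 kg/m^3 and viscosity 3.5 mPa·s, which are not stated in the record):

```python
def mean_velocity_from_reynolds(Re, D, rho, mu):
    """Invert Re = rho * v * D / mu for the mean velocity v (m/s)."""
    return Re * mu / (rho * D)

rho = 1060.0   # kg/m^3, assumed typical blood density
mu = 3.5e-3    # Pa*s, assumed typical blood viscosity
D = 0.023      # m, the 23 mm cava diameter from the abstract
v = mean_velocity_from_reynolds(600.0, D, rho, mu)   # roughly 0.09 m/s
```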

  9. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

    SciTech Connect (OSTI)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

    2015-08-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
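
One of the baselines the paper builds on is a persistence model, which simply carries the most recent observation forward; any new forecasting method must beat its error to claim an improvement. A minimal sketch with hypothetical hourly PV output:

```python
def persistence_forecast(history, horizon):
    """Naive persistence: every forecast step equals the last observation."""
    return [history[-1]] * horizon

def rmse(pred, actual):
    """Root-mean-square error between forecast and observed values."""
    return (sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred)) ** 0.5

# Hypothetical hourly PV output (MW)
observed = [0.0, 1.2, 3.5, 5.1, 6.0, 5.8, 4.9]
forecast = persistence_forecast(observed[:4], horizon=3)   # forecast hours 4-6
baseline_error = rmse(forecast, observed[4:])
```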

  10. On the Use of Computational Models for Wave Climate Assessment in Support of the Wave Energy Industry

    E-Print Network [OSTI]

    Victoria, University of

    On the Use of Computational Models for Wave Climate Assessment in Support of the Wave Energy Industry. Effective, economic extraction of ocean wave energy requires an intimate understanding of the ocean wave ...

  11. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    SciTech Connect (OSTI)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-02-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  12. DualTrust: A Trust Management Model for Swarm-Based Autonomic Computing Systems

    SciTech Connect (OSTI)

    Maiden, Wendy M.

    2010-05-01

    Trust management techniques must be adapted to the unique needs of the application architectures and problem domains to which they are applied. For autonomic computing systems that utilize mobile agents and ant colony algorithms for their sensor layer, certain characteristics of the mobile agent ant swarm -- their lightweight, ephemeral nature and indirect communication -- make this adaptation especially challenging. This thesis looks at the trust issues and opportunities in swarm-based autonomic computing systems and finds that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarm. After analyzing the applicability of trust management research as it has been applied to architectures with similar characteristics, this thesis specifies the required characteristics for trust management mechanisms used to monitor the trustworthiness of entities in a swarm-based autonomic computing system and describes a trust model that meets these requirements.

  13. Experimental validation of a kilovoltage x-ray source model for computing imaging dose

    SciTech Connect (OSTI)

    Poirier, Yannick, E-mail: yannick.poirier@cancercare.mb.ca [CancerCare Manitoba, 675 McDermot Ave, Winnipeg, Manitoba R3E 0V9 (Canada)]; Kouznetsov, Alexei; Koger, Brandon [Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada)]; Tambasco, Mauro, E-mail: mtambasco@mail.sdsu.edu [Department of Physics, San Diego State University, San Diego, California 92182-1233 and Department of Physics and Astronomy and Department of Oncology, University of Calgary, Calgary, Alberta T2N 1N4 (Canada)]

    2014-04-15

    Purpose: To introduce and validate a kilovoltage (kV) x-ray source model and characterization method to compute absorbed dose accrued from kV x-rays. Methods: The authors propose a simplified virtual point source model and characterization method for a kV x-ray source. The source is modeled by: (1) characterizing the spatial spectral and fluence distributions of the photons at a plane at the isocenter, and (2) creating a virtual point source from which photons are generated to yield the derived spatial spectral and fluence distribution at the isocenter of an imaging system. The spatial photon distribution is determined by in-air relative dose measurements along the transverse (x) and radial (y) directions. The spectrum is characterized using transverse-axis half-value layer measurements and the nominal peak potential (kVp). This source modeling approach is used to characterize a Varian® on-board imager (OBI®) for four default cone-beam CT beam qualities: beams using a half bowtie filter (HBT) with 110 and 125 kVp, and a full bowtie filter (FBT) with 100 and 125 kVp. The source model and characterization method were validated by comparing dose computed by the authors' in-house software (kVDoseCalc) to relative dose measurements in a homogeneous and a heterogeneous block phantom comprised of tissue, bone, and lung-equivalent materials. Results: The characterized beam qualities and spatial photon distributions are comparable to values reported in the literature. Agreement between computed and measured percent depth-dose curves is within 2% in the homogeneous block phantom and within 2.5% in the heterogeneous block phantom. Transverse-axis profiles taken at depths of 2 and 6 cm in the homogeneous block phantom show agreement within 4%. All transverse-axis dose profiles in water, bone, and lung-equivalent materials for beams using a HBT agree within 5%. Measured profiles of FBT beams in bone and lung-equivalent materials were higher than their computed counterparts, resulting in agreement within 2.5%, 5%, and 8% in solid water, bone, and lung, respectively. Conclusions: The proposed virtual point source model and characterization method can be used to compute absorbed dose in both the homogeneous and heterogeneous block phantoms to within 2%–8% of measured values, depending on the phantom and the beam quality. The authors' results also provide experimental validation for their kV dose computation software, kVDoseCalc.
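As a rough illustration of how a half-value layer (HVL) measurement pins down beam penetration, the following sketch converts an HVL to a linear attenuation coefficient under a narrow-beam, monoenergetic-equivalent assumption; the 7.6 mm Al value is illustrative, not taken from the paper:

```python
import math

def mu_from_hvl(hvl_mm: float) -> float:
    """Linear attenuation coefficient (1/mm) implied by a half-value layer."""
    return math.log(2.0) / hvl_mm

def transmitted_fraction(mu: float, thickness_mm: float) -> float:
    """Narrow-beam exponential attenuation I/I0 = exp(-mu * t)."""
    return math.exp(-mu * thickness_mm)

# Illustrative: a beam with a measured HVL of 7.6 mm Al
mu_al = mu_from_hvl(7.6)
print(round(mu_al, 4))
print(round(transmitted_fraction(mu_al, 7.6), 3))  # at one HVL the fraction is 0.5
```

By construction, transmitting through exactly one HVL of material halves the intensity, which is a quick sanity check on any such characterization.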

  14. The LBNO long-baseline oscillation sensitivities with two conventional neutrino beams at different baselines

    E-Print Network [OSTI]

    LAGUNA-LBNO Collaboration; :; S. K. Agarwalla; L. Agostino; M. Aittola; A. Alekou; B. Andrieu; F. Antoniou; R. Asfandiyarov; D. Autiero; O. Bésida; A. Balik; P. Ballett; I. Bandac; D. Banerjee; W. Bartmann; F. Bay; B. Biskup; A. M. Blebea-Apostu; A. Blondel; M. Bogomilov; S. Bolognesi; E. Borriello; I. Brancus; A. Bravar; M. Buizza-Avanzini; D. Caiulo; M. Calin; M. Calviani; M. Campanelli; C. Cantini; G. Cata-Danil; S. Chakraborty; N. Charitonidis; L. Chaussard; D. Chesneanu; F. Chipesiu; P. Crivelli; J. Dawson; I. De Bonis; Y. Declais; P. Del Amo Sanchez; A. Delbart; S. Di Luise; D. Duchesneau; J. Dumarchez; I. Efthymiopoulos; A. Eliseev; S. Emery; T. Enqvist; K. Enqvist; L. Epprecht; A. N. Erykalov; T. Esanu; D. Franco; M. Friend; V. Galymov; G. Gavrilov; A. Gendotti; C. Giganti; S. Gilardoni; B. Goddard; C. M. Gomoiu; Y. A. Gornushkin; P. Gorodetzky; A. Haesler; T. Hasegawa; S. Horikawa; K. Huitu; A. Izmaylov; A. Jipa; K. Kainulainen; Y. Karadzhov; M. Khabibullin; A. Khotjantsev; A. N. Kopylov; A. Korzenev; S. Kosyanenko; D. Kryn; Y. Kudenko; P. Kuusiniemi; I. Lazanu; C. Lazaridis; J. -M. Levy; K. Loo; J. Maalampi; R. M. Margineanu; J. Marteau; C. Martin-Mari; V. Matveev; E. Mazzucato; A. Mefodiev; O. Mineev; A. Mirizzi; B. Mitrica; S. Murphy; T. Nakadaira; S. Narita; D. A. Nesterenko; K. Nguyen; K. Nikolics; E. Noah; Yu. Novikov; A. Oprima; J. Osborne; T. Ovsyannikova; Y. Papaphilippou; S. Pascoli; T. Patzak; M. Pectu; E. Pennacchio; L. Periale; H. Pessard; B. Popov; M. Ravonel; M. Rayner; F. Resnati; O. Ristea; A. Robert; A. Rubbia; K. Rummukainen; A. Saftoiu; K. Sakashita; F. Sanchez-Galan; J. Sarkamo; N. Saviano; E. Scantamburlo; F. Sergiampietri; D. Sgalaberna; E. Shaposhnikova; M. Slupecki; D. Smargianaki; D. Stanca; R. Steerenberg; A. R. Sterian; P. Sterian; S. Stoica; C. Strabel; J. Suhonen; V. Suvorov; G. Toma; A. Tonazzo; W. H. Trzaska; R. Tsenov; K. Tuominen; M. Valram; G. Vankova-Kirilova; F. Vannucci; G. Vasseur; F. Velotti; P. Velten; V. 
Venturi; T. Viant; S. Vihonen; H. Vincke; A. Vorobyev; A. Weber; S. Wu; N. Yershov; L. Zambelli; M. Zito

    2014-12-02

    The proposed Long Baseline Neutrino Observatory (LBNO) initially consists of a $\sim 20$ kton liquid double-phase TPC complemented by a magnetised iron calorimeter, to be installed at the Pyhäsalmi mine, at a distance of 2300 km from CERN. The conventional neutrino beam is produced by 400 GeV protons accelerated at the SPS accelerator delivering 700 kW of power. The long baseline provides a unique opportunity to study neutrino flavour oscillations over their 1st and 2nd oscillation maxima, exploring the $L/E$ behaviour and distinguishing effects arising from $\delta_{CP}$ and matter. In this paper we show how this comprehensive physics case can be further enhanced and complemented if a neutrino beam produced at the Protvino IHEP accelerator complex, at a distance of 1160 km, and with a modest power of 450 kW is aimed towards the same far detectors. We show that the coupling of two independent sub-MW conventional neutrino and antineutrino beams at different baselines from CERN and Protvino will allow CP violation in the leptonic sector to be measured at a confidence level of at least $3\sigma$ for 50% of the true values of $\delta_{CP}$ with a 20 kton detector. With a far detector of 70 kton, the combination allows a $3\sigma$ sensitivity for 75% of the true values of $\delta_{CP}$ after 10 years of running. Running two independent neutrino beams, each at a power below 1 MW, is closer to today's state of the art than the long-term operation of a new single high-energy multi-MW facility, which has several technical challenges and will likely require a learning curve.
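The $L/E$ behaviour referred to above can be illustrated with the standard two-flavor vacuum approximation (matter effects and three-flavor interference ignored); the mass splitting below is an assumed nominal value, not a number from the paper:

```python
import math

DM2 = 2.5e-3   # assumed atmospheric mass splitting in eV^2 (illustrative)

def osc_prob(L_km: float, E_GeV: float, sin2_2theta: float = 1.0) -> float:
    """Two-flavor vacuum oscillation probability (matter effects ignored)."""
    return sin2_2theta * math.sin(1.267 * DM2 * L_km / E_GeV) ** 2

def maximum_energy(L_km: float, n: int = 1) -> float:
    """Neutrino energy (GeV) of the n-th oscillation maximum."""
    return 1.267 * DM2 * L_km / ((2 * n - 1) * math.pi / 2)

for L in (2300.0, 1160.0):   # the CERN and Protvino baselines to Pyhasalmi
    print(L, round(maximum_energy(L, 1), 2), round(maximum_energy(L, 2), 2))
```

The longer baseline pushes both oscillation maxima to higher energies, which is why a single wide-band beam over 2300 km can cover the 1st and 2nd maxima simultaneously.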

  15. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

    SciTech Connect (OSTI)

    Womack, J.C. [Westinghouse Hanford Co., Richland, WA (United States)]; Cramond, R. [TRW (United States)]; Paedon, R.J. [SAIC (United States)]; and others

    1995-03-13

    This document is a revision to WHC-SD-SNF-SD-002 and is issued to support the individual projects that make up the Spent Nuclear Fuel Project in the lower-tier functions, requirements, interfaces, and technical baseline items. It presents results of engineering analyses since Sept. 1994. The mission of the SNFP on the Hanford site is to provide safe, economical, and environmentally sound management of Hanford SNF in a manner that stages it to final disposition. This particularly involves K Basin fuel, although other SNF is also involved.

  16. Compare Energy Use in Variable Refrigerant Flow Heat Pumps Field Demonstration and Computer Model

    SciTech Connect (OSTI)

    Sharma, Chandan; Raustad, Richard

    2013-06-01

    Variable Refrigerant Flow (VRF) heat pumps are often regarded as energy-efficient air-conditioning systems which offer electricity savings as well as reduction in peak electric demand while providing improved individual zone setpoint control. One of the key advantages of VRF systems is minimal duct losses, which provide a significant reduction in energy use and duct space. However, there is limited data available to show their actual performance in the field. Since VRF systems are increasingly gaining market share in the US, it is highly desirable to have more actual field performance data for these systems. An effort was made in this direction to monitor VRF system performance over an extended period of time in a US national lab test facility. Due to increasing demand by the energy modeling community, an empirical model to simulate VRF systems was implemented in the building simulation program EnergyPlus. This paper presents the comparison of energy consumption as measured in the national lab and as predicted by the program. For increased accuracy in the comparison, a customized weather file was created by using measured outdoor temperature and relative humidity at the test facility. Other inputs to the model included building construction, a VRF system model based on lab-measured performance, building occupancy, lighting/plug loads, and thermostat set-points. Infiltration model inputs were adjusted at the beginning to tune the computer model, and subsequent field measurements were compared to the simulation results. Differences between the computer model results and actual field measurements are discussed. The computer-generated VRF performance closely resembled the field measurements.
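Comparisons between simulated and measured energy use of this kind are commonly summarized with calibration statistics such as NMBE and CV(RMSE) in the style of ASHRAE Guideline 14; the daily VRF energy values below are invented for illustration, not measurements from the study:

```python
import math

def nmbe(measured, simulated):
    """Normalized mean bias error (%): signed average model bias."""
    n = len(measured)
    mean = sum(measured) / n
    return 100.0 * sum(s - m for m, s in zip(measured, simulated)) / (n * mean)

def cv_rmse(measured, simulated):
    """Coefficient of variation of the RMSE (%): scatter of model error."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = math.sqrt(sum((s - m) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / mean

# Hypothetical daily VRF energy use (kWh): field measurements vs. simulation
measured  = [52.0, 48.5, 60.2, 55.1, 47.9]
simulated = [50.8, 49.7, 58.9, 56.0, 46.5]
print(round(nmbe(measured, simulated), 2), round(cv_rmse(measured, simulated), 2))
```

NMBE flags systematic over- or under-prediction, while CV(RMSE) penalizes day-to-day scatter even when the bias cancels out.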

  17. Parametric Studies and Optimization of Eddy Current Techniques through Computer Modeling

    SciTech Connect (OSTI)

    Todorov, E. I. [EWI, Engineering and NDE, 1250 Arthur E. Adams Dr., Columbus, OH 43221-3585 (United States)

    2007-03-21

    The paper demonstrates the use of computer models for parametric studies and optimization of surface and subsurface eddy current techniques. The study with high-frequency probe investigates the effect of eddy current frequency and probe shape on the detectability of flaws in the steel substrate. The low-frequency sliding probe study addresses the effect of conductivity between the fastener and the hole, frequency and coil separation distance on detectability of flaws in subsurface layers.

  18. NREL Computer Models Integrate Wind Turbines with Floating Platforms (Fact Sheet)

    SciTech Connect (OSTI)

    Not Available

    2011-07-01

    Far off the shores of energy-hungry coastal cities, powerful winds blow over the open ocean, where the water is too deep for today's seabed-mounted offshore wind turbines. For the United States to tap into these vast offshore wind energy resources, wind turbines must be mounted on floating platforms to be cost effective. Researchers at the National Renewable Energy Laboratory (NREL) are supporting that development with computer models that allow detailed analyses of such floating wind turbines.

  19. An Additive Bivariate Hierarchical Model for Functional Data and Related Computations 

    E-Print Network [OSTI]

    Redd, Andrew Middleton

    2011-10-21

    the penalties 11 are given is computationally intensive. In addition, the space for finding the penalty parameters is four-dimensional, two for each predictor variable, one each for the mean and principal component functions. Optimizing over a four-dimensional... ed by the structure of the formula used. I implement two new formula operators that only work with the pfda package; %&% bind together variables, on the left side of the formula indicates the paired model, on the right an additive variable...

  20. Path2Models: large-scale generation of computational models from biochemical pathway maps

    E-Print Network [OSTI]

    2013-01-01

    C: Petri net modelling of biological networks. Brief Bioinform. ... networks [14-16] to discrete algebra [17] and differential equations [18], Petri ...

  1. 3D Computer Vision and Video Computing 3D Vision3D Vision

    E-Print Network [OSTI]

    Zhu, Zhigang

    [Lecture-slide excerpt: a simple stereo geometry with left and right camera projections pl(xl, yl) and pr(xr, yr) of a point P, baseline B between the optical centers, focal length f, and the relationship between disparity and depth.]
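The disparity-versus-baseline relationship these slides concern reduces, for a rectified stereo pair under the pinhole model, to Z = f·B/d; a minimal sketch with illustrative numbers:

```python
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair (pinhole camera model)."""
    return f_px * baseline_m / disparity_px

# Illustrative: 700-px focal length, 12 cm baseline, 35-px disparity
print(depth_from_disparity(700.0, 0.12, 35.0))  # → 2.4 (metres)
```

Note the inverse relationship: halving the disparity doubles the estimated depth, so depth resolution degrades quickly for distant points unless the baseline is widened.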

  2. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

    SciTech Connect (OSTI)

    Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

    1994-01-01

    This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

  3. Vehicle Technologies Office Merit Review 2014: Significant Enhancement of Computational Efficiency in Nonlinear Multiscale Battery Model for Computer Aided Engineering

    Broader source: Energy.gov [DOE]

    Presentation given by NREL at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about significant enhancement of computational...

  4. Efficient Computation of Info-Gap Robustness for Finite Element Models

    SciTech Connect (OSTI)

    Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.

    2012-07-05

    A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
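The adjoint idea can be sketched for a toy c-weighted output of an Ax = b model with interval-bounded uncertainty in the load vector b; the matrix, load vector, output functional, and performance tolerance below are invented for illustration and are not from the report:

```python
import numpy as np

# Nominal linear model A x = b (a stand-in for a small finite element system)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 0.5])
c = np.array([0.0, 0.0, 1.0])        # output of interest q = c @ x

x_nom = np.linalg.solve(A, b)
q_nom = c @ x_nom

# Adjoint solve: one extra system A^T lam = c gives the sensitivity dq/db = lam,
# avoiding the repeated forward solves a sampling approach would need.
lam = np.linalg.solve(A.T, c)

# Info-gap model: |b - b_nom|_inf <= alpha. The worst-case output deviation is
# alpha * |lam|_1, so the robustness to a requirement |q - q_nom| <= eps is:
eps = 0.05
alpha_hat = eps / np.abs(lam).sum()
print(float(q_nom), float(alpha_hat))
```

Because the model is linear, one adjoint solve replaces the nested optimization over the uncertainty set, which is the cost advantage the report attributes to the adjoint methodology.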

  5. Computer modeling of electromagnetic fields and fluid flows for edge containment in continuous casting

    SciTech Connect (OSTI)

    Chang, F.C.; Hull, J.R.; Wang, Y.H.; Blazek, K.E.

    1996-02-01

    A computer model was developed to predict eddy currents and fluid flows in molten steel. The model was verified by comparing predictions with experimental results of liquid-metal containment and fluid flow in electromagnetic (EM) edge dams (EMDs) designed at Inland Steel for twin-roll casting. The model can optimize the EMD design so it is suitable for application, and minimize expensive, time-consuming full-scale testing. Numerical simulation was performed by coupling a three-dimensional (3-D) finite-element EM code (ELEKTRA) and a 3-D finite-difference fluids code (CaPS-EM) to solve heat transfer, fluid flow, and turbulence transport in a casting process that involves EM fields. ELEKTRA is able to predict the eddy-current distribution and the electromagnetic forces in complex geometries. CaPS-EM is capable of modeling fluid flows with free surfaces. Results of the numerical simulation compared well with measurements obtained from a static test.

  6. Computational fluid dynamics modeling of coal gasification in a pressurized spout-fluid bed

    SciTech Connect (OSTI)

    Zhongyi Deng; Rui Xiao; Baosheng Jin; He Huang; Laihong Shen; Qilei Song; Qianjun Li [Southeast University, Nanjing (China). Key Laboratory of Clean Coal Power Generation and Combustion Technology of Ministry of Education

    2008-05-15

    Computational fluid dynamics (CFD) modeling, which has recently proven to be an effective means of analysis and optimization of energy-conversion processes, has been extended to coal gasification in this paper. A 3D mathematical model has been developed to simulate the coal gasification process in a pressurized spout-fluid bed. This CFD model is composed of gas-solid hydrodynamics, coal pyrolysis, char gasification, and gas phase reaction submodels. The rates of heterogeneous reactions are determined by combining the Arrhenius (kinetic) rate with the diffusion rate. The homogeneous gas-phase reactions can be treated as secondary reactions. A comparison of the calculated and experimental data shows that most gasification performance parameters can be predicted accurately. This good agreement indicates that CFD modeling can be used for complex fluidized-bed coal gasification processes. 37 refs., 7 figs., 5 tabs.
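Combining a kinetic (Arrhenius) rate with a film-diffusion rate is conventionally done as resistances in series, so the slower mechanism dominates; a minimal sketch with illustrative parameter values, not numbers from the paper:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_rate(A_pre: float, Ea: float, T: float) -> float:
    """Chemical kinetic rate constant k = A * exp(-Ea / (R T))."""
    return A_pre * math.exp(-Ea / (R * T))

def combined_rate(k_chem: float, k_diff: float) -> float:
    """Resistances-in-series blending of kinetic and film-diffusion rates."""
    return 1.0 / (1.0 / k_chem + 1.0 / k_diff)

# Illustrative values for a char gasification reaction at 1200 K
k_chem = arrhenius_rate(2.0e5, 1.3e5, 1200.0)
k_diff = 0.1   # assumed film-diffusion rate constant
print(combined_rate(k_chem, k_diff))
```

At low temperature the exponential makes the kinetic term limiting; at high temperature the combined rate saturates toward the diffusion limit, which is the behavior the submodel needs to capture across the bed.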

  7. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

    SciTech Connect (OSTI)

    Ermer, J. J.; Mosher, J. C.; Baillet, S.; Leahy, R. M.

    2001-01-01

    Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace based inverse methods like MUSIC [6], the total number of forward model evaluations can often approach an order of 10^3 or 10^4. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form: F = GQ + N (1) where the observed forward field F (M-sensors x N-time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3xP-dipoles x N-time samples) and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models [7] (or fast approximations described in [1], [7]) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres results in added complication to the forward model.
Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp, outer skull, and inner skull surfaces). Since accurate in vivo determination of internal conductivities is not currently possible, the head is typically assumed to consist of a set of contiguous isotropic regions, each with constant conductivity.
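The linear forward model F = GQ + N of Eq. (1) can be exercised directly; the sketch below uses a random stand-in for the lead-field matrix G, since constructing a real spherical or realistic head model is beyond a few lines, and the array sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

M, P, N_t = 32, 2, 100   # sensors, dipoles, time samples (illustrative sizes)

# G maps the 3 moment components of each dipole to the M sensors; here a
# random stand-in for a lead field computed from a head model.
G = rng.standard_normal((M, 3 * P))
Q = rng.standard_normal((3 * P, N_t))        # dipole moment time series
noise = 0.01 * rng.standard_normal((M, N_t))

F = G @ Q + noise                            # the forward model F = GQ + N
print(F.shape)
```

The dimensions make the cost argument concrete: each candidate source location requires recomputing columns of G, which is why inverse methods that scan thousands of locations need thousands of forward evaluations.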

  8. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    E-Print Network [OSTI]

    Jump, David

    2014-01-01

    for “Proprietary Software Testing Protocols” (vendors value software testing as a means to ...) and requirements for testing software vendor energy ...

  9. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    E-Print Network [OSTI]

    Jump, David

    2014-01-01

    with natural gas smart meter data. Two years of ...

  10. Model Baseline Fire Department/Fire Protection Engineering Assessment

    Broader source: Energy.gov [DOE]

    The purpose of the document is to comprehensively delineate and rationalize the roles and responsibilities of the Fire Department and Fire Protection (Engineering).

  11. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    E-Print Network [OSTI]

    Jump, David

    2014-01-01

    CV(STD). Appendix A of IPMVP (2012) includes ... Demand Savings. IPMVP: International Performance ...

  12. Economic Model Predictive Control Theory: Computational Efficiency and Application to Smart Manufacturing

    E-Print Network [OSTI]

    Ellis, Matthew

    2015-01-01

    efficiency ... Performance and Computational Efficiency ... 4.4.1 Class of ... estimation and computational efficiency. Journal of Process ...

  13. Stack and cell modelling with SOFC3D: a computer program for the 3D simulations of

    E-Print Network [OSTI]

    Herbin, Raphaèle

    Stack and cell modelling with SOFC3D: a computer program for the 3D simulations of solid oxide fuel cells (France). 1 Introduction: SOFC3D is a computer program which simulates the behaviour of a solid oxide fuel cell ... the channels, the electrical potential \Phi at any point of the solid part of the SOFC, and the molar fractions ...

  14. Computational Fluid Dynamics (CFD) Modeling for High Rate Pulverized Coal Injection (PCI) into the Blast Furnace

    SciTech Connect (OSTI)

    Dr. Chenn Zhou

    2008-10-15

    Pulverized coal injection (PCI) into the blast furnace (BF) has been recognized as an effective way to decrease the coke and total energy consumption along with minimization of environmental impacts. However, increasing the amount of coal injected into the BF is currently limited by the lack of knowledge of some issues related to the process. It is therefore important to understand the complex physical and chemical phenomena in the PCI process. Due to the difficulty of obtaining true BF measurements, computational fluid dynamics (CFD) modeling has been identified as a useful technology to provide such knowledge. CFD simulation is powerful for providing detailed information on flow properties and performing parametric studies for process design and optimization. In this project, comprehensive 3-D CFD models have been developed to simulate the PCI process under actual furnace conditions. These models provide raceway size and flow property distributions. The results have provided guidance for optimizing the PCI process.

  15. Go-Smart: Web-based Computational Modeling of Minimally Invasive Cancer Treatments

    E-Print Network [OSTI]

    Weir, Phil; Ellerweg, Roland; Alhonnoro, Tuomas; Pollari, Mika; Voglreiter, Philip; Mariappan, Panchatcharam; Flanagan, Ronan; Park, Chang Sub; Payne, Stephen; Staerk, Elmar; Voigt, Peter; Moche, Michael; Kolesnik, Marina

    2015-01-01

    The web-based Go-Smart environment is a scalable system that allows the prediction of minimally invasive cancer treatment. Interventional radiologists create a patient-specific 3D model by semi-automatic segmentation and registration of pre-interventional CT (Computed Tomography) and/or MRI (Magnetic Resonance Imaging) images in a 2D/3D browser environment. This model is used to compare patient-specific treatment plans and device performance via built-in simulation tools. Go-Smart includes evaluation techniques for comparing simulated treatment with real ablation lesions segmented from follow-up scans. The framework is highly extensible, allowing manufacturers and researchers to incorporate new ablation devices, mathematical models and physical parameters.

  16. Time Utility Functions for Modeling and Evaluating Resource Allocations in a Heterogeneous Computing System

    SciTech Connect (OSTI)

    Briceno, Luis Diego [Colorado State University, Fort Collins; Khemka, Bhavesh [Colorado State University, Fort Collins; Siegel, Howard Jay [Colorado State University, Fort Collins; Maciejewski, Anthony A [ORNL; Groer, Christopher S [ORNL; Koenig, Gregory A [ORNL; Okonski, Gene D [ORNL; Poole, Stephen W [ORNL

    2011-01-01

    This study considers a heterogeneous computing system and corresponding workload being investigated by the Extreme Scale Systems Center (ESSC) at Oak Ridge National Laboratory (ORNL). The ESSC is part of a collaborative effort between the Department of Energy (DOE) and the Department of Defense (DoD) to deliver research, tools, software, and technologies that can be integrated, deployed, and used in both DOE and DoD environments. The heterogeneous system and workload described here are representative of a prototypical computing environment being studied as part of this collaboration. Each task can exhibit a time-varying importance or utility to the overall enterprise. In this system, an arriving task has an associated priority and precedence. The priority is used to describe the importance of a task, and precedence is used to describe how soon the task must be executed. These two metrics are combined to create a utility function curve that indicates how valuable it is for the system to complete a task at any given moment. This research focuses on using time-utility functions to generate a metric that can be used to compare the performance of different resource schedulers in a heterogeneous computing system. The contributions of this paper are: (a) a mathematical model of a heterogeneous computing system where tasks arrive dynamically and need to be assigned based on their priority, precedence, utility characteristic class, and task execution type, (b) the use of priority and precedence to generate time-utility functions that describe the value a task has at any given time, (c) the derivation of a metric based on the total utility gained from completing tasks to measure the performance of the computing environment, and (d) a comparison of the performance of resource allocation heuristics in this environment.
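One possible shape for such a time-utility function, with full value up to the precedence deadline and a linear decay afterwards, can be sketched as follows; the decay shape, task values, and completion times are assumptions for illustration, not the paper's actual curves:

```python
def utility(priority: float, precedence_s: float, t_s: float) -> float:
    """Value of completing a task at time t: full priority before its
    precedence deadline, then a linear decay to zero (one possible shape)."""
    if t_s <= precedence_s:
        return priority
    # decay over one additional precedence interval, then worthless
    remaining = max(0.0, 2.0 * precedence_s - t_s)
    return priority * remaining / precedence_s

# Compare two hypothetical schedules by total utility accrued
tasks = [(10.0, 30.0), (4.0, 120.0)]   # (priority, precedence in seconds)
completion_a = [25.0, 100.0]           # scheduler A finishes both on time
completion_b = [45.0, 100.0]           # scheduler B is late on task 1
total_a = sum(utility(p, d, t) for (p, d), t in zip(tasks, completion_a))
total_b = sum(utility(p, d, t) for (p, d), t in zip(tasks, completion_b))
print(total_a, total_b)
```

Summing utilities at completion gives exactly the kind of scalar metric the paper derives for ranking resource-allocation heuristics: schedule A scores higher here because it beats the tighter deadline.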

  17. Status of Baseline Sampling for Elements in Soil and Vegetation...

    Open Energy Info (EERE)

    Status of Baseline Sampling for Elements in Soil and Vegetation at Four KGRAs in the Imperial Valley, California (OpenEI Reference Library)

  18. Cost and Performance Baseline for Fossil Energy Plants Volume...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    www.netl.doe.gov: Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2). Table of Contents, List of Exhibits ...

  19. IEEE TRANSACTIONS ON NANOBIOSCIENCE, VOL. 5, NO. 1, MARCH 2006 41 Computational Modeling of a New Fluorescent

    E-Print Network [OSTI]

    Truong, Member, IEEE. Abstract: The class of fluorescence resonance energy transfer (FRET) protein ...

  20. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    E-Print Network [OSTI]

    Zhou, Shujia

    2009-01-01

    “Acceleration of Numerical Weather Prediction,” Proceedings ... Computer Systems for Climate and Weather Models, Shujia Zhou ... processes in climate and weather models demands a continual ...

  1. Enabling a Highly-Scalable Global Address Space Model for Petascale Computing

    SciTech Connect (OSTI)

    Apra, Edoardo; Vetter, Jeffrey S; Yu, Weikuan

    2010-01-01

    Over the past decade, the trajectory to the petascale has been built on increased complexity and scale of the underlying parallel architectures. Meanwhile, software developers have struggled to provide tools that maintain the productivity of computational science teams using these new systems. In this regard, Global Address Space (GAS) programming models provide a straightforward and easy to use addressing model, which can lead to improved productivity. However, the scalability of GAS depends directly on the design and implementation of the runtime system on the target petascale distributed-memory architecture. In this paper, we describe the design, implementation, and optimization of the Aggregate Remote Memory Copy Interface (ARMCI) runtime library on the Cray XT5 2.3 PetaFLOPs computer at Oak Ridge National Laboratory. We optimized our implementation with the flow intimation technique that we have introduced in this paper. Our optimized ARMCI implementation improves scalability of both the Global Arrays (GA) programming model and a real-world chemistry application NWChem from small jobs up through 180,000 cores.

  2. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    SciTech Connect (OSTI)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-31

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well-defined, smooth benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrosis. The second sub-system consists of cytotoxic active (effector) cells -- EC, with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutical, and radiotherapeutical treatments.
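Fractal dimensions of the kind mentioned above are typically estimated from images by box counting; a minimal sketch, validated on a line segment whose dimension should come out as 1 (the point set and box sizes are illustrative):

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    the slope of log N(s) versus log(1/s) over the given box sizes."""
    logs = []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    n = len(logs)
    mx = sum(u for u, _ in logs) / n
    my = sum(v for _, v in logs) / n
    slope = sum((u - mx) * (v - my) for u, v in logs) / \
        sum((u - mx) ** 2 for u, _ in logs)
    return slope

# A densely sampled line segment should come out near dimension 1;
# a tentacular, space-filling boundary would score between 1 and 2.
pts = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(round(box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625]), 2))
```

Applied to a segmented tumor boundary, a dimension well above 1 quantifies the tentacular irregularity that the abstract correlates with malignant stage.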

  3. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

    SciTech Connect (OSTI)

    Not Available

    1992-09-01

    The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbons (C3/C4s) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing, and in coping with the low H2/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

  4. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    SciTech Connect (OSTI)

    Olivier, S; Cook, K; Fasenfest, B; Jefferson, D; Jiang, M; Leek, J; Levatin, J; Nikolaev, S; Pertica, A; Phillion, D; Springer, K; De Vries, W

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  5. Swelling in light water reactor internal components: Insights from computational modeling

    SciTech Connect (OSTI)

    Stoller, Roger E.; Barashev, Alexander V.; Golubov, Stanislav I.

    2015-08-01

    A modern cluster dynamics model has been used to investigate the materials and irradiation parameters that control microstructural evolution under the relatively low-temperature exposure conditions that are representative of the operating environment for in-core light water reactor components. The focus is on components fabricated from austenitic stainless steel. The model accounts for the synergistic interaction between radiation-produced vacancies and the helium that is produced by nuclear transmutation reactions. Cavity nucleation rates are shown to be relatively high in this temperature regime (275 to 325°C), but are sensitive to assumptions about the fine scale microstructure produced under low-temperature irradiation. The cavity nucleation rates observed run counter to the expectation that void swelling would not occur under these conditions. This expectation was based on previous research on void swelling in austenitic steels in fast reactors. This misleading impression arose primarily from an absence of relevant data. The results of the computational modeling are generally consistent with recent data obtained by examining ex-service components. However, it has been shown that the sensitivity of the model's predictions of low-temperature swelling behavior to assumptions about the primary damage source term and specification of the mean-field sink strengths is somewhat greater than that observed at higher temperatures. Further assessment of the mathematical model is underway to meet the long-term objective of this research, which is to provide a predictive model of void swelling at relevant lifetime exposures to support extended reactor operations.

  6. Computer modeling of the spatial resolution properties of a dedicated breast CT system

    SciTech Connect (OSTI)

    Yang Kai; Kwan, Alexander L. C.; Boone, John M. [Department of Radiology, University of California, Davis Medical Center, 4860 Y Street, Suite 3100 Ellison Building, Sacramento, California 95817 (United States) and Department of Biomedical Engineering, University of California, Davis, California 95616 (United States)]

    2007-06-15

    Computer simulation methods were used to evaluate the spatial resolution properties of a dedicated cone-beam breast CT system. X-ray projection data of a 70 μm nickel-chromium wire were simulated. The modulation transfer function (MTF) was calculated from the reconstructed axial images at different radial positions from the isocenter to study the spatial dependency of the spatial resolution of the breast CT scanner. The MTF was also calculated in both the radial and azimuthal directions. Subcomponents of the cone-beam CT system that affect the MTF were modeled in the computer simulation in a serial manner, including the x-ray focal spot distribution, gantry rotation under the condition of continuous fluoroscopy, detector lag, and detector spatial resolution. Comparison between the computer-simulated and physically measured MTF values demonstrates reasonable accuracy in the simulation process, with a small systematic difference (approximately 9.5±6.4% difference, due to unavoidable uncertainties from physical measurement and system calibration). The intrinsic resolution in the radial direction determined by simulation was about 2.0 mm⁻¹ uniformly through the field of view. The intrinsic resolution in the azimuthal direction degrades from 2.0 mm⁻¹ at the isocenter to 1.0 mm⁻¹ at the periphery, 76.9 mm from the isocenter. The results elucidate the intrinsic spatial resolution properties of the prototype breast CT system, and suggest ways in which spatial resolution can be improved with system modification.
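
    The MTF calculation described in this abstract can be sketched compactly: given a sampled line spread function (LSF) extracted from the reconstructed wire image, the presampled MTF is the normalized magnitude of its Fourier transform. The following is an illustrative sketch only, not the authors' code; the Gaussian LSF and 0.1 mm pixel pitch are assumed example values.

    ```python
    import cmath
    import math

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """Presampled MTF from a sampled line spread function (LSF).

        Returns (spatial frequencies in mm^-1, normalized MTF magnitudes)
        for the non-negative frequencies of a discrete Fourier transform.
        """
        n = len(lsf)
        total = sum(lsf)
        lsf = [v / total for v in lsf]                 # normalize LSF area to 1
        freqs, mtf = [], []
        for k in range(n // 2 + 1):
            coeff = sum(lsf[j] * cmath.exp(-2j * math.pi * j * k / n)
                        for j in range(n))
            freqs.append(k / (n * pixel_pitch_mm))     # cycles per mm
            mtf.append(abs(coeff))
        return freqs, [m / mtf[0] for m in mtf]        # so that MTF(0) = 1

    # Example: Gaussian LSF (sigma = 0.2 mm) sampled at 0.1 mm pixel pitch.
    pitch = 0.1
    lsf = [math.exp(-((i - 64) * pitch) ** 2 / (2 * 0.2 ** 2)) for i in range(128)]
    freqs, mtf = mtf_from_lsf(lsf, pitch)
    ```

    Evaluating the same routine on LSFs extracted at different radial positions would exhibit the radial/azimuthal dependence the abstract describes.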

  7. Climate Modeling using High-Performance Computing The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon

    E-Print Network [OSTI]

    and NCAR in the development of a comprehensive, earth systems model. This model incorporates the most-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well. Our collaborators in climate research include the National Center

  8. 14th Conference on Computational Natural Language Learning, Uppsala, Sweden, 2010 Computing Optimal Alignments for the IBM-3 Translation Model

    E-Print Network [OSTI]

    Lunds Universitet

    Alignments for the IBM-3 Translation Model Thomas Schoenemann Centre for Mathematical Sciences Lund University, Sweden Abstract Prior work on training the IBM-3 transla- tion model is based on suboptimal meth from a statistical viewpoint and introduced five probability models, known as IBM 1-5. Their models

  9. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect (OSTI)

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3-f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  10. Validation of the thermospheric vector spherical harmonic (VSH) computer model. Master's thesis

    SciTech Connect (OSTI)

    Davis, J.L.

    1991-01-01

    A semi-empirical computer model of the lower thermosphere has been developed that provides a description of the composition and dynamics of the thermosphere (Killeen et al., 1992). Input variables needed to run the VSH model include time, space, and geophysical conditions. One of the output variables the model provides, neutral density, is of particular interest to the U.S. Air Force. Neutral densities vary both as a result of changes in solar flux (e.g., the solar cycle) and as a result of changes in the magnetosphere (e.g., large changes occur in neutral density during geomagnetic storms). Satellites in earth orbit experience aerodynamic drag due to the atmospheric density of the thermosphere. Variability in the neutral density described above affects the drag a satellite experiences and, as a result, can change the orbital characteristics of the satellite. These changes make it difficult to track the satellite's position. Therefore, it is particularly important to ensure that the accuracy of the model's neutral density is optimized for all input parameters. To accomplish this, a validation program was developed to evaluate the strengths and weaknesses of the model's density output by comparing it to SETA-2 (satellite electrostatic accelerometer) total mass density measurements.

  11. Multi-baseline interferometric synthetic aperture radar applications and error analysis

    E-Print Network [OSTI]

    Chua, Song Liang

    2007-01-01

    In this thesis, we deal primarily with the multi-baseline SAR configuration utilizing three satellites. Two applications of InSAR, multi-baseline height retrieval and multi-baseline compensation of CCD's slope biasing ...

  12. Multiproject baselines for evaluation of electric power projects

    SciTech Connect (OSTI)

    Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

    2003-03-12

    Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
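
    The core arithmetic of a multiproject baseline, the generation-weighted emissions rate (kg C/kWh) over the plants selected for the baseline, can be sketched as follows. This is a minimal illustration under assumed, hypothetical plant data, not the paper's methodology or case-study numbers.

    ```python
    def baseline_emissions_rate(plants):
        """Generation-weighted average emissions rate (kg C/kWh) over the
        plants selected for the baseline, each given as a tuple
        (annual_generation_kwh, emissions_rate_kg_c_per_kwh)."""
        total_gen = sum(g for g, _ in plants)
        if total_gen == 0:
            raise ValueError("no generation in baseline sample")
        return sum(g * r for g, r in plants) / total_gen

    def emissions_reduction(project_kwh, project_rate, baseline_rate):
        """Annual kg C avoided by a project relative to the baseline rate."""
        return project_kwh * (baseline_rate - project_rate)

    # Hypothetical grid sample: coal, gas, and hydro plants in the baseline.
    plants = [(4.0e9, 0.26), (2.0e9, 0.11), (1.0e9, 0.0)]
    rate = baseline_emissions_rate(plants)   # ~0.18 kg C/kWh
    ```

    The substantive choices the paper discusses (plant vintage, plant types, stringency) all reduce, computationally, to which tuples are admitted into `plants`.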

  13. Way to increase the user access at the LCLS baseline

    E-Print Network [OSTI]

    Geloni, Gianluca; Saldin, Evgeni

    2010-01-01

    The LCLS beam is meant for a single user, but the baseline undulator is long enough to serve two users simultaneously. To this end, we propose a setup composed of two elements: a pair of X-ray mirrors for X-ray beam deflection, and a 4 m-long magnetic chicane, which creates an offset for installation of the mirror pair in the middle of the baseline undulator. The insertable mirror pair can spatially separate the X-ray beams generated in the first and second halves of the baseline undulator. Rapid switching of the FEL amplification process allows deactivating one half and activating the other half of the undulator. As proposed elsewhere, using a kicker installed upstream of the LCLS baseline undulator and an already existing corrector in the first half of the undulator, it is possible to rapidly switch the X-ray beam from one user to another. We present simulation results for the LCLS baseline, and show that it is possible to generate two saturated SASE X-ray beams in the whole 0.8-8 keV photon energy range in the...

  14. Economic Model Predictive Control Theory: Computational Efficiency and Application to Smart Manufacturing

    E-Print Network [OSTI]

    Ellis, Matthew

    2015-01-01


  15. Mathematics and Computers in Simulation 65 (2004) 557577 Parallel runs of a large air pollution model on a

    E-Print Network [OSTI]

    2004-01-01

    Mathematics and Computers in Simulation 65 (2004) 557­577 Parallel runs of a large air pollution 20 January 2004; accepted 21 January 2004 Abstract Large-scale air pollution models can successfully. The mathematical description of a large-scale air pollution model will be discussed in this paper. The principles

  16. Quantum computing of quantum chaos in the kicked rotator model B. Levi, B. Georgeot, and D. L. Shepelyansky

    E-Print Network [OSTI]

    Shepelyansky, Dima

    Quantum computing of quantum chaos in the kicked rotator model B. Le´vi, B. Georgeot, and D. L model, a system that displays rich physical properties and enables to study problems of quantum chaos are robust in presence of imperfections. This implies that the algorithm can simulate the dynamics of quantum

  17. Computational Science Technical Note CSTN-163 Transients in a Forest-Fire Simulation Model with Varying Combustion

    E-Print Network [OSTI]

    Hawick, Ken

    0 Computational Science Technical Note CSTN-163 Transients in a Forest-Fire Simulation Model with Varying Combustion Neighbourhoods and Watercourse Firebreaks K. A. Hawick 2012 Forest and bush fires-Schwabl Forest-fire model is investigated with various localised combustion neighbourhoods. The transient

  18. ABB SCADA/EMS System INEEL Baseline Summary Test Report (November...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ABB SCADA/EMS System INEEL Baseline Summary Test Report (November 2004) This document covers the security...

  19. Ray tracing computations in the smoothed SEG/EAGE Salt Model V. Bucha*, Department of Geophysics, Charles University, Prague, Czech Republic

    E-Print Network [OSTI]

    Cerveny, Vlastislav

    Ray tracing computations in the smoothed SEG/EAGE Salt Model V. Bucha*, Department of Geophysics in comparison with precise methods, e.g., finite differences or finite elements, are the speed of computation, because ray tracing computations need smooth velocity macro models (Bucha, Bulant & Klimes 2003

  20. 200-BP-5 operable unit Technical Baseline report

    SciTech Connect (OSTI)

    Jacques, I.D.; Kent, S.K.

    1991-10-01

    This report supports development of a remedial investigation/feasibility study work plan for the 200-BP-5 operable unit. The report summarizes baseline information for waste sites and unplanned release sites located in the 200-BP-5 operable unit. The sites were investigated by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The investigation consisted of review and evaluation of current and historical Hanford Site reports, drawings, and photographs, and was supplemented with recent inspections of the Hanford Site and employee interviews. No field investigations or sampling were conducted.

  1. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Worcester, Elizabeth

    2015-08-06

    In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

  2. Current Computer Modeling Cannot Explain Why Two Highly Similar Sequences Fold into Different Structures

    E-Print Network [OSTI]

    Economou, Tassos

    understanding how it functions; correspondingly, one of the holy grails of computational structural biology

  3. COMPUTATIONAL THERMODYNAMIC MODELING OF HOT CORROSION OF ALLOYS HAYNES 242 AND HASTELLOY N FOR MOLTEN SALT SERVICE

    SciTech Connect (OSTI)

    Michael V. Glazoff; Piyush Sabharwall; Akira Tokuhiro

    2014-09-01

    An evaluation of thermodynamic aspects of hot corrosion of the superalloys Haynes 242 and Hastelloy N in eutectic mixtures of KF and ZrF4 was carried out for development of the Advanced High Temperature Reactor (AHTR). This work models the behavior of several superalloys, potential candidates for the AHTR, using the computational thermodynamics tool ThermoCalc, leading to the development of a thermodynamic description of the molten salt eutectic mixtures and, on that basis, a mechanistic prediction of hot corrosion. The results from these studies indicated that the principal mechanism of hot corrosion was associated with chromium leaching for all of the superalloys described above. However, Hastelloy N displayed the best hot corrosion performance. This was not surprising, given that it was originally developed to withstand the harsh conditions of a molten salt environment. The results obtained in this study provided confidence in the employed methods of computational thermodynamics and could be used for future alloy design efforts. Finally, several potential solutions to mitigate hot corrosion were proposed for further exploration, including coating development and controlled scaling of intermediate compounds in the KF-ZrF4 system.

  4. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    SciTech Connect (OSTI)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system are used only for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  5. The recent times have seen a surge in computational modeling of materials and processes. New research initiatives like the Materials Genome Initiative (MGI)

    E-Print Network [OSTI]

    Ghosh, Somnath

    PREFACE: The recent times have seen a surge in computational modeling of materials and processes. New research initiatives like the Materials Genome Initiative (MGI) and the Integrated Computational Materials Science & Engineering (ICMSE) are creating unprecedented opportunities for unraveling new...

  6. Protein translocation without specific quality control in a computational model of the Tat system

    E-Print Network [OSTI]

    Chitra R. Nayak; Aidan I. Brown; Andrew D. Rutenberg

    2014-08-20

    The twin-arginine translocation (Tat) system transports folded proteins of various sizes across both bacterial and plant thylakoid membranes. The membrane-associated TatA protein is an essential component of the Tat translocon, and a broad distribution of different sized TatA-clusters is observed in bacterial membranes. We assume that the size dynamics of TatA clusters are affected by substrate binding, unbinding, and translocation to associated TatBC clusters, where clusters with bound translocation substrates favour growth and those without associated substrates favour shrinkage. With a stochastic model of substrate binding and cluster dynamics, we numerically determine the TatA cluster size distribution. We include a proportion of targeted but non-translocatable (NT) substrates, with the simplifying hypothesis that the substrate translocatability does not directly affect cluster dynamical rate constants or substrate binding or unbinding rates. This amounts to a translocation model without specific quality control. Nevertheless, NT substrates will remain associated with TatA clusters until unbound and so will affect cluster sizes and translocation rates. We find that the number of larger TatA clusters depends on the NT fraction $f$. The translocation rate can be optimized by tuning the rate of spontaneous substrate unbinding, $\Gamma_U$. We present an analytically solvable three-state model of substrate translocation without cluster size dynamics that follows our computed translocation rates, and that is consistent with in vitro Tat-translocation data in the presence of NT substrates.
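
    The trade-off the abstract describes, NT substrates blocking translocation sites until they spontaneously unbind, can be illustrated with a toy single-site Monte Carlo model. This is a hedged sketch with invented rate values, far simpler than the paper's stochastic cluster-dynamics model; the function name and parameters are hypothetical.

    ```python
    import random

    def translocation_rate(f_nt, gamma_b, gamma_u, gamma_t, t_max=1e5, seed=1):
        """Monte Carlo estimate of the translocation rate for one Tat site.

        States: empty -> bound (rate gamma_b). A bound translocatable
        substrate leaves by translocation (gamma_t) or unbinding (gamma_u);
        a bound non-translocatable (NT) substrate can only unbind (gamma_u).
        f_nt is the fraction of binding events that are NT substrates.
        """
        rng = random.Random(seed)
        t, translocated = 0.0, 0
        while t < t_max:
            t += rng.expovariate(gamma_b)          # wait for a substrate to bind
            if rng.random() < f_nt:
                t += rng.expovariate(gamma_u)      # NT: blocks site until unbinding
            else:
                t += rng.expovariate(gamma_u + gamma_t)
                if rng.random() < gamma_t / (gamma_u + gamma_t):
                    translocated += 1
        return translocated / t
    ```

    Raising the NT fraction lowers the throughput because NT substrates occupy the site unproductively; raising gamma_u clears them faster but also aborts productive events, which is the origin of the optimum in Γ_U noted in the abstract.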

  7. Introduction to Focus Issue: Rhythms and Dynamic Transitions in Neurological Disease: Modeling, Computation, and Experiment

    SciTech Connect (OSTI)

    Kaper, Tasso J.; Kramer, Mark A.; Rotstein, Horacio G.

    2013-12-15

    Rhythmic neuronal oscillations across a broad range of frequencies, as well as spatiotemporal phenomena, such as waves and bumps, have been observed in various areas of the brain and proposed as critical to brain function. While there is a long and distinguished history of studying rhythms in nerve cells and neuronal networks in healthy organisms, the association and analysis of rhythms in diseases are more recent developments. Indeed, it is now thought that certain aspects of diseases of the nervous system, such as epilepsy, schizophrenia, Parkinson's, and sleep disorders, are associated with transitions or disruptions of neurological rhythms. This focus issue brings together articles presenting modeling, computational, analytical, and experimental perspectives on rhythms and the dynamic transitions between them that are associated with various diseases.

  8. Extraction of actinides by multi-dentate diamides and their evaluation with computational molecular modeling

    SciTech Connect (OSTI)

    Sasaki, Y.; Kitatsuji, Y.; Hirata, M.; Kimura, T.; Yoshizuka, K.

    2008-07-01

    Multi-dentate diamides have been synthesized and examined for actinide (An) extraction. Bi- and tridentate extractants are the focus of this work. The extraction of actinides was performed from 0.1-6 M HNO{sub 3} into organic solvents. It was evident that N,N,N',N'-tetra-alkyl-diglycolamide (DGA) derivatives, 2,2'-(methylimino)bis(N,N-dioctyl-acetamide) (MIDOA), and N,N'-dimethyl-N,N'-dioctyl-2-(3-oxa-pentadecane)-malonamide (DMDOOPDMA) have relatively high D values (D(Pu) > 70). The following notable results using DGA extractants were obtained: (1) DGAs with short alkyl chains give higher D values than those with long alkyl chains, (2) DGAs with long alkyl chains have high solubility in n-dodecane. Computational molecular modeling was also used to elucidate the effects of the structural and electronic properties of the reagents on their different extractabilities. (authors)

  9. Computational experiences with variable modulus, elastic-plastic, and viscoelastic concrete models. [HTGR

    SciTech Connect (OSTI)

    Anderson, C.A.

    1981-01-01

    Six years ago the Reactor Safety Research Division of the Nuclear Regulatory Commission (NRC) approached the Los Alamos National Laboratory to develop a comprehensive concrete structural analysis code to predict the static and dynamic behavior of Prestressed Concrete Reactor Vessels (PCRVs) that serve as the containment structure of a High-Temperature Gas-Cooled Reactor. The PCRV is a complex concrete structure that must be modeled in three dimensions and possesses other complicating features, such as a steel liner for the reactor cavity and woven cables embedded vertically in the PCRV and wound circumferentially on its outside. The cables, or tendons, are used for prestressing the reactor vessel. In addition to developing the computational capability to predict inelastic three-dimensional concrete structural behavior, the code's response was verified against documented experiments on concrete structural behavior. This code development/verification effort is described.

  10. GEOCITY: a computer model for systems analysis of geothermal district heating and cooling costs

    SciTech Connect (OSTI)

    Fassbender, L.L.; Bloomster, C.H.

    1981-06-01

    GEOCITY is a computer-simulation model developed to study the economics of district heating/cooling using geothermal energy. GEOCITY calculates the cost of district heating/cooling based on climate, population, resource characteristics, and financing conditions. The basis for our geothermal-energy cost analysis is the unit cost of energy which will recover all the costs of production. The calculation of the unit cost of energy is based on life-cycle costing and discounted-cash-flow analysis. A wide variation can be expected in the range of potential geothermal district heating and cooling costs. The range of costs is determined by the characteristics of the resource, the characteristics of the demand, and the distance separating the resource and the demand. GEOCITY is a useful tool for estimating costs for each of the main parts of the production process and for determining the sensitivity of these costs to several significant parameters under a consistent set of assumptions.
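
    The abstract's "unit cost of energy which will recover all the costs of production," based on life-cycle costing and discounted cash flow, reduces to a levelized-cost calculation. The sketch below is an illustrative simplification with made-up inputs, not GEOCITY's actual cost model, which handles many more cost components.

    ```python
    def levelized_energy_cost(capital, annual_om, annual_energy_kwh,
                              discount_rate, lifetime_years):
        """Unit cost of energy ($/kWh) that recovers all production costs:
        present value of lifetime costs divided by present value of
        lifetime energy delivered (discounted cash flow)."""
        pv_costs = float(capital)          # capital spent up front (year 0)
        pv_energy = 0.0
        for year in range(1, lifetime_years + 1):
            df = (1.0 + discount_rate) ** -year
            pv_costs += annual_om * df     # discounted O&M outlays
            pv_energy += annual_energy_kwh * df
        return pv_costs / pv_energy

    # Hypothetical district heating plant: $1M capital, $100k/yr O&M,
    # 10 GWh/yr delivered, 10% discount rate, 20-year life.
    cost = levelized_energy_cost(1.0e6, 1.0e5, 1.0e7, 0.10, 20)
    ```

    Because capital is paid up front while energy is discounted, a higher discount rate raises the unit cost for capital-intensive geothermal systems, one of the sensitivities GEOCITY was built to explore.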

  11. A method to efficiently simulate the thermodynamical properties of the Fermi-Hubbard model on a quantum computer

    E-Print Network [OSTI]

    Pierre-Luc Dallaire-Demers; Frank K. Wilhelm

    2015-08-18

    Many phenomena of strongly correlated materials are encapsulated in the Fermi-Hubbard model whose thermodynamical properties can be computed from its grand canonical potential according to standard procedures. In general, there is no closed form solution for lattices of more than one spatial dimension, but solutions can be approximated with cluster perturbation theory. To model long-range effects such as order parameters, a powerful method to compute the cluster's Green's function consists in finding its self-energy through a variational principle of the grand canonical potential. This opens the possibility of studying various phase transitions at finite temperature in the Fermi-Hubbard model. However, a classical cluster solver quickly hits an exponential wall in the memory (or computation time) required to store the computation variables. Here it is shown theoretically that the cluster solver can be mapped to a subroutine on a quantum computer whose quantum memory scales as the number of orbitals in the simulated cluster. A quantum computer with a few tens of qubits could therefore simulate the thermodynamical properties of complex fermionic lattices inaccessible to classical supercomputers.
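
    The "exponential wall" versus linear quantum memory contrast in the abstract can be made concrete with a back-of-the-envelope resource count. The encoding below (one qubit per spin-orbital, as in a Jordan-Wigner-style mapping) is an assumption for illustration, not necessarily the mapping the paper uses.

    ```python
    def classical_state_vector_bytes(n_orbitals):
        """Naive classical cost: a full many-body state vector over
        2*n_orbitals spin-orbitals has 2**(2*n_orbitals) complex
        amplitudes at 16 bytes each (complex128)."""
        return 16 * 2 ** (2 * n_orbitals)

    def quantum_register_qubits(n_orbitals):
        """Quantum cost under an assumed one-qubit-per-spin-orbital
        encoding: memory grows linearly with cluster size."""
        return 2 * n_orbitals

    # A 24-orbital cluster: a few tens of qubits, but petabytes classically.
    qubits = quantum_register_qubits(24)          # 48 qubits
    mem = classical_state_vector_bytes(24)        # 16 * 2**48 bytes
    ```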

  12. A method to efficiently simulate the thermodynamical properties of the Fermi-Hubbard model on a quantum computer

    E-Print Network [OSTI]

    Pierre-Luc Dallaire-Demers; Frank K. Wilhelm

    2015-11-27

    Many phenomena of strongly correlated materials are encapsulated in the Fermi-Hubbard model whose thermodynamical properties can be computed from its grand canonical potential according to standard procedures. In general, there is no closed form solution for lattices of more than one spatial dimension, but solutions can be approximated with cluster perturbation theory. To model long-range effects such as order parameters, a powerful method to compute the cluster's Green's function consists in finding its self-energy through a variational principle of the grand canonical potential. This opens the possibility of studying various phase transitions at finite temperature in the Fermi-Hubbard model. However, a classical cluster solver quickly hits an exponential wall in the memory (or computation time) required to store the computation variables. Here it is shown theoretically that the cluster solver can be mapped to a subroutine on a quantum computer whose quantum memory scales as the number of orbitals in the simulated cluster. A quantum computer with a few tens of qubits could therefore simulate the thermodynamical properties of complex fermionic lattices inaccessible to classical supercomputers.

  13. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect (OSTI)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
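
    The sampling-based idea behind propagating an evidence-theory (Dempster-Shafer) representation can be sketched for one uncertain input: each focal element (an interval carrying a basic probability assignment) contributes its full mass to belief only if the condition holds everywhere on the interval, and to plausibility if it holds anywhere, with the interval extremes estimated by sampling rather than exact optimization. This is a deliberately minimal sketch of the general idea, not the authors' strategy; the model and focal elements are invented.

    ```python
    def belief_plausibility(model, focal_elements, threshold, n_samples=200):
        """Sampling-based Bel/Pl that model(x) <= threshold, for one input
        whose epistemic uncertainty is given as (lo, hi, mass) focal
        elements. Sampling over each interval approximates the min/max
        of model(x) that an exact optimizer would compute."""
        bel = pl = 0.0
        for lo, hi, mass in focal_elements:
            xs = [lo + (hi - lo) * i / (n_samples - 1) for i in range(n_samples)]
            ys = [model(x) for x in xs]
            if all(y <= threshold for y in ys):
                bel += mass        # condition holds on the whole interval
            if any(y <= threshold for y in ys):
                pl += mass         # condition holds somewhere on the interval
        return bel, pl

    # Hypothetical example: model(x) = x^2, condition model(x) <= 1.
    focal = [(0.0, 0.5, 0.4), (0.4, 1.2, 0.4), (1.1, 2.0, 0.2)]
    bel, pl = belief_plausibility(lambda x: x * x, focal, 1.0)
    ```

    The gap between Bel and Pl (here 0.4 versus 0.8) is exactly the less restrictive uncertainty statement that the abstract cites as evidence theory's appeal over a single probability.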

  14. Automated baseline change detection phase I. Final report

    SciTech Connect (OSTI)

    NONE

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
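    The image-subtraction core of the approach can be sketched in a few lines; the array sizes, threshold, and pixel-count criterion here are illustrative, and the sketch omits the camera repositioning and feature registration that the real ABCD system depends on:

```python
import numpy as np

def detect_change(baseline: np.ndarray, current: np.ndarray,
                  threshold: int = 30, min_pixels: int = 5) -> bool:
    """Flag a change when enough pixels differ from the baseline by more than `threshold`."""
    diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
    return int((diff > threshold).sum()) >= min_pixels

baseline = np.full((64, 64), 100, dtype=np.uint8)   # archived baseline image
dented = baseline.copy()
dented[10:14, 10:14] = 180                          # simulated dent/rust patch

assert not detect_change(baseline, baseline)        # identical images: no change
assert detect_change(baseline, dented)              # visible physical change flagged
```

    Casting to a signed type before subtracting avoids unsigned-integer wraparound, which would otherwise turn small decreases in brightness into large spurious differences.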

  15. Technical Evaluation Report "Baseline Design for the COS Aperture Plate"

    E-Print Network [OSTI]

    Colorado at Boulder, University of

    Technical Evaluation Report "Baseline Design for the COS Aperture Plate". Date: October 14, 1999. Document Number: COS-11-0009. Revision: Initial Release. Contract No.: NAS5-98043. CDRL No.: SE-05. Prepared By: J. Morse, COS Project Scientist, CU/CASA. Reviewed By: E. Wilkinson, COS Instrument Scientist.

  16. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

    SciTech Connect (OSTI)

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

    2012-08-15

    The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human-generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and, using automated data analysis techniques, were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz{sup -1} on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
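    The quoted upper limit can be sanity-checked with a one-line flux computation: an isotropic emitter of spectral power P at distance d produces a flux density S = P / (4*pi*d^2). The ~20.3 light-year distance to Gliese 581 used below is an assumption not stated in the abstract:

```python
import math

LIGHT_YEAR_M = 9.4607e15          # metres per light-year
JY = 1e-26                        # 1 jansky in W m^-2 Hz^-1

d = 20.3 * LIGHT_YEAR_M           # assumed distance to Gliese 581
power_per_hz = 7e6                # the paper's 7 MW/Hz upper limit

flux = power_per_hz / (4 * math.pi * d ** 2)   # implied flux-density threshold
flux_mjy = flux / JY * 1e3                     # in millijanskys
```

    The implied detection threshold comes out at roughly 1.5 mJy, i.e. at the milli-jansky level typical of VLBI sensitivity, which is consistent with the quoted limit.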

  17. New baseline for the magnet cooling system Yury Ivanyushenkov

    E-Print Network [OSTI]

    McDonald, Kirk

    New baseline for the magnet cooling system. Yury Ivanyushenkov, Engineering and Instrumentation Department, Rutherford Appleton Laboratory. Slides cover the liquid nitrogen cooling system conceptual points (the magnet as a stand-alone system) and the liquid nitrogen cooling system diagram (drawn by Peter Titus).

  18. Overview of the North Coast MPA Baseline Program

    E-Print Network [OSTI]

    Jaffe, Jules

    North Coast MPA Baseline Program overview: ecological and socioeconomic scope, ecosystem features, traditional ecological knowledge, and contextual information (e.g., fisheries information, physical oceanographic information). More information is given in the RFP, pp. 3-6; Baseline Program funds cannot be used to collect new contextual information.

  19. 2004 Compliance Recertification Application Performance Assessment Baseline Calculation

    E-Print Network [OSTI]

    2004 Compliance Recertification Application Performance Assessment Baseline Calculation, Revision 0. Sandia National Laboratories, Waste Isolation Pilot Plant. QA review: Mario Chavez. WIPP: 1.4.1.1:PA:QA-L:540232.

  20. Revised SRC-I project baseline. Volume 2

    SciTech Connect (OSTI)

    Not Available

    1984-01-01

    The SRC Process Area Design Baseline consists of six volumes. The first four were submitted to DOE on 9 September 1981. The fifth volume, summarizing the Category A Engineering Change Proposals (ECPs), was not submitted. The sixth volume, containing proprietary information on Kerr-McGee's Critical Solvent Deashing System, was forwarded to BRHG Synthetic Fuels, Inc. for custody, according to past instructions from DOE, and is available for perusal by authorized DOE representatives. DOE formally accepted the Design Baseline under ICRC Release ECP 4-1001, at the Project Configuration Control Board meeting in Oak Ridge, Tennessee on 5 November 1981. The documentation was then revised by Catalytic, Inc. to incorporate the Category B and C and Post-Baseline Engineering Change Proposals. Volumes I through V of the Revised Design Baseline, dated 22 October 1982, are nonproprietary and were issued to the DOE via Engineering Change Notice (ECN) 4-1 on 23 February 1983. Volume VI again contains proprietary information on Kerr-McGee's Critical Solvent Deashing System; it was issued to Burns and Roe Synthetic Fuels, Inc. Subsequently, updated process descriptions, utility summaries, and errata sheets were issued to the DOE and Burns and Roe Synthetic Fuels, Inc. on nonproprietary Engineering Change Notices 4-2 and 4-3 on 24 May 1983.

  1. Historical forest baselines reveal potential for continued carbon sequestration

    E-Print Network [OSTI]

    Mladenoff, David

    Historical forest baselines reveal potential for continued carbon sequestration. Jeanine M... ...-based studies suggest that land-use history is a more important driver of carbon sequestration in these systems ... agricultural lands are being promoted as important avenues for future carbon sequestration (8). But the degree ...

  2. An Alternative Baseline Methodology for the Power Sector

    E-Print Network [OSTI]

    An Alternative Baseline Methodology for the Power Sector - Taking a Systemic Approach. Jakob Asger ... in August 2005 to discuss the international future strategy of climate policies. Both events put our work ... process from idea to final thesis. Further, we would like to express our warm thanks to Senior Energy ...

  3. Revised SRC-I project baseline. Volume 1

    SciTech Connect (OSTI)

    Not Available

    1984-01-01

    International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

  4. Steering in computational science: mesoscale modelling and simulation. J. CHIN, J. HARTING, S. JHA, P. V. COVENEY, A. R. PORTER and S. M. PICKLES

    E-Print Network [OSTI]

    Harting, Jens

    Steering in computational science: mesoscale modelling and simulation. J. CHIN, J. HARTING, S. JHA ... steering for high performance computing applications. Lattice-Boltzmann mesoscale fluid simulations ... there is currently considerable interest in mesoscale models. These models coarse-grain most of the atomic ...

  5. MULTIFRAME DEEP NEURAL NETWORKS FOR ACOUSTIC MODELING Vincent Vanhoucke, Matthieu Devin, Georg Heigold

    E-Print Network [OSTI]

    Cortes, Corinna

    ... performance to the typical frame-synchronous model, while achieving up to a 4X reduction in the computational ... cores, or even machines [4]. To go beyond that, one might have to consider limiting the size ... Section 2 describes the baseline system and shows the performance/complexity tradeoff of a typical frame-synchronous ...

  6. Wind Turbine Modeling for Computational Fluid Dynamics: December 2010 - December 2012

    SciTech Connect (OSTI)

    Tossas, L. A. M.; Leonardi, S.

    2013-07-01

    With the shortage of fossil fuel and the increasing environmental awareness, wind energy is becoming more and more important. As the market for wind energy grows, wind turbines and wind farms are becoming larger. Current utility-scale turbines extend a significant distance into the atmospheric boundary layer. Therefore, the interaction between the atmospheric boundary layer and the turbines and their wakes needs to be better understood. The turbulent wakes of upstream turbines affect the flow field of the turbines behind them, decreasing power production and increasing mechanical loading. With a better understanding of this type of flow, wind farm developers could plan better-performing, less maintenance-intensive wind farms. Simulating this flow using computational fluid dynamics is one important way to gain a better understanding of wind farm flows. In this study, we compare the performance of actuator disc and actuator line models in producing wind turbine wakes and the wake-turbine interaction between multiple turbines. We also examine parameters that affect the performance of these models, such as grid resolution, the use of a tip-loss correction, and the way in which the turbine force is projected onto the flow field.
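    The actuator-disc idea compared in this study can be sketched in one dimension: the turbine is represented by a total thrust T = 0.5 * rho * A * C_T * U^2, smeared onto the grid with a Gaussian kernel so the force enters the flow solver smoothly. The rotor size, thrust coefficient, and smearing width below are illustrative assumptions, not values from the report:

```python
import numpy as np

# Illustrative operating point
rho, U, R, C_T = 1.225, 8.0, 40.0, 0.75   # air density, wind speed, rotor radius, thrust coeff.
A = np.pi * R ** 2                        # rotor disc area
T = 0.5 * rho * A * C_T * U ** 2          # total thrust on the flow, N

# Project the thrust onto a 1-D streamwise grid with a normalized Gaussian kernel
x = np.linspace(-200.0, 200.0, 4001)      # streamwise coordinate, m
dx = x[1] - x[0]
eps = 10.0                                # Gaussian smearing width, m
kernel = np.exp(-(x / eps) ** 2) / (eps * np.sqrt(np.pi))
body_force = T * kernel                   # distributed body force, N per metre

# the projected force should integrate back to (approximately) the total thrust
assert abs(body_force.sum() * dx - T) / T < 1e-3
```

    The same Gaussian projection is what makes grid resolution and smearing width interact: too small a width relative to the grid spacing produces the numerical oscillations such studies examine.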

  7. Computer modelling of the reduction of rare earth dopants in barium aluminate

    SciTech Connect (OSTI)

    Rezende, Marcos V. dos S; Valerio, Mario E.G. [Department of Physics, Federal University of Sergipe, 49100-000 Sao Cristovao, SE (Brazil); Jackson, Robert A., E-mail: r.a.jackson@chem.keele.ac.uk [School of Physical and Geographical Sciences, Keele University, Keele, Staffordshire ST5 5BG (United Kingdom)

    2011-08-15

    Long lasting phosphorescence in barium aluminates can be achieved by doping with rare earth ions in divalent charge states. The rare earth ions are initially in a trivalent charge state, but are reduced to a divalent charge state before being doped into the material. In this paper, the reduction of trivalent rare earth ions in the BaAl{sub 2}O{sub 4} lattice is studied by computer simulation, with the energetics of the whole reduction and doping process being modelled by two methods, one based on single ion doping and one which allows dopant concentrations to be taken into account. A range of different reduction schemes are considered and the most energetically favourable schemes identified. Graphical abstract: The doping and subsequent reduction of a rare earth ion into the barium aluminate lattice. Highlights: The doping of barium aluminate with rare earth ions reduced in a range of atmospheres has been modelled. The overall solution energy for the doping process for each ion in each reducing atmosphere is calculated using two methods. The lowest energy reduction process is predicted and compared with experimental results.

  8. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

    SciTech Connect (OSTI)

    Maranas, Costas D

    2012-05-21

    An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism in not just skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.

  9. Modeling a Reconfigurable System for Computing the FFT in Place via Rewriting Logic

    E-Print Network [OSTI]

    Ayala-Rincón, Mauricio

    ... hardware implementation of the Fast Fourier Transform (FFT) using rewriting logic. It is shown ... on general purpose processors, reconfigurable computing delivers more processing power due ... electronic market. There are several taxonomies applied to reconfigurable computing. Concerning the specific ...

  10. Development of Apple Workgroup Cluster and Parallel Computing for Phase Field Model of Magnetic Materials 

    E-Print Network [OSTI]

    Huang, Yongxin

    2010-01-16

    ferromagnets, there exist two main challenges: the complicated microelasticity due to the magnetostrictive strain, and very expensive computation mainly caused by the calculation of long-range magnetostatic and elastic interactions. A parallel computing...

  11. A Computational Model which Learns to Selectively Attend in Category Learning

    E-Print Network [OSTI]

    Cottrell, Garrison W.

    Garrison W. Cottrell (lingyun,gary@cs.ucsd.edu), UCSD Computer Science and Engineering, 9500 Gilman Dr., La ...

  12. Computational Nanophotonics: Model Optical Interactions and Transport in Tailored Nanosystem Architectures

    SciTech Connect (OSTI)

    Stockman, Mark [Georgia State University Research Foundation]; Gray, Steven [Argonne National Laboratory]

    2014-02-21

    The program is directed toward the development of new computational approaches to photoprocesses in nanostructures whose geometry and composition are tailored to obtain desirable optical responses. The emphasis of this specific program is on the development of computational methods and on the prediction and theory of new phenomena of optical energy transfer and transformation on the extreme nanoscale (down to a few nanometers).

  13. Modeling and Simulation Environment for Photonic Interconnection Networks in High Performance Computing

    E-Print Network [OSTI]

    Bergman, Keren

    at the scale of high performance computer clusters and warehouse scale data centers, system level simulations and results for rack scale photonic interconnection networks for high performance computing. Keywords: optical to the newsworthy power consumption [3], latency [4] and bandwidth challenges [5] of high performance computing (HPC

  14. Gas-Phase Lubrication of ta-C by Glycerol and Hydrogen Peroxide. Experimental and Computer Modeling

    E-Print Network [OSTI]

    Goddard III, William A.

    Gas-Phase Lubrication of ta-C by Glycerol and Hydrogen Peroxide. Experimental and Computer Modeling lubrication conditions at 80 °C in presence of OH-containing molecules. To understand the mechanism of ultralow friction, we performed gas-phase lubrication experiments followed by time-of-flight secondary ion

  15. PHYSICAL REVIEW A 87, 032341 (2013) Simulating the transverse Ising model on a quantum computer: Error correction

    E-Print Network [OSTI]

    Geller, Michael R.

    2013-01-01

    ... code. Section III maps the calculation of the ground-state energy for the TIM onto a quantum phase ... in the transverse Ising model (TIM) [12], there is a large number of physical qubits and lengthy computational time ... Here we investigate the quantum simulation of the TIM ground-state energy on a surface code quantum ...

  16. Integrated Baseline System (IBS) Version 2.0: User guide

    SciTech Connect (OSTI)

    Bower, J.C. [Bower Software Services, Kennewick, WA (United States); Burford, M.J.; Downing, T.R.; Matsumoto, S.W.; Schrank, E.E.; Williams, J.R.; Winters, C.; Wood, B.M.

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the Federal Emergency Management Agency. This User Guide explains how to start and use the IBS Program, which is designed to help civilian emergency management personnel to plan for and support their responses to a chemical-releasing event at a military chemical stockpile. The intended audience for this document is all users of the IBS, especially emergency management planners and analysts.

  17. Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop

    SciTech Connect (OSTI)

    Musial, W.; Lawson, M.; Rooney, S.

    2013-02-01

    The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9-10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community and collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts and discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest how the U.S. Department of Energy and national laboratory resources can be utilized to most effectively assist the marine energy industry.

  19. Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    E-Print Network [OSTI]

    James P. Crutchfield; Christopher J. Ellison; Ryan G. James; John R. Mahoney

    2010-07-30

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined by both the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
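    The block-entropy convergence the abstract builds on can be sketched numerically: estimate the Shannon block entropy H(L) from a sample sequence and look at the entropy gain h(L) = H(L) - H(L-1), which converges to the process's entropy rate. The period-2 process below is an illustrative choice, not one of the paper's examples:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks estimated from a sequence."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

seq = [0, 1] * 5000                      # period-2 process: one bit of phase, zero entropy rate
H = [block_entropy(seq, L) for L in (1, 2, 3)]
gains = [H[0]] + [H[i] - H[i - 1] for i in (1, 2)]
# For a period-2 process H(L) saturates at 1 bit, so the gains fall toward 0,
# and the saturation point is where an observer becomes synchronized.
```

    The state-block and block-state entropies the paper introduces refine this picture by tracking how quickly the observer's uncertainty about the hidden state, not just the symbols, decays.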

  20. Overview of Computer-Aided Engineering of Batteries and Introduction to Multi-Scale, Multi-Dimensional Modeling of Li-Ion Batteries (Presentation)

    SciTech Connect (OSTI)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.; Lee, K. J.

    2012-05-01

    This 2012 Annual Merit Review presentation gives an overview of the Computer-Aided Engineering of Batteries (CAEBAT) project and introduces the Multi-Scale, Multi-Dimensional model for modeling lithium-ion batteries for electric vehicles.

  1. Data-Driven Optimization for Modeling in Computer Graphics and Vision

    E-Print Network [OSTI]

    Yu, Lap Fai

    2013-01-01

    ... the recent trend of 3D printing in computer-aided design. ... 3D display, and 3D printing have made these technologies ...

  3. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

    SciTech Connect (OSTI)

    M.F. Simpson; K.-R. Kim

    2010-12-01

    In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL). And Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in trouble-shooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally-light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.

  4. From International Computer Performance and Dependability Symposium, Erlangen, Germany, April 1995, pp.285 294 MODELING RECYCLE: A CASE STUDY IN THE INDUSTRIAL USE OF

    E-Print Network [OSTI]

    Illinois at Urbana-Champaign, University of

    From International Computer Performance and Dependability Symposium, Erlangen, Germany, April 1995, pp. 285-294. MODELING RECYCLE: A CASE STUDY IN THE INDUSTRIAL USE OF MEASUREMENT AND MODELING. Luai M... Center for Reliable and High-Performance Computing, Coordinated Science Laboratory, University ...

  5. A COMPUTER SIMULATION MODEL FOR THE BLOW-BLOW FORMING PROCESS OF GLASS C. G. Giannopapa J. A. W. M. Groot

    E-Print Network [OSTI]

    Eindhoven, Technische Universiteit

    A COMPUTER SIMULATION MODEL FOR THE BLOW-BLOW FORMING PROCESS OF GLASS CONTAINERS. C. G... ... for industrial purposes. To achieve this, both steps of the forming of glass containers, namely blow-blow, ... needs ... conditions. In [1] the development of a computer model to be used for glass blowing was described ...

  6. JLab High Efficiency Klystron Baseline Design for 12 GeV Upgrade

    SciTech Connect (OSTI)

    Hovater, J.; Delayen, Jean; Harwood, Leigh; Nelson, Richard; Wang, Haipeng

    2003-05-01

    A computer design of a 13.5 kW, 1497 MHz, CW, 55% efficiency, 0.8 microperveance, ~40 dB gain, 5-cavity klystron has been developed for the JLab 12 GeV Upgrade project. The design uses TRICOMP codes to simulate the gun, mod-anode section, solenoid focusing channel, and beam dump. The klystron tube was designed initially with the JPNDISK (1D) code and then optimized with the MASK (2D) code for the baseline parameters. All of these codes have been benchmarked against JLab's 5 kW operational klystrons. Details of the design parameters and the MAFIA (3D) simulations of the cavity couplings, tuners, and window are also presented.

  7. User's guide for SAMMY: a computer model for multilevel r-matrix fits to neutron data using Bayes' equations

    SciTech Connect (OSTI)

    Larson, N. M.; Perey, F. G.

    1980-11-01

    A method is described for determining the parameters of a model from experimental data, based upon the utilization of Bayes' theorem. This method has several advantages over the least-squares method as it is commonly used; one important advantage is that the assumptions under which the parameter values have been determined are more clearly evident than in many results based upon least squares. Bayes' method has been used to develop a computer code which can be utilized to analyze neutron cross-section data by means of R-matrix theory. The required formulae from R-matrix theory are presented, and the computer implementation of both Bayes' equations and R-matrix theory is described. Details about the computer code and complete input/output information are given.
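    For the linearized case, the Bayes'-equations update the abstract describes takes a standard closed form: with prior parameters p0 and covariance M, sensitivity matrix G, and data d with covariance V, the posterior is P = (M^-1 + G^T V^-1 G)^-1 and p = p0 + P G^T V^-1 (d - G p0). The notation and numbers below are generic illustrations, not SAMMY's:

```python
import numpy as np

def bayes_update(p0, M, G, d, V):
    """One linearized Bayesian update of parameters p0 against data d."""
    Minv = np.linalg.inv(M)
    Vinv = np.linalg.inv(V)
    P = np.linalg.inv(Minv + G.T @ Vinv @ G)          # posterior covariance
    p = p0 + P @ G.T @ Vinv @ (d - G @ p0)            # posterior parameters
    return p, P

p0 = np.array([0.0])
M = np.array([[4.0]])              # weak prior on a single parameter
G = np.array([[1.0], [1.0]])       # two direct measurements of that parameter
d = np.array([1.0, 1.2])
V = np.eye(2) * 0.01               # precise data

p, P = bayes_update(p0, M, G, d, V)
```

    Because the prior covariance M enters explicitly, the assumptions behind the fitted values are visible in the update itself, which is the advantage over plain least squares the abstract highlights.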

  8. Coupling Multi-Component Models with MPH on Distributed Memory Computer Architectures

    E-Print Network [OSTI]

    He, Yun; Ding, Chris

    2005-01-01

    Among these, NASA's Earth System Modeling Framework (ESMF) ... to facilitate coupling earth system model components and to ...

  9. MODELING STRATEGIES TO COMPUTE NATURAL CIRCULATION USING CFD IN A VHTR AFTER A LOFA

    SciTech Connect (OSTI)

    Yu-Hsin Tung; Richard W. Johnson; Ching-Chang Chieng; Yuh-Ming Ferng

    2012-11-01

    A prismatic gas-cooled very high temperature reactor (VHTR) is being developed under the next generation nuclear plant program (NGNP) of the U.S. Department of Energy, Office of Nuclear Energy. In the design of the prismatic VHTR, hexagonal shaped graphite blocks are drilled to allow insertion of fuel pins, made of compacted TRISO fuel particles, and coolant channels for the helium coolant. One of the concerns for the reactor design is the effects of a loss of flow accident (LOFA) where the coolant circulators are lost for some reason, causing a loss of forced coolant flow through the core. In such an event, it is desired to know what happens to the (reduced) heat still being generated in the core and if it represents a problem for the fuel compacts, the graphite core or the reactor vessel (RV) walls. One of the mechanisms for the transport of heat out of the core is by the natural circulation of the coolant, which is still present. That is, how much heat may be transported by natural circulation through the core and upwards to the top of the upper plenum? It is beyond current capability for a computational fluid dynamic (CFD) analysis to perform a calculation on the whole RV with a sufficiently refined mesh to examine the full potential of natural circulation in the vessel. The present paper reports the investigation of several strategies to model the flow and heat transfer in the RV. It is found that it is necessary to employ representative geometries of the core to estimate the heat transfer. However, by taking advantage of global and local symmetries, a detailed estimate of the strength of the resulting natural circulation and the level of heat transfer to the top of the upper plenum is obtained.

  10. Toward an Optimal Position for IVC Filters: Computational Modeling of the Impact of Renal Vein Inflow

    SciTech Connect (OSTI)

    Wang, S L; Singer, M A

    2009-07-13

    The purpose of this report is to evaluate the hemodynamic effects of renal vein inflow and filter position on unoccluded and partially occluded IVC filters using three-dimensional computational fluid dynamics. Three-dimensional models of the TrapEase and Gunther Celect IVC filters, spherical thrombi, and an IVC with renal veins were constructed. Hemodynamics of steady-state flow was examined for unoccluded and partially occluded TrapEase and Gunther Celect IVC filters in varying proximity to the renal veins. Flow past the unoccluded filters demonstrated minimal disruption. Natural regions of stagnant/recirculating flow in the IVC are observed superior to the bilateral renal vein inflows, and high flow velocities and elevated shear stresses are observed in the vicinity of renal inflow. Spherical thrombi induce stagnant and/or recirculating flow downstream of the thrombus. Placement of the TrapEase filter in the suprarenal vein position resulted in a large area of low shear stress/stagnant flow within the filter just downstream of thrombus trapped in the upstream trapping position. Filter position with respect to renal vein inflow influences the hemodynamics of filter trapping. Placement of the TrapEase filter in a suprarenal location may be thrombogenic with redundant areas of stagnant/recirculating flow and low shear stress along the caval wall due to the upstream trapping position and the naturally occurring region of stagnant flow from the renal veins. Infrarenal vein placement of IVC filters in a near juxtarenal position with the downstream cone near the renal vein inflow likely confers increased levels of mechanical lysis of trapped thrombi due to increased shear stress from renal vein inflow.

  11. 200-UP-2 Operable Unit technical baseline report

    SciTech Connect (OSTI)

    Deford, D.H.

    1991-02-01

    This report was prepared in support of the development of a Remedial Investigation/Feasibility Study (RI/FS) Work Plan for the 200-UP-2 Operable Unit by EBASCO Environmental, Incorporated. It provides a technical baseline of the 200-UP-2 Operable Unit and presents results from an environmental investigation undertaken by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The 200-UP-2 Operable Unit Technical Baseline Report is based on a review and evaluation of numerous current and historical Hanford Site reports, drawings, and photographs, supplemented with Hanford Site inspections and employee interviews. No field investigations or sampling were conducted. Each waste site in the 200-UP-2 Operable Unit is described separately. Close relationships between waste units, such as overflow from one to another, are also discussed. The 200-UP-2 Operable Unit consists of liquid-waste disposal sites in the vicinity of, and related to, U Plant operations in the 200 West Area of the Hanford Site. The "U Plant" refers to the 221-U Process Canyon Building, a chemical separations facility constructed during World War II. It also includes the Uranium Oxide (UO{sub 3}) Plant, which was constructed at the same time and, like the 221-U Process Canyon Building, was later converted for other missions. Waste sites in the 200-UP-2 Operable Unit are associated with the U Plant Uranium Metal Recovery Program mission, which ran from 1952 to 1958, and with the UO{sub 3} Plant's ongoing uranium oxide mission; they include one or more cribs, reverse wells, french drains, septic tanks and drain fields, trenches, catch tanks, settling tanks, diversion boxes, waste vaults, and the lines and encasements that connect them. 11 refs., 1 tab.

  12. A Computational Model Incorporating Neural Stem Cell Dynamics Reproduces Glioma Incidence across the Lifespan in the Human Population

    E-Print Network [OSTI]

    Bauer, Roman; Stoll, Elizabeth

    2015-01-01

    Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert...
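
    The kind of model this abstract describes can be sketched as a toy multistage simulation: stem cells accumulate driver mutations over repeated divisions while the stem-cell pool itself shrinks with age, which is what lets incidence peak and then decline. All parameter values below are illustrative guesses, not the authors' empirically derived estimates:

```python
import random

def simulate_incidence(n_cells=20_000, divisions_per_year=8,
                       mutation_rate=8e-3, drivers_needed=4,
                       attrition=0.02, max_age=100, seed=1):
    """Toy multistage model in the spirit of the paper: each active neural
    stem cell may acquire a driver mutation per division; transformation
    requires `drivers_needed` hits; the pool shrinks with age (attrition).
    Returns new transformations (incident cases) per year of age."""
    rng = random.Random(seed)
    cells = [0] * n_cells              # driver-mutation count per active cell
    incidence = [0] * (max_age + 1)
    for age in range(1, max_age + 1):
        survivors = []
        for m in cells:
            if rng.random() < attrition:   # age-related loss of stem cells
                continue
            for _ in range(divisions_per_year):
                if rng.random() < mutation_rate:
                    m += 1
            if m >= drivers_needed:
                incidence[age] += 1        # cell transforms this year
            else:
                survivors.append(m)
        cells = survivors
    return incidence

inc = simulate_incidence(n_cells=3000, max_age=90)
print(f"peak simulated incidence at age {inc.index(max(inc))}")
```

The rising phase comes from mutation accumulation and the late-life decline from pool attrition; reproducing the observed human peak near age 80 would require fitting these rates to the empirical stem-cell data the paper cites.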

  13. Vietnam-Danish Government Baseline Workstream | Open Energy Information

    Open Energy Info (EERE)


  14. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

    SciTech Connect (OSTI)

    Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

  15. Integrated Baseline System (IBS) Version 1.03: Utilities guide

    SciTech Connect (OSTI)

    Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

    1993-01-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

  16. PROSPECT - A Precision Reactor Oscillation and Spectrum Experiment at Short Baselines

    E-Print Network [OSTI]

    J. Ashenfelter; A. B. Balantekin; H. R. Band; G. Barclay; C. Bass; N. S. Bowden; C. D. Bryan; J. J. Cherwinka; R. Chu; T. Classen; D. Davee; D. Dean; G. Deichert; M. Diwan; M. J. Dolinski; J. Dolph; D. A. Dwyer; Y. Efremenko; S. Fan; A. Galindo-Uribarri; K. Gilje; A. Glenn; M. Green; K. Han; S. Hans; K. M. Heeger; B. Heffron; L. Hu; P. Huber; D. E. Jaffe; Y. Kamyshkov; S. Kettell; C. Lane; T. J. Langford; B. R. Littlejohn; D. Martinez; R. D. McKeown; M. P. Mendenhall; S. Morrell; P. Mueller; H. P. Mumm; J. Napolitano; J. S. Nico; D. Norcini; D. Pushin; X. Qian; E. Romero; R. Rosero; B. S. Seilhan; R. Sharma; P. T. Surukuchi; S. J. Thompson; R. L. Varner; B. Viren; W. Wang; B. White; C. White; J. Wilhelmi; C. Williams; R. E. Williams; T. Wise; H. Yao; M. Yeh; N. Zaitseva; C. Zhang; X. Zhang

    2015-01-27

    Current models of antineutrino production in nuclear reactors predict detection rates and spectra at odds with the existing body of direct reactor antineutrino measurements. High-resolution antineutrino detectors operated close to compact research reactor cores can produce new precision measurements useful in testing explanations for these observed discrepancies involving underlying nuclear or new physics. Absolute measurement of the 235U-produced antineutrino spectrum can provide additional constraints for evaluating the accuracy of current and future reactor models, while relative measurements of spectral distortion between differing baselines can be used to search for oscillations arising from the existence of eV-scale sterile neutrinos. Such a measurement can be performed in the United States at several highly-enriched uranium fueled research reactors using near-surface segmented liquid scintillator detectors. We describe here the conceptual design and physics potential of the PROSPECT experiment, a U.S.-based, multi-phase experiment with reactor-detector baselines of 7-20 meters capable of addressing these and other physics and detector development goals. Current R&D status and future plans for PROSPECT detector deployment and data-taking at the High Flux Isotope Reactor at Oak Ridge National Laboratory will be discussed.
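
    The baseline-dependent spectral distortion that PROSPECT searches for follows the standard two-flavor short-baseline survival probability, P = 1 - sin^2(2*theta_14) * sin^2(1.27 * dm^2 * L / E). The sketch below uses illustrative mixing parameters, not PROSPECT's fitted or excluded values:

```python
import math

def survival_probability(L_m, E_MeV, sin2_2theta=0.1, dm2_eV2=1.0):
    """Two-flavor electron-antineutrino survival probability for an
    eV-scale sterile neutrino. L in meters, E in MeV, dm^2 in eV^2;
    1.27 is the standard unit-conversion constant. The default mixing
    parameters are illustrative assumptions."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

# Relative distortion between the near (7 m) and far (20 m) ends of the
# PROSPECT baseline range, across typical reactor antineutrino energies:
for E in (2.0, 4.0, 6.0):
    print(f"E = {E} MeV: P(7 m) = {survival_probability(7.0, E):.4f}, "
          f"P(20 m) = {survival_probability(20.0, E):.4f}")
```

Because the oscillation phase depends on L/E, comparing spectra between baselines cancels much of the absolute reactor-flux uncertainty, which is why the experiment emphasizes relative measurements across its 7-20 m range.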

  17. DEVELOPING A NEW APPROACH OF COMPUTER USE `KISS MODELING' FOR DESIGN-IDEAS ALTERNATIVES OF FORM

    E-Print Network [OSTI]

    Wael ...; Kobayashi, Yoshihiro (South Valley University, Faculty of Fine Arts at Luxor, Egypt). ... form generation through computational power is more prominent in the two dimensions than the three ...

  18. Computational Model of Forward and Opposed Smoldering Combustion with Improved Chemical Kinetics 

    E-Print Network [OSTI]

    Rein, Guillermo

    A computational study has been carried out to investigate smoldering ignition and propagation in polyurethane foam. The one-dimensional, transient governing equations for smoldering combustion in a porous fuel are solved ...

  19. Parallel Computational Modelling of Inelastic Neutron Scattering in Multi-node and Multi-core Architectures 

    E-Print Network [OSTI]

    Garba, M.T.; Gonzales-Velez, H.; Roach, D.L.

    2010-11-26

    This paper examines the initial parallel implementation of SCATTER, a computationally intensive inelastic neutron scattering routine with polycrystalline averaging capability, for the General Utility Lattice Program (GULP). Of particular importance...

  20. 100 IEEE TRANSACTIONS ON COMPUTERS, VOL. 45, NO. 10, OCTOBER 1996 An Analytical Model

    E-Print Network [OSTI]

    Mudge, Trevor

    ... to spend money is the cheapest (rather than the fastest) cache level, particularly with small system ... are with the Advanced Computer Architecture Laboratory, Department of Electrical Engineering and Computer Science