National Library of Energy BETA

Sample records for algorithm theoretical basis

  1. Theoretical Basis for the Design of a DWPF Evacuated Canister

    SciTech Connect (OSTI)

    Routt, K.R.

    2001-09-17

    This report provides the theoretical bases for use of an evacuated canister for draining a glass melter. Design recommendations are also presented to ensure satisfactory performance in future tests of the concept.

  2. A Decision Theoretic Approach to Evaluate Radiation Detection Algorithms

    SciTech Connect (OSTI)

    Nobles, Mallory A.; Sego, Landon H.; Cooley, Scott K.; Gosink, Luke J.; Anderson, Richard M.; Hays, Spencer E.; Tardiff, Mark F.

    2013-07-01

    There are a variety of sensor systems deployed at U.S. border crossings and ports of entry that scan for illicit nuclear material. In this work, we develop a framework for comparing the performance of detection algorithms that interpret the output of these scans and determine when secondary screening is needed. We optimize each algorithm to minimize its risk, or expected loss. We measure an algorithm's risk by considering its performance over a sample, the probability distribution of threat sources, and the consequence of detection errors. While it is common to optimize algorithms by fixing one error rate and minimizing another, our framework allows one to simultaneously consider multiple types of detection errors. Our framework is flexible and easily adapted to many different assumptions regarding the probability of a vehicle containing illicit material and the relative consequences of false positive and false negative errors. Our methods can therefore inform decision makers of the algorithm family and parameter values that best reduce the threat from illicit nuclear material, given their understanding of the environment at any point in time. To illustrate the applicability of our methods, we compare the risk from two families of detection algorithms and discuss the policy implications of our results.
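    The risk criterion the abstract describes can be sketched numerically. Everything below is illustrative: the Gaussian score models, the threat prior, and the costs are invented stand-ins, not the paper's actual distributions or detectors.

```python
import numpy as np

def expected_risk(p_threat, p_miss, p_false_alarm, cost_fn, cost_fp):
    """Bayes risk: each error rate weighted by its prior probability and cost."""
    return p_threat * p_miss * cost_fn + (1.0 - p_threat) * p_false_alarm * cost_fp

def error_rates(threshold, n=100_000, seed=0):
    """Error rates of a hypothetical detector that thresholds a scalar alarm
    score; the Gaussian score distributions are invented for illustration."""
    rng = np.random.default_rng(seed)
    benign = rng.normal(0.0, 1.0, n)   # scores of benign vehicles
    threat = rng.normal(2.0, 1.0, n)   # scores of threat sources
    return np.mean(threat <= threshold), np.mean(benign > threshold)

# Minimize risk over the threshold instead of fixing one error rate.
p_threat, cost_fn, cost_fp = 1e-4, 1e6, 1.0   # illustrative prior and costs
thresholds = np.linspace(-2.0, 6.0, 161)
risks = [expected_risk(p_threat, *error_rates(t), cost_fn, cost_fp)
         for t in thresholds]
best = float(thresholds[int(np.argmin(risks))])
```

    Because the false-negative cost dominates here, the risk-minimizing threshold lands well below the point that would balance the two raw error rates.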

  3. Basis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    we only need data from a small neighborhood around each point of interest. The new method retains many of the advantages of the existing basis...

  4. Theoretical Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    HEP Theoretical Physics Understanding discoveries at the Energy, Intensity, and Cosmic ... HEP Theory at Los Alamos The Theoretical High Energy Physics group at Los Alamos National ...

  5. Theoretical Division

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Dynamics and Solid Mechanics Physics of Condensed Matter and Complex Systems Applied Mathematics and Plasma Physics Theoretical Biology and Biophysics Contacts Division Leader...

  6. Theoretical Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    HEP Theoretical Physics Understanding discoveries at the Energy, Intensity, and Cosmic Frontiers Get Expertise Rajan Gupta (505) 667-7664 Email Bruce Carlsten (505) 667-5657 Email HEP Theory at Los Alamos The Theoretical High Energy Physics group at Los Alamos National Laboratory is active in a number of diverse areas of research. Their primary areas of interest are in physics beyond the Standard Model, cosmology, dark matter, lattice quantum chromodynamics, neutrinos, the fundamentals of ...

  7. Improved multiprocessor garbage collection algorithms

    SciTech Connect (OSTI)

    Newman, I.A.; Stallard, R.P.; Woodward, M.C.

    1983-01-01

    Outlines the results of an investigation of existing multiprocessor garbage collection algorithms and introduces two new algorithms which significantly improve some aspects of the performance of their predecessors. The two algorithms arise from different starting assumptions. One considers the case where the algorithm will terminate successfully whatever list structure is being processed and assumes that the extra data space should be minimised. The other seeks a very fast garbage collection time for list structures that do not contain loops. Results of both theoretical and experimental investigations are given to demonstrate the efficacy of the algorithms. 7 references.

  8. An algorithm for nonrelativistic quantum-mechanical finite-nuclear-mass variational calculations of the nitrogen atom in L = 0, M = 0 states using all-electron explicitly correlated Gaussian basis functions

    SciTech Connect (OSTI)

    Sharkey, Keeper L.; Adamowicz, Ludwik; Department of Physics, University of Arizona, Tucson, Arizona 85721

    2014-05-07

    An algorithm for quantum-mechanical nonrelativistic variational calculations of L = 0 and M = 0 states of atoms with an arbitrary number of s electrons and with three p electrons has been implemented and tested in calculations of the ground ⁴S state of the nitrogen atom. The spatial part of the wave function is expanded in terms of all-electron explicitly correlated Gaussian functions with the appropriate pre-exponential Cartesian angular factors for states with L = 0 and M = 0 symmetry. The algorithm includes formulas for calculating the Hamiltonian and overlap matrix elements, as well as formulas for calculating the analytic energy gradient with respect to the Gaussian exponential parameters. The gradient is used in the variational optimization of these parameters. The Hamiltonian used in the approach is obtained by rigorously separating the center-of-mass motion from the laboratory-frame all-particle Hamiltonian, and thus it explicitly depends on the finite mass of the nucleus. With that, the mass effect on the total ground-state energy is determined.

  9. Safety Basis Report

    SciTech Connect (OSTI)

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  10. Technical Planning Basis

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2007-07-11

    The Guide assists DOE/NNSA field elements and operating contractors in identifying and analyzing hazards at facilities and sites to provide the technical planning basis for emergency management programs. Supersedes DOE G 151.1-1, Volume 2.

  11. Algorithms for builder guidelines

    SciTech Connect (OSTI)

    Balcomb, J.D.; Lekov, A.B.

    1989-06-01

    The Builder Guidelines are designed to make simple, appropriate guidelines available to builders for their specific localities. Builders may select from passive solar and conservation strategies with different performance potentials. They can then compare the calculated results for their particular house design with a typical house in the same location. Algorithms used to develop the Builder Guidelines are described. The main algorithms used are the monthly solar load ratio (SLR) method for winter heating, the diurnal heat capacity (DHC) method for temperature swing, and a new simplified calculation method (McCool) for summer cooling. This paper applies the algorithms to estimate the performance potential of passive solar strategies, and the annual heating and cooling loads of various combinations of conservation and passive solar strategies. The basis of the McCool method is described. All three methods are implemented in a microcomputer program used to generate the guideline numbers. Guidelines for Denver, Colorado, are used to illustrate the results. The structure of the guidelines and worksheet booklets is also presented. 5 refs., 3 tabs.
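    The month-by-month SLR bookkeeping can be illustrated with a toy calculation. The correlation below (1 − exp(−a·SLR) with a placeholder coefficient) and the monthly numbers are hypothetical; the published method fits system-specific correlation coefficients rather than using a single formula.

```python
import math

def monthly_slr(solar_gain_kwh, heat_load_kwh):
    """Solar load ratio: monthly solar gain divided by monthly heating load."""
    return solar_gain_kwh / heat_load_kwh

def solar_savings_fraction(slr, a=0.7):
    """Illustrative SLR-to-savings correlation; a=0.7 is a placeholder,
    not a coefficient from the Builder Guidelines."""
    return 1.0 - math.exp(-a * slr)

def annual_auxiliary_heat(monthly_loads, monthly_gains):
    """Auxiliary (non-solar) heat summed over the heating season."""
    aux = 0.0
    for load, gain in zip(monthly_loads, monthly_gains):
        ssf = solar_savings_fraction(monthly_slr(gain, load))
        aux += load * (1.0 - ssf)
    return aux

loads = [1200, 1000, 900, 600, 300]   # kWh per month, hypothetical
gains = [400, 500, 700, 600, 400]     # kWh per month, hypothetical
aux = annual_auxiliary_heat(loads, gains)
```
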

  12. Theoretical manual for DYNA3D

    SciTech Connect (OSTI)

    Hallquist, J.O.

    1983-03-01

    This report provides a theoretical manual for DYNA3D, a vectorized explicit three-dimensional finite element code for analyzing the large deformation dynamic response of inelastic solids. A contact-impact algorithm that permits gaps and sliding along material interfaces is described. By a specialization of this algorithm, such interfaces can be rigidly tied to admit variable zoning without the need of transition regions. Spatial discretization is achieved by the use of 8-node solid elements, and the equations-of-motion are integrated by the central difference method. DYNA3D is operational on the CRAY-1 and CDC7600 computers.

  13. Library of Continuation Algorithms

    Energy Science and Technology Software Center (OSTI)

    2005-03-01

    LOCA (Library of Continuation Algorithms) is scientific software written in C++ that provides advanced analysis tools for nonlinear systems. In particular, it provides parameter continuation algorithms, bifurcation tracking algorithms, and drivers for linear stability analysis. The algorithms are aimed at large-scale applications that use Newton’s method for their nonlinear solve.
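    Parameter continuation of the kind LOCA automates can be sketched in a few lines: sweep the parameter and reuse each converged state as the next initial guess, with a plain Newton corrector. This is a minimal sketch on a toy scalar equation, not LOCA's API, and it does no bifurcation tracking or stability analysis.

```python
import numpy as np

def newton(f, jac, x0, tol=1e-10, max_iter=50):
    """Plain Newton iteration for f(x) = 0."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        step = np.linalg.solve(np.atleast_2d(jac(x)), -np.atleast_1d(f(x)))
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

def natural_continuation(f, jac, x0, lambdas):
    """Natural-parameter continuation: each converged solution seeds the
    next solve (fails near folds, where pseudo-arclength is needed)."""
    branch, x = [], x0
    for lam in lambdas:
        x = newton(lambda y: f(y, lam), lambda y: jac(y, lam), x)
        branch.append((lam, x.copy()))
    return branch

# Toy problem: x^3 - x - lambda = 0, following the branch from x = 1.5.
f = lambda x, lam: x**3 - x - lam
jac = lambda x, lam: 3.0 * x**2 - 1.0
branch = natural_continuation(f, jac, np.array([1.5]), np.linspace(1.0, 3.0, 21))
```
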

  14. Exploratory Development of Theoretical Methods | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Exploratory Development of Theoretical Methods Research Personnel Updates Publications Calculating Plutonium and Praseodymium Structural Transformations Genetic Algorithm for Grain Boundary and Crystal Structure Predictions Universal Dynamical Decoupling of a Single Solid-state Spin from a Spin Bath Modeling The purpose of this FWP is to generate new theories, models, and algorithms that will be beneficial to the research programs at the Ames ...

  15. ALGORITHM FOR ACCNT

    Energy Science and Technology Software Center (OSTI)

    002651IBMPC00 Algorithm for Accounting for the Interactions of Multiple Renewable Energy Technologies in Estimation of Annual Performance

  16. Basis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    function multifield bispectral deconvolution analysis D. A. Baver and P. W. Terry Department of Physics, University of Wisconsin, Madison, Wisconsin 53706 (Received 8 September 2004; accepted 8 December 2004; published online 24 March 2005) A different procedure for calculating linear and nonlinear coefficients of model systems for fully developed turbulence is derived. This procedure can be applied to systems with multiple interacting fields; in the single-field case the linear coefficients

  17. CRAD, NNSA- Safety Basis (SB)

    Broader source: Energy.gov [DOE]

    CRAD for Safety Basis (SB). Criteria Review and Approach Documents (CRADs) that can be used to conduct a well-organized and thorough assessment of elements of safety and health programs.

  18. Basis functions for electronic structure calculations on spheres

    SciTech Connect (OSTI)

    Gill, Peter M. W.; Loos, Pierre-François; Agboola, Davids

    2014-12-28

    We introduce a new basis function (the spherical Gaussian) for electronic structure calculations on spheres of any dimension D. We find general expressions for the one- and two-electron integrals and propose an efficient computational algorithm incorporating the Cauchy-Schwarz bound. Using numerical calculations for the D = 2 case, we show that spherical Gaussians are more efficient than spherical harmonics when the electrons are strongly localized.
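    The Cauchy-Schwarz bound mentioned above can be demonstrated on any positive semidefinite "integral matrix" over compound indices: |(p|q)| ≤ sqrt((p|p))·sqrt((q|q)). The random Gram matrix below is a stand-in for the true two-electron integrals, and the screening threshold is arbitrary; neither comes from the paper.

```python
import numpy as np

# Build a PSD matrix M[p, q] that plays the role of (p|q) over pair indices.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 8))
M = B @ B.T                          # PSD by construction, like a true ERI matrix

# Schwarz bound for every (p, q) pair from the diagonal elements alone.
d = np.sqrt(np.diag(M))
bound = np.outer(d, d)
violations = int(np.sum(np.abs(M) > bound + 1e-10))   # should be zero

# Screening: only pairs whose bound exceeds a threshold need evaluation.
tau = 2.0                            # arbitrary illustrative threshold
evaluate = bound > tau
skipped_fraction = 1.0 - evaluate.mean()
```

    Because the bound needs only the diagonal "self-integrals", cheap screening can rule out most integral evaluations before any expensive work is done.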

  19. The Basis Code Development System

    Energy Science and Technology Software Center (OSTI)

    1994-03-15

    BASIS9.4 is a system for developing interactive computer programs in Fortran, with some support for C and C++ as well. Using BASIS9.4 you can create a program that has a sophisticated programming language as its user interface so that the user can set, calculate with, and plot all the major variables in the program. The program author writes only the scientific part of the program; BASIS9.4 supplies an environment in which to exercise that scientific programming which includes an interactive language, an interpreter, graphics, terminal logs, error recovery, macros, saving and retrieving variables, formatted I/O, and online documentation.

  20. Authorization basis requirements comparison report

    SciTech Connect (OSTI)

    Brantley, W.M.

    1997-08-18

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  1. Hanford Generic Interim Safety Basis

    SciTech Connect (OSTI)

    Lavender, J.C.

    1994-09-09

    The purpose of this document is to identify WHC programs and requirements that are an integral part of the authorization basis for nuclear facilities that are generic to all WHC-managed facilities. The purpose of these programs is to implement the DOE Orders, as WHC becomes contractually obligated to implement them. The Hanford Generic ISB focuses on the institutional controls and safety requirements identified in DOE Order 5480.23, Nuclear Safety Analysis Reports.

  2. Safety Basis Information System | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This report provides a list of all DOE nuclear facilities with the safety basis status, hazard categorization, and safety basis type. ...

  3. OpenEIS Algorithms

    Energy Science and Technology Software Center (OSTI)

    2013-07-29

    The OpenEIS Algorithm package seeks to provide a low-risk path for building owners, service providers and managers to explore analytical methods for improving building control and operational efficiency. Users of this software can analyze building data, and learn how commercial implementations would provide long-term value. The code also serves as a reference implementation for developers who wish to adapt the algorithms for use in commercial tools or service offerings.

  4. OSR encapsulation basis -- 100-KW

    SciTech Connect (OSTI)

    Meichle, R.H.

    1995-01-27

    The purpose of this report is to provide the basis for a change in the Operations Safety Requirement (OSR) encapsulated fuel storage requirements in the 105 KW fuel storage basin which will permit the handling and storing of encapsulated fuel in canisters which no longer have a water-free space in the top of the canister. The scope of this report is limited to providing the change from the perspective of the safety envelope (bases) of the Safety Analysis Report (SAR) and Operations Safety Requirements (OSR). It does not change the encapsulation process itself.

  5. Scalable Methods for Electronic Excitations and Optical Responses in Nanostructures: Mathematics to Algorithms to Observables

    SciTech Connect (OSTI)

    James R. Chelikowsky

    2009-03-31

    The work reported here took place at the University of Minnesota from September 15, 2003 to November 14, 2005. This funding resulted in 10 invited articles or book chapters, 37 articles in refereed journals and 13 invited talks. The funding helped train 5 PhD students. The research supported by this grant focused on developing theoretical methods for predicting and understanding the properties of matter at the nanoscale. Within this regime, new phenomena occur that are characteristic of neither the atomic limit, nor the crystalline limit. Moreover, this regime is crucial for understanding the emergence of macroscopic properties such as ferromagnetism. For example, elemental Fe clusters possess magnetic moments that reside between the atomic and crystalline limits, but the transition from the atomic to the crystalline limit is not a simple interpolation between the two size regimes. To capitalize properly on predicting such phenomena in this transition regime, a deeper understanding of the electronic, magnetic and structural properties of matter is required, e.g., electron correlation effects are enhanced within this size regime and the surface of a confined system must be explicitly included. A key element of our research involved the construction of new algorithms to address problems peculiar to the nanoscale. Typically, one would like to consider systems with thousands of atoms or more, e.g., a silicon nanocrystal that is 7 nm in diameter would contain over 10,000 atoms. Previous ab initio methods could address systems with hundreds of atoms whereas empirical methods can routinely handle hundreds of thousands of atoms (or more). However, these empirical methods often rely on ad hoc assumptions and lack incorporation of structural and electronic degrees of freedom. The key theoretical ingredients in our work involved the use of ab initio pseudopotentials and density functional approaches. 
The key numerical ingredients involved the implementation of algorithms for solving the Kohn-Sham equation without the use of an explicit basis, i.e., a real space grid. We invented algorithms for a solution of the Kohn-Sham equation based on Chebyshev 'subspace filtering'. Our filtering algorithms dramatically enhanced our ability to explore systems with thousands of atoms, i.e., we examined silicon quantum dots with approximately 11,000 atoms (or 40,000 electrons). We applied this algorithm to a number of nanoscale systems to examine the role of quantum confinement on electronic and magnetic properties: (1) Doping of nanocrystals and nanowires, including both magnetic and non-magnetic dopants and the role of self-purification; (2) Optical excitations and electronic properties of nanocrystals; (3) Intrinsic defects in nanostructures; and (4) The emergence of ferromagnetism from atoms to crystals.
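    Chebyshev subspace filtering of the kind described above can be sketched for a small dense symmetric matrix: a Chebyshev polynomial in H damps the unwanted part of the spectrum and amplifies the low-lying states, and a Rayleigh-Ritz step extracts the eigenpairs. The toy "Hamiltonian", interval bounds, and polynomial degree below are invented for illustration; a production code works matrix-free on a real-space grid.

```python
import numpy as np

def chebyshev_filter(H, X, degree, a, b):
    """Apply a Chebyshev polynomial in H, scaled so the unwanted interval
    [a, b] is damped while eigenvalues below a are strongly amplified."""
    e = (b - a) / 2.0            # half-width of the damped interval
    c = (b + a) / 2.0            # its center
    Y = (H @ X - c * X) / e      # degree-1 term of the recurrence
    X_prev = X
    for _ in range(2, degree + 1):        # three-term Chebyshev recurrence
        Y_next = 2.0 * (H @ Y - c * Y) / e - X_prev
        X_prev, Y = Y, Y_next
    return Y

def filtered_step(H, X, degree, a, b):
    """Filter, orthonormalize, then Rayleigh-Ritz in the filtered subspace."""
    Q, _ = np.linalg.qr(chebyshev_filter(H, X, degree, a, b))
    w, V = np.linalg.eigh(Q.T @ H @ Q)
    return Q @ V, w

# Toy symmetric "Hamiltonian" with four low-lying states below a gap.
rng = np.random.default_rng(0)
Qr, _ = np.linalg.qr(rng.standard_normal((200, 200)))
d = np.concatenate(([-10.0, -9.0, -8.0, -7.0], np.linspace(0.0, 5.0, 196)))
H = Qr @ np.diag(d) @ Qr.T

X = rng.standard_normal((200, 4))
for _ in range(10):
    X, w = filtered_step(H, X, degree=8, a=-3.0, b=5.0)  # damp [-3, 5]
```

    Each iteration multiplies the wanted components by the large Chebyshev growth outside [a, b] while the rest stay bounded by one, so only a handful of filter-plus-Ritz steps are needed.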

  6. Beyond Design Basis Events | Department of Energy

    Energy Savers [EERE]

    Beyond Design Basis Events Following the March 2011 Fukushima Daiichi nuclear plant accident in Japan, DOE embarked upon several initiatives to investigate the safety posture of its nuclear facilities relative to beyond design basis events (BDBEs). These initiatives included issuing Safety Bulletin 2011-01, Events Beyond Design Safety Basis Analysis, and conducting two DOE nuclear safety workshops. DOE also issued two reports documenting the ...

  7. Internal dosimetry technical basis manual

    SciTech Connect (OSTI)

    Not Available

    1990-12-20

    The internal dosimetry program at the Savannah River Site (SRS) consists of radiation protection programs and activities used to detect and evaluate intakes of radioactive material by radiation workers. Examples of such programs are: air monitoring; surface contamination monitoring; personal contamination surveys; radiobioassay; and dose assessment. The objectives of the internal dosimetry program are to demonstrate that the workplace is under control and that workers are not being exposed to radioactive material, and to detect and assess inadvertent intakes in the workplace. The Savannah River Site Internal Dosimetry Technical Basis Manual (TBM) is intended to provide a technical and philosophical discussion of the radiobioassay and dose assessment aspects of the internal dosimetry program. Detailed information on air, surface, and personal contamination surveillance programs is not given in this manual except for how these programs interface with routine and special bioassay programs.

  8. New Effective Multithreaded Matching Algorithms

    SciTech Connect (OSTI)

    Manne, Fredrik; Halappanavar, Mahantesh

    2014-05-19

    Matching is an important combinatorial problem with a number of applications in areas such as community detection, sparse linear algebra, and network alignment. Since computing optimal matchings can be very time consuming, several fast approximation algorithms, both sequential and parallel, have been suggested. Common to the algorithms giving the best solutions is that they tend to be sequential by nature, while algorithms more suitable for parallel computation give solutions of lower quality. We present a new simple 1/2-approximation algorithm for the weighted matching problem. This algorithm is both faster than any other suggested sequential 1/2-approximation algorithm on almost all inputs and also scales better than previous multithreaded algorithms. We further extend this to a general scalable multithreaded algorithm that computes matchings of weight comparable with the best sequential algorithms. The performance of the suggested algorithms is documented through extensive experiments on different multithreaded architectures.
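    For context, the classic greedy 1/2-approximation (scan edges by decreasing weight, keep an edge if both endpoints are still free) fits in a few lines. This baseline is inherently sequential; the paper's contribution is a different, parallel-friendly 1/2-approximation, which this sketch does not reproduce.

```python
def greedy_matching(edges):
    """Greedy 1/2-approximation for maximum weight matching: process edges
    in order of decreasing weight, keeping one if both endpoints are free."""
    matched = set()
    matching = []
    for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching.append((u, v, w))
    return matching

# Path a-b-c-d: greedy takes the heavy middle edge (weight 3), blocking the
# optimal matching {a-b, c-d} of weight 4 -- exactly the 1/2 worst case shape.
edges = [("a", "b", 2.0), ("b", "c", 3.0), ("c", "d", 2.0)]
result = greedy_matching(edges)
weight = sum(w for _, _, w in result)
```
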

  9. BASIS Set Exchange (BSE): Chemistry Basis Sets from the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) Basis Set Library

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Feller, D; Schuchardt, Karen L.; Didier, Brett T.; Elsethagen, Todd; Sun, Lisong; Gurumoorthi, Vidhya; Chase, Jared; Li, Jun

    The Basis Set Exchange (BSE) provides a web-based user interface for downloading and uploading Gaussian-type (GTO) basis sets, including effective core potentials (ECPs), from the EMSL Basis Set Library. It provides an improved user interface and capabilities over its predecessor, the EMSL Basis Set Order Form, for exploring the contents of the EMSL Basis Set Library. The popular Basis Set Order Form and underlying Basis Set Library were originally developed by Dr. David Feller and have been available from the EMSL webpages since 1994. BSE not only allows downloading of the more than 200 basis sets in various formats; it also allows users to annotate existing sets and to upload new sets. (Specialized Interface)

  10. Recent Theoretical Results for Advanced Thermoelectric Materials...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Theoretical Results for Advanced Thermoelectric Materials Recent Theoretical Results for Advanced Thermoelectric Materials Transport theory and first principles calculations ...

  11. Catalyst by Design - Theoretical, Nanostructural, and Experimental...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Oxidation Catalyst for Diesel Engine Emission Treatment Catalyst by Design - Theoretical, ... More Documents & Publications Catalyst by Design - Theoretical, Nanostructural, and ...

  12. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  13. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  14. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  15. Robotic Follow Algorithm

    Energy Science and Technology Software Center (OSTI)

    2005-03-30

    The Robotic Follow Algorithm enables any robotic vehicle to follow a moving target while reactively choosing a route around nearby obstacles. The robotic follow behavior can be used with different camera systems and with thermal or visual tracking, as well as other tracking methods such as radio frequency tags.

  16. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect (OSTI)

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
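    The logical-process view can be illustrated with a minimal sequential DEVS-style atomic model and event loop. The model and names below are hypothetical, and the rollback/antimessage machinery that makes a simulator Time Warp is deliberately omitted; this shows only the basic time-advance/output/transition cycle the article builds on.

```python
from dataclasses import dataclass

@dataclass
class Counter:
    """Toy DEVS atomic model: emits its count every time unit."""
    count: int = 0
    def time_advance(self):
        return 1.0               # ta: next internal event in 1 time unit
    def internal_transition(self):
        self.count += 1          # delta_int: advance the state
    def output(self):
        return self.count        # lambda: output emitted before delta_int

def simulate(model, end_time):
    """Sequential event loop; an optimistic (Time Warp) simulator adds
    state saving, rollback, and antimessages on top of this cycle."""
    t, outputs = 0.0, []
    while True:
        t_next = t + model.time_advance()
        if t_next > end_time:
            break
        outputs.append((t_next, model.output()))
        model.internal_transition()
        t = t_next
    return outputs

trace = simulate(Counter(), 3.5)
```
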

  17. Large scale tracking algorithms.

    SciTech Connect (OSTI)

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination, and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied toward detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  18. CCM Continuity Constraint Method: A finite-element computational fluid dynamics algorithm for incompressible Navier-Stokes fluid flows

    SciTech Connect (OSTI)

    Williams, P.T.

    1993-09-01

    As the field of computational fluid dynamics (CFD) continues to mature, algorithms are required to exploit the most recent advances in approximation theory, numerical mathematics, computing architectures, and hardware. Meeting this requirement is particularly challenging in incompressible fluid mechanics, where primitive-variable CFD formulations that are robust, while also accurate and efficient in three dimensions, remain an elusive goal. This dissertation asserts that one key to accomplishing this goal is recognition of the dual role assumed by the pressure, i.e., a mechanism for instantaneously enforcing conservation of mass and a force in the mechanical balance law for conservation of momentum. Proving this assertion has motivated the development of a new, primitive-variable, incompressible, CFD algorithm called the Continuity Constraint Method (CCM). The theoretical basis for the CCM consists of a finite-element spatial semi-discretization of a Galerkin weak statement, equal-order interpolation for all state-variables, a θ-implicit time-integration scheme, and a quasi-Newton iterative procedure extended by a Taylor Weak Statement (TWS) formulation for dispersion error control. Original contributions to algorithmic theory include: (a) formulation of the unsteady evolution of the divergence error, (b) investigation of the role of non-smoothness in the discretized continuity-constraint function, (c) development of a uniformly H{sup 1} Galerkin weak statement for the Reynolds-averaged Navier-Stokes pressure Poisson equation, (d) derivation of physically and numerically well-posed boundary conditions, and (e) investigation of sparse data structures and iterative methods for solving the matrix algebra statements generated by the algorithm.

  19. Property:ExplorationBasis | Open Energy Information

    Open Energy Info (EERE)

    Text Description Exploration Basis Why was exploration work conducted in this area (e.g., USGS report of a geothermal resource, hot springs with geothermometry indicating...

  20. Lightning Talks 2015: Theoretical Division

    SciTech Connect (OSTI)

    Shlachter, Jack S.

    2015-11-25

    This document is a compilation of slides from a number of student presentations given to LANL Theoretical Division members. The subjects cover the range of activities of the Division, including plasma physics, environmental issues, materials research, bacterial resistance to antibiotics, and computational methods.

  1. Dynamical properties of non-ideal plasma on the basis of effective potentials

    SciTech Connect (OSTI)

    Ramazanov, T. S.; Kodanova, S. K.; Moldabekov, Zh. A.; Issanova, M. K.

    2013-11-15

    In this work, stopping power has been calculated on the basis of the Coulomb logarithm using the effective potentials. Calculations of the Coulomb logarithm and stopping power for different interaction potentials and degrees of ionization are compared. The comparison with the data of other theoretical and experimental works was carried out.

  2. Theoretical Estimate of Maximum Possible Nuclear Explosion

    DOE R&D Accomplishments [OSTI]

    Bethe, H. A.

    1950-01-31

    The maximum nuclear accident which could occur in a Na-cooled, Be-moderated, Pu and power producing reactor is estimated theoretically. (T.R.H.) Results of nuclear calculations for a variety of compositions of fast, heterogeneous, sodium-cooled, U-235-fueled, plutonium- and power-producing reactors are reported. Core compositions typical of plate-, pin-, or wire-type fuel elements and with uranium as metal, alloy, and oxide were considered. These compositions included atom ratios in the following range: U-238 to U-235 from 2 to 8; sodium to U-235 from 1.5 to 12; iron to U-235 from 5 to 18; and vanadium to U-235 from 11 to 33. Calculations were performed to determine the effect of lead and iron reflectors between the core and blanket. Both natural and depleted uranium were evaluated as the blanket fertile material. Reactors were compared on a basis of conversion ratio, specific power, and the product of both. The calculated results are in general agreement with the experimental results from fast reactor assemblies. An analysis of the effect of new cross-section values as they became available is included. (auth)

  3. Safety Basis Information System | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Request Click on the above link to access the form to request access to the Safety Basis web interface. If you need assistance logging in, please contact AU UserSupport. Contact Nimi Rao...

  4. Basis for OUO | Department of Energy

    Energy Savers [EERE]

    OUO Basis for OUO What documents contain the policy foundation for the OUO program? DOE O 471.3 Admin Chg 1, Identifying and Protecting Official Use Only Information. DOE M 471.3-1 Admin Chg 1, Manual for Identifying and Protecting Official Use Only Information. Training & Reference Materials Basis for OUO OUO Review Requirement Access to OUO Protection of OUO Questions about Making OUO Determinations - OUO and the FOIA Exemptions How is a Document Containing OUO Marked? Controlled

  5. Nanoplasmonics simulations at the basis set limit through completeness-optimized, local numerical basis sets

    SciTech Connect (OSTI)

    Rossi, Tuomas P.; Sakko, Arto; Puska, Martti J.; Lehtola, Susi; Nieminen, Risto M.

    2015-03-07

    We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.

  6. Cubit Adaptive Meshing Algorithm Library

    Energy Science and Technology Software Center (OSTI)

    2004-09-01

    CAMAL (Cubit adaptive meshing algorithm library) is a software component library for mesh generation. CAMAL 2.0 includes components for triangle, quad, and tetrahedral meshing. A simple Application Programmers Interface (API) takes a discrete boundary definition, and CAMAL computes a quality interior unstructured grid. The triangle and quad algorithms may also import a geometric definition of a surface on which to define the grid. CAMAL’s triangle meshing uses a 3D space advancing-front method, the quad meshing algorithm is based upon Sandia’s patented paving algorithm, and the tetrahedral meshing algorithm employs the GHS3D-Tetmesh component developed by INRIA, France.

  7. Theoretical perspectives on strange physics

    SciTech Connect (OSTI)

    Ellis, J.

    1983-04-01

    Kaons are heavy enough to have an interesting range of decay modes available to them, and light enough to be produced in sufficient numbers to explore rare modes with satisfying statistics. Kaons and their decays have provided at least two major breakthroughs in our knowledge of fundamental physics. They have revealed to us CP violation, and their lack of flavor-changing neutral interactions warned us to expect charm. In addition, K⁰-anti-K⁰ mixing has provided us with one of our most elegant and sensitive laboratories for testing quantum mechanics. There is every reason to expect that future generations of kaon experiments with intense sources would add further to our knowledge of fundamental physics. This talk attempts to set future kaon experiments in a general theoretical context, and to indicate how they may bear upon fundamental theoretical issues. A survey of different experiments which could be done with an Intense Medium Energy Source of Strangeness is given, including rare K decays, probes of the nature of CP violation, μ decays, hyperon decays, and neutrino physics. (WHK)

  8. Arctic Mixed-Phase Cloud Properties from AERI Lidar Observations: Algorithm and Results from SHEBA

    SciTech Connect (OSTI)

    Turner, David D.

    2005-04-01

    A new approach to retrieve microphysical properties from mixed-phase Arctic clouds is presented. This mixed-phase cloud property retrieval algorithm (MIXCRA) retrieves cloud optical depth, ice fraction, and the effective radius of the water and ice particles from ground-based, high-resolution infrared radiance and lidar cloud boundary observations. The theoretical basis for this technique is that the absorption coefficient of ice is greater than that of liquid water from 10 to 13 μm, whereas liquid water is more absorbing than ice from 16 to 25 μm. MIXCRA retrievals are only valid for optically thin (τ_visible < 6) single-layer clouds when the precipitable water vapor is less than 1 cm. MIXCRA was applied to the Atmospheric Emitted Radiance Interferometer (AERI) data that were collected during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment from November 1997 to May 1998, where 63% of all of the cloudy scenes above the SHEBA site met this specification. The retrieval determined that approximately 48% of these clouds were mixed phase and that a significant number of clouds (during all 7 months) contained liquid water, even for cloud temperatures as low as 240 K. The retrieved distributions of effective radii for water and ice particles in single-phase clouds are shown to be different than the effective radii in mixed-phase clouds.
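The validity criteria stated above are simple enough to transcribe directly; the predicate below is a sketch with hypothetical argument names, encoding only the limits quoted in the abstract (visible optical depth below 6, single-layer cloud, precipitable water vapor below 1 cm).

```python
def mixcra_applicable(tau_visible, n_cloud_layers, pwv_cm):
    """Check the MIXCRA validity criteria quoted in the abstract.

    tau_visible    : visible cloud optical depth (must be < 6)
    n_cloud_layers : number of cloud layers (must be exactly 1)
    pwv_cm         : precipitable water vapor in cm (must be < 1)
    """
    return tau_visible < 6 and n_cloud_layers == 1 and pwv_cm < 1.0
```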

  9. Structural Basis for Activation of Cholera Toxin

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Structural Basis for Activation of Cholera Toxin. Wednesday, 30 November 2005. Cholera is a serious disease that claims thousands of victims each year in third-world, war-torn, and disaster-stricken nations. The culprit is the bacterium Vibrio cholerae, which can be ingested through contaminated food or water and colonizes the mucous membrane of the human small intestine. There, it secretes cholera toxin (CT), a protein whose A1 subunit

  10. Theoretical Nuclear Physics - Research - Cyclotron Institute

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical Nuclear Physics By addressing this elastic scattering indirect technique, we hope that more accurate measurements of elastic scattering data will provide very important astrophysical information. Progress toward understanding the structure and behavior of strongly interacting many-body systems requires detailed theoretical study. The theoretical physics program concentrates on the development of fundamental and phenomenological models of nuclear behavior. In some systems, the

  11. Recent Theoretical Results for Advanced Thermoelectric Materials |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Theoretical Results for Advanced Thermoelectric Materials Recent Theoretical Results for Advanced Thermoelectric Materials Transport theory and first principles calculations applied to oxides, chalcogenides and skutterudite show that transport functions, including the thermopower, can be directly calculated from the electronic structure PDF icon singh.pdf More Documents & Publications Recent Theoretical Results for Advanced Thermoelectric Materials Thermoelectric

  12. Optimized Algorithms Boost Combustion Research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Optimized Algorithms Boost Combustion Research Optimized Algorithms Boost Combustion Research Methane Flame Simulations Run 6x Faster on NERSC's Hopper Supercomputer November 25, 2014 Contact: Kathy Kincade, +1 510 495 2124, kkincade@lbl.gov Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from the Computational Research Division (CRD) at Lawrence Berkeley National

  13. CRAD, Facility Safety- Nuclear Facility Safety Basis

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) that can be used for assessment of a contractor's Nuclear Facility Safety Basis.

  14. Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2014-12-19

    This Standard describes a framework and the criteria to be used for approval of (1) safety basis documents, as required by 10 Code of Federal Regulation (C.F.R.) 830, Nuclear Safety Management, and (2) safety design basis documents, as required by Department of Energy (DOE) Standard (STD)-1189-2008, Integration of Safety into the Design Process.

  15. Research in Theoretical Particle Physics

    SciTech Connect (OSTI)

    Feldman, Hume A; Marfatia, Danny

    2014-09-24

    This document is the final report on activity supported under DOE Grant Number DE-FG02-13ER42024. The report covers the period July 15, 2013 to March 31, 2014. Faculty supported by the grant during the period were Danny Marfatia (1.0 FTE) and Hume Feldman (1% FTE). The grant partly supported University of Hawaii students, David Yaylali and Keita Fukushima, who are supervised by Jason Kumar. Both students are expected to graduate with Ph.D. degrees in 2014. Yaylali will be joining the University of Arizona theory group in Fall 2014 with a 3-year postdoctoral appointment under Keith Dienes. The group's research covered topics subsumed under the Energy Frontier, the Intensity Frontier, and the Cosmic Frontier. Many theoretical results related to the Standard Model and models of new physics were published during the reporting period. The report contains brief project descriptions in Section 1. Sections 2 and 3 list published and submitted work, respectively. Sections 4 and 5 summarize group activity including conferences, workshops, and professional presentations.

  16. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect (OSTI)

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well-developed methods such as first-order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time retaining a similar accuracy as the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high-order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems.
These tests show consistently good performance, both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15-20), alleviating a well-known limitation of the traditional approach. The prospect of larger-scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
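The projection step at the heart of NISP can be sketched in one dimension. The example below is a minimal illustration, not the FANISP algorithm: it computes PC coefficients of a function of a standard normal input by Gauss-Hermite quadrature, using the probabilists' Hermite polynomials He_k with E[He_k²] = k!; the adaptive sparse-grid and basis-selection machinery of the paper is omitted.

```python
import math

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def nisp_coefficients(f, order, n_quad=20):
    """1-D polynomial chaos coefficients of f(xi), xi ~ N(0, 1),
    by non-intrusive spectral projection:
        c_k = E[f(xi) He_k(xi)] / k!
    evaluated with Gauss-Hermite quadrature.
    """
    x, w = hermegauss(n_quad)        # probabilists' Hermite nodes/weights
    w = w / np.sqrt(2.0 * np.pi)     # weights now sum to 1 (a probability measure)
    coeffs = []
    for k in range(order + 1):
        he_k = hermeval(x, [0.0] * k + [1.0])   # He_k evaluated at the nodes
        coeffs.append(np.sum(w * f(x) * he_k) / math.factorial(k))
    return np.array(coeffs)
```

For f(ξ) = ξ², the exact expansion is He₀ + He₂ (since He₂ = ξ² − 1), so the routine should return coefficients (1, 0, 1, 0, …).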

  17. GPU Accelerated Event Detection Algorithm

    Energy Science and Technology Software Center (OSTI)

    2011-05-25

    Smart grid applications require new algorithmic approaches as well as parallel formulations. One of the critical components is the prediction of changes and detection of anomalies within the power grid. The state-of-the-art algorithms are not suited to handle the demands of streaming data analysis: (i) the need for event detection algorithms that can scale with the size of the data, (ii) the need for algorithms that can not only handle the multi-dimensional nature of the data, but also model both spatial and temporal dependencies in the data, which, for the most part, are highly nonlinear, and (iii) the need for algorithms that can operate in an online fashion with streaming data. The GAEDA code is a new online anomaly detection technique that takes into account the spatial, temporal, and multi-dimensional aspects of the data set. The basic idea behind the proposed approach is (a) to convert a multi-dimensional sequence into a univariate time series that captures the changes between successive windows extracted from the original sequence using singular value decomposition (SVD), and then (b) to apply known anomaly detection techniques for univariate time series. A key challenge for the proposed approach is to make the algorithm scalable to huge datasets by adopting techniques from perturbation theory and incremental SVD analysis. We used recent advances in tensor decomposition techniques, which reduce computational complexity, to monitor the change between successive windows and detect anomalies in the same manner as described above. We therefore propose to develop parallel solutions on many-core systems such as GPUs, because these algorithms involve many numerical operations and are highly data-parallelizable.
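Step (a) above, reducing a multi-dimensional sequence to a univariate change series via windowed SVD, can be sketched as follows. This is an illustration in the spirit of the abstract, not the GAEDA implementation: each window is summarized by its leading right singular vector, and the score at each step is the angle-based distance between the dominant subspaces of successive windows.

```python
import numpy as np

def change_score(X, win):
    """Reduce a (time x features) array X to a univariate change series.

    For each non-overlapping window of `win` rows, compute the leading
    right singular vector of the centered window; the score between
    consecutive windows is 1 - |cos(angle)| between their leading
    vectors (the abs() makes the score invariant to the SVD's sign
    ambiguity). Large scores flag changes in correlation structure.
    """
    scores = []
    prev = None
    for start in range(0, X.shape[0] - win + 1, win):
        W = X[start:start + win]
        # Leading right singular vector of the centered window captures
        # the dominant correlation structure.
        _, _, vt = np.linalg.svd(W - W.mean(axis=0), full_matrices=False)
        v = vt[0]
        if prev is not None:
            scores.append(1.0 - abs(np.dot(prev, v)))
        prev = v
    return np.array(scores)
```

A univariate detector (e.g. a simple threshold on mean plus a few standard deviations) can then be applied to the returned series, per step (b).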

  18. Radioactive Waste Management Basis, April 2006

    SciTech Connect (OSTI)

    Perkins, B K

    2011-08-31

    This Radioactive Waste Management Basis (RWMB) documents radioactive waste management practices adopted at Lawrence Livermore National Laboratory (LLNL) pursuant to Department of Energy (DOE) Order 435.1, Radioactive Waste Management. The purpose of this Radioactive Waste Management Basis is to describe the systematic approach for planning, executing, and evaluating the management of radioactive waste at LLNL. The implementation of this document will ensure that waste management activities at LLNL are conducted in compliance with the requirements of DOE Order 435.1, Radioactive Waste Management, and the Implementation Guide for DOE Manual 435.1-1, Radioactive Waste Management Manual. Technical justification is provided where methods for meeting the requirements of DOE Order 435.1 deviate from the DOE Manual 435.1-1 and Implementation Guide.

  19. TECHNICAL BASIS DOCUMENT FOR NATURAL EVENT HAZARDS

    SciTech Connect (OSTI)

    KRIPPS, L.J.

    2006-07-31

    This technical basis document was developed to support the documented safety analysis (DSA) and describes the risk binning process and the technical basis for assigning risk bins for natural event hazard (NEH)-initiated accidents. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous conditions based on an evaluation of the frequency and consequence. Note that the risk binning process is not applied to facility workers, because all facility worker hazardous conditions are considered for safety-significant SSCs and/or TSR-level controls.

  20. Structural Basis for Activation of Cholera Toxin

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Structural Basis for Activation of Cholera Toxin. Cholera is a serious disease that claims thousands of victims each year in third-world, war-torn, and disaster-stricken nations. The culprit is the bacterium Vibrio cholerae, which can be ingested through contaminated food or water and colonizes the mucous membrane of the human small intestine. There, it secretes cholera toxin (CT), a protein whose A1 subunit (CTA1) triggers a series of events culminating in the massive efflux of


  4. Design Basis Threat | National Nuclear Security Administration

    National Nuclear Security Administration (NNSA)



  6. Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents

    Energy Savers [EERE]

    SENSITIVE DOE-STD-1104-2014 December 2014 Superseding DOE-STD-1104-2009 DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS DOCUMENTS U.S. Department of Energy AREA SAFT Washington, DC 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. DOE-STD-1104-2014 i FOREWORD 1. This Standard describes a framework and the criteria to be used for approval of (1) safety basis documents, as required by 10 Code of Federal Regulation

  7. COLLOQUIUM: Theoretical and Experimental Aspects of Controlled...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    5:30pm MBG Auditorium COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum Dynamics Professor Herschel Rabitz Princeton University Abstract: PDF icon...

  8. Catalyst by Design - Theoretical, Nanostructural, and Experimental...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Emission Treatment Catalyst Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Emission Treatment Catalyst Poster presented at the 16th Directions in...

  9. 2005 American Conference on Theoretical Chemistry

    SciTech Connect (OSTI)

    Carter, Emily A

    2006-11-19

    The materials uploaded are meant to serve as final report on the funds provided by DOE-BES to help sponsor the 2005 American Conference on Theoretical Chemistry.

  10. The Three-Dimensional Structural Basis of Type II Hyperprolinemia...

    Office of Scientific and Technical Information (OSTI)

    The Three-Dimensional Structural Basis of Type II Hyperprolinemia Citation Details In-Document Search Title: The Three-Dimensional Structural Basis of Type II Hyperprolinemia Type ...

  11. Nuclear Safety Basis Program Review Overview and Management Oversight...

    Office of Environmental Management (EM)

    Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review Plan Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review ...

  12. Technical Cost Modeling - Life Cycle Analysis Basis for Program...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Polymer Composites Research in the LM ...

  13. ORISE: The Medical Basis for Radiation-Accident Preparedness...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The Medical Basis for Radiation-Accident Preparedness: Medical Management Proceedings of the Fifth International REACTS Symposium on the Medical Basis for Radiation-Accident...

  14. Real-time algorithm for robust coincidence search

    SciTech Connect (OSTI)

    Petrovic, T.; Vencelj, M.; Lipoglavsek, M.; Gajevic, J.; Pelicon, P.

    2012-10-20

    In in-beam γ-ray spectroscopy experiments, we often look for coincident detection events. Among every N events detected, coincidence search is naively of complexity O(N²). When we limit the width of the coincidence search window, the complexity can be reduced to O(N), permitting the implementation of the algorithm in real-time measurements, carried out indefinitely. We have built an algorithm to find simultaneous events between two detection channels. The algorithm was tested in an experiment where coincidences between X and γ rays detected in two HPGe detectors were observed in the decay of ⁶¹Cu. Functioning of the algorithm was validated by comparing the calculated experimental branching ratio for EC decay with theoretical calculations for 3 selected γ-ray energies of the ⁶¹Cu decay. Our research opened a question on the validity of the adopted value of the total angular momentum of the 656 keV state (J^π = 1/2⁻) in ⁶¹Ni.
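The window-limited O(N) search described above is essentially a two-pointer sweep over time-sorted event lists. The sketch below is an illustrative reconstruction of that idea, not the authors' code: because both channels are sorted, the start of the candidate range in the second channel only ever moves forward.

```python
def find_coincidences(t1, t2, window):
    """Find index pairs (i, j) with |t1[i] - t2[j]| <= window.

    Both t1 and t2 must be sorted ascending. The sweep advances each
    pointer monotonically, so the cost is O(N1 + N2 + matches) rather
    than the naive O(N^2) all-pairs comparison.
    """
    pairs = []
    j_start = 0
    for i, t in enumerate(t1):
        # Skip channel-2 events too early to ever match this or later events.
        while j_start < len(t2) and t2[j_start] < t - window:
            j_start += 1
        j = j_start
        # Collect all channel-2 events inside the coincidence window.
        while j < len(t2) and t2[j] <= t + window:
            pairs.append((i, j))
            j += 1
    return pairs
```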

  15. NDRPProtocolTechBasisCompiled020705.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Basis Document for the Neutron Dose Reconstruction Project NEUTRON DOSE RECONSTRUCTION PROTOCOL Roger B. Falk, Joe M. Aldrich, Jerry Follmer, Nancy M. Daugherty, and Dr. Duane E. Hilmas Oak Ridge Institute of Science and Education and Dr. Phillip L. Chapman Department of Statistics, Colorado State University February 7, 2005 ORISE 05-0199 This document was produced under contract number DE-AC05-00OR22750 between the U.S. Department of Energy and Oak Ridge Associated Universities The authors of

  16. Adaptive protection algorithm and system

    DOE Patents [OSTI]

    Hedrick, Paul (Pittsburgh, PA) [Pittsburgh, PA; Toms, Helen L. (Irwin, PA) [Irwin, PA; Miller, Roger M. (Mars, PA) [Mars, PA

    2009-04-28

    An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
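The trace-rank-set scheme described in the patent abstract can be illustrated with a toy radial distribution tree. This is a sketch of the general idea only, not the patented method; the tree layout, `load_amps`, and `margin` values are hypothetical.

```python
def rank_breakers(feeds, loads):
    """Rank each breaker by the number of loads it ultimately feeds.

    `feeds` maps each breaker to its downstream breakers/loads;
    `loads` is the set of leaf load names. The rank is found by
    tracing the flow of power down the tree.
    """
    def downstream_loads(node):
        if node in loads:
            return 1
        return sum(downstream_loads(child) for child in feeds.get(node, []))
    return {breaker: downstream_loads(breaker) for breaker in feeds}

def trip_set_points(ranks, load_amps=10.0, margin=1.25):
    """Derive a trip threshold from each breaker's rank: the expected
    through-current (rank x per-load current) times a safety margin."""
    return {b: r * load_amps * margin for b, r in ranks.items()}
```

A breaker feeding three loads thus trips at a higher current than a branch breaker feeding one, so a downstream fault opens the nearest breaker first.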

  17. Jet measurements at D0 using a KT algorithm

    SciTech Connect (OSTI)

    V.Daniel Elvira

    2002-10-03

    D0 has implemented and calibrated a k⊥ jet algorithm for the first time in a pp̄ collider. We present two results based on 1992-1996 data which were recently published: the subjet multiplicity in quark and gluon jets and the central inclusive jet cross section. The measured ratio between subjet multiplicities in gluon and quark jets is consistent with theoretical predictions and previous experimental values. NLO pQCD predictions of the k⊥ inclusive jet cross section agree with the D0 measurement, although marginally in the low p_T range. We also present a preliminary measurement of thrust cross sections, which indicates the need to include terms higher than α_s³ and resummation in the theoretical calculations.

  18. Theoretical Fusion Research | Princeton Plasma Physics Lab

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theory & Computational Department Weekly Highlights Weekly Seminars Basic Plasma Science Plasma Astrophysics Other Physics and Engineering Research PPPL Technical Reports NSTX-U Education Organization Contact Us Overview Experimental Fusion Research Theoretical Fusion Research Theory & Computational Department Weekly Highlights Weekly Seminars Basic Plasma Science Plasma Astrophysics Other Physics and Engineering Research PPPL Technical Reports NSTX-U Theoretical Fusion Research About

  19. Technical Basis for PNNL Beryllium Inventory

    SciTech Connect (OSTI)

    Johnson, Michelle Lynn

    2014-07-09

    The Department of Energy (DOE) issued Title 10 of the Code of Federal Regulations Part 850, “Chronic Beryllium Disease Prevention Program” (the Beryllium Rule) in 1999 and required full compliance by no later than January 7, 2002. The Beryllium Rule requires the development of a baseline beryllium inventory of the locations of beryllium operations and other locations of potential beryllium contamination at DOE facilities. The baseline beryllium inventory is also required to identify workers exposed or potentially exposed to beryllium at those locations. Prior to DOE issuing 10 CFR 850, Pacific Northwest National Laboratory (PNNL) had documented the beryllium characterization and worker exposure potential for multiple facilities in compliance with DOE’s 1997 Notice 440.1, “Interim Chronic Beryllium Disease.” After DOE’s issuance of 10 CFR 850, PNNL developed an implementation plan to be compliant by 2002. In 2014, an internal self-assessment (ITS #E-00748) of PNNL’s Chronic Beryllium Disease Prevention Program (CBDPP) identified several deficiencies. One deficiency is that the technical basis for establishing the baseline beryllium inventory when the Beryllium Rule was implemented was either not documented or not retrievable. In addition, the beryllium inventory itself had not been adequately documented and maintained since PNNL established its own CBDPP, separate from Hanford Site’s program. This document reconstructs PNNL’s baseline beryllium inventory as it would have existed when it achieved compliance with the Beryllium Rule in 2001 and provides the technical basis for the baseline beryllium inventory.

  20. algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and its Use in Coupling Codes for Multiphysics Simulations Rod Schmidt, Noel Belcourt, Russell Hooper, and Roger Pawlowski Sandia National Laboratories P.O. Box 5800...

  1. algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    of the vehicle. Here the two domains are the fluid exterior to the vehicle (compressible, turbulent fluid flow) and the interior of the vehicle (structural dynamics)...

  2. algorithms

    Office of Scientific and Technical Information (OSTI)

1 are estimated using the conventional MCMC (C-MCMC) with 60,000 model executions (red-solid lines), the linear, quadratic, and cubic surrogate systems with 9226, 4375, 3765...

  3. Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents

    Energy Savers [EERE]

SENSITIVE DOE-STD-1104-2009 May 2009 Superseding DOE-STD-1104-96 DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS DOCUMENTS U.S. Department of Energy AREA SAFT Washington, DC 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Available on the Department of Energy Technical Standards web page at http://www.hss.energy.gov/nuclearsafety/ns/techstds/

  4. Time Variant Floating Mean Counting Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-06-03

This software was written to test a time variant floating mean counting algorithm. The algorithm was developed by Westinghouse Savannah River Company, and a provisional patent has been filed on the algorithm. The test software was developed to work with the Val Tech model IVB prototype version II count rate meter hardware. The test software was used to verify that the algorithm developed by WSRC could be correctly implemented with the vendor's hardware.
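The record does not disclose the patented WSRC algorithm itself, but the general idea of a floating-mean count-rate meter can be sketched with an exponentially weighted moving average, where the time constant `tau` (an illustrative parameter, not taken from the record) controls how quickly the estimate tracks changes in the source:

```python
def floating_mean_rate(event_counts, dt, tau):
    """Exponentially weighted floating mean of a count rate.

    event_counts: counts observed in successive intervals of length dt (s)
    tau: time constant (s) controlling how quickly the mean adapts
    Returns the list of rate estimates (counts/s) after each interval.
    """
    alpha = dt / (tau + dt)          # weight given to the newest interval
    rate = 0.0
    estimates = []
    for c in event_counts:
        rate = (1.0 - alpha) * rate + alpha * (c / dt)
        estimates.append(rate)
    return estimates

# A steady source of 100 counts/s: the estimate converges toward 100.
est = floating_mean_rate([100] * 50, dt=1.0, tau=5.0)
```

A larger `tau` smooths statistical fluctuations at the cost of slower response to real rate changes, which is the basic trade-off any count-rate meter of this kind must balance.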

  5. A radial basis function Galerkin method for inhomogeneous nonlocal diffusion

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Lehoucq, Richard B.; Rowe, Stephen T.

    2016-02-01

    We introduce a discretization for a nonlocal diffusion problem using a localized basis of radial basis functions. The stiffness matrix entries are assembled by a special quadrature routine unique to the localized basis. Combining the quadrature method with the localized basis produces a well-conditioned, sparse, symmetric positive definite stiffness matrix. We demonstrate that both the continuum and discrete problems are well-posed and present numerical results for the convergence behavior of the radial basis function method. As a result, we explore approximating the solution to anisotropic differential equations by solving anisotropic nonlocal integral equations using the radial basis function method.
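As a toy illustration of radial basis function approximation (a collocation sketch with a globally supported Gaussian basis, not the paper's localized basis, its special quadrature, or its Galerkin formulation; the node count and shape parameter `eps` are arbitrary choices):

```python
import numpy as np

def rbf_interpolate(x_nodes, f_nodes, x_eval, eps=10.0):
    """Interpolate scattered 1-D data with Gaussian radial basis functions.

    Solves A c = f, where A[i, j] = exp(-(eps*|x_i - x_j|)^2), then
    evaluates s(x) = sum_j c_j * exp(-(eps*|x - x_j|)^2).
    """
    def phi(r):
        return np.exp(-(eps * r) ** 2)

    A = phi(np.abs(x_nodes[:, None] - x_nodes[None, :]))
    coeffs = np.linalg.solve(A, f_nodes)
    B = phi(np.abs(x_eval[:, None] - x_nodes[None, :]))
    return B @ coeffs

# Interpolate sin(2*pi*x) on 15 uniform nodes and evaluate off-node.
x = np.linspace(0.0, 1.0, 15)
f = np.sin(2.0 * np.pi * x)
s = rbf_interpolate(x, f, np.array([0.37]))
```

The dense, symmetric matrix `A` here is what the paper's localized basis and quadrature replace with a sparse, well-conditioned stiffness matrix.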

  6. Review and Approval of Nuclear Facility Safety Basis and Safety...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

104-2014, Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents. This Standard describes a framework and the criteria to be...

  7. Office of Nuclear Safety Basis and Facility Design

    Broader source: Energy.gov [DOE]

    The Office of Nuclear Safety Basis & Facility Design establishes safety basis and facility design requirements and expectations related to analysis and design of nuclear facilities to ensure protection of workers and the public from the hazards associated with nuclear operations.

  8. CRAD, Integrated Safety Basis and Engineering Design Review ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integrated Safety Basis and Engineering Design Review - August 20, 2014 (EA CRAD 31-4, Rev. 0) CRAD, Integrated Safety Basis and Engineering Design Review - August 20, 2014 (EA...

  9. Authorization basis status report (miscellaneous TWRS facilities, tanks and components)

    SciTech Connect (OSTI)

    Stickney, R.G.

    1998-04-29

    This report presents the results of a systematic evaluation conducted to identify miscellaneous TWRS facilities, tanks and components with potential needed authorization basis upgrades. It provides the Authorization Basis upgrade plan for those miscellaneous TWRS facilities, tanks and components identified.

  10. PARFUME Theory and Model basis Report

    SciTech Connect (OSTI)

Darrell L. Knudson; Gregory K. Miller; D.A. Petti; J.T. Maki

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  11. Kinetically balanced Gaussian basis-set approach to relativistic Compton profiles of atoms

    SciTech Connect (OSTI)

    Jaiswal, Prerit; Shukla, Alok

    2007-02-15

Atomic Compton profiles (CPs) are an important property that provides information about the momentum distribution of atomic electrons. Therefore, for CPs of heavy atoms, relativistic effects are expected to be important, warranting a relativistic treatment of the problem. In this paper, we present an efficient approach aimed at ab initio calculations of atomic CPs within a Dirac-Hartree-Fock (DHF) formalism, employing kinetically balanced Gaussian basis functions. The approach is used to compute the CPs of noble gases ranging from He to Rn, and the results have been compared to the experimental and other theoretical data, wherever possible. The influence of the quality of the basis set on the calculated CPs has also been systematically investigated.

  12. Assessing Beyond Design Basis Seismic Events and Implications on Seismic

    Office of Environmental Management (EM)

Assessing Beyond Design Basis Seismic Events and Implications on Seismic Risk | Department of Energy. September 19, 2012. Presenter: Jeffrey Kimball, Technical Specialist (Seismologist), Defense Nuclear Facilities Safety Board. Topics Covered: Department of Energy Approach to Natural Phenomena Hazards Analysis and Design (Seismic); Design Basis and Beyond Design Basis Seismic Events; Seismic Risk Implications - Key

  13. CRAD, Engineering Design and Safety Basis- December 22, 2009

    Broader source: Energy.gov [DOE]

    Engineering Design and Safety Basis Inspection Criteria, Inspection Activities, and Lines of Inquiry (HSS CRAD 64-19, Rev. 0)

  14. Nuclear Safety Basis Program Review Overview and Management Oversight

    Energy Savers [EERE]

Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review Plan | Department of Energy. This SRP, Nuclear Safety Basis Program Review, consists of five volumes. It provides information to help strengthen the technical rigor of line management oversight and federal monitoring of DOE nuclear facilities. It provides a primer on the safety basis development and

  15. Theoretical aspects of light meson spectroscopy

    SciTech Connect (OSTI)

Barnes, T.

    1995-12-31

    In this pedagogical review the authors discuss the theoretical understanding of light hadron spectroscopy in terms of QCD and the quark model. They begin with a summary of the known and surmised properties of QCD and confinement. Following this they review the nonrelativistic quark potential model for q{anti q} mesons and discuss the quarkonium spectrum and methods for identifying q{anti q} states. Finally, they review theoretical expectations for non-q{anti q} states (glueballs, hybrids and multiquark systems) and the status of experimental candidates for these states.

  16. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2010-01-01

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.
Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2. Revision Log: Rev. 0 (2/25/2005) Major revision and expansion. Rev. 0.1 (3/12/2007) Updated Chapters 5, 6 and 9 to reflect change in default ring calibration factor used in HEDP dose calculation software. Factor changed from 1.5 to 2.0 beginning January 1, 2007. Pages on which changes were made are as follows: 5.23, 5.69, 5.78, 5.80, 5.82, 6.3, 6.5, 6.29, and 9.2. Rev 0.2 (8/28/2009) Updated Chapters 3, 5, 6, 8 and 9. Chapters 6 and 8 were significantly expanded. References in the Preface and Chapters 1, 2, 4, and 7 were updated to reflect updates to DOE documents. Approved by HPDAC on 6/2/2009. Rev 1.0 (1/1/2010) Major revision. Updated all chapters to reflect the Hanford site wide implementation on January 1, 2010 of new DOE requirements for occupational radiation protection. The new requirements are given in the June 8, 2007 amendment to 10 CFR 835 Occupational Radiation Protection (Federal Register, June 8, 2007. Title 10 Part 835. U.S., Code of Federal Regulations, Vol. 72, No. 110, 31904-31941). Revision 1.0 to the manual replaces ICRP 26 dosimetry concepts and terminology with ICRP 60 dosimetry concepts and terminology and replaces external dose conversion factors from ICRP 51 with those from ICRP 74 for use in measurement of operational quantities with dosimeters. Descriptions of dose algorithms and dosimeter response characteristics, and field performance were updated to reflect changes in the neutron quality factors used in the measurement of operational quantities.

  17. Toward Catalyst Design from Theoretical Calculations (464th Brookhaven...

    Office of Scientific and Technical Information (OSTI)

    Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) Citation Details In-Document Search Title: Toward Catalyst Design from Theoretical Calculations...

  18. Theoretical investigations of two Si-based spintronic materials...

    Office of Scientific and Technical Information (OSTI)

    Conference: Theoretical investigations of two Si-based spintronic materials Citation Details In-Document Search Title: Theoretical investigations of two Si-based spintronic ...

  19. Toward Catalyst Design from Theoretical Calculations (464th Brookhaven...

    Office of Scientific and Technical Information (OSTI)

    Conference: Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) Citation Details In-Document Search Title: Toward Catalyst Design from Theoretical ...

  20. Catalysis by Design - Theoretical and Experimental Studies of...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx Treatment Catalysis by Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx ...

  1. Two-electron reduction of ethylene carbonate: theoretical review...

    Office of Scientific and Technical Information (OSTI)

    theoretical review of SEI formation mechanisms. Citation Details In-Document Search Title: Two-electron reduction of ethylene carbonate: theoretical review of SEI formation ...

  2. EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY...

    Office of Scientific and Technical Information (OSTI)

    EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY UNDER RESERVOIR CONDITIONS Citation Details In-Document Search Title: EXPERIMENTAL AND THEORETICAL DETERMINATION...

  3. Research in theoretical nuclear and neutrino physics. Final report...

    Office of Scientific and Technical Information (OSTI)

    Research in theoretical nuclear and neutrino physics. Final report Citation Details In-Document Search Title: Research in theoretical nuclear and neutrino physics. Final report The ...

  4. Research in theoretical nuclear and neutrino physics. Final report...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Research in theoretical nuclear and neutrino physics. Final report Citation Details In-Document Search Title: Research in theoretical nuclear and neutrino ...

  5. Operando Raman and Theoretical Vibration Spectroscopy of Non...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Operando Raman and Theoretical Vibration Spectroscopy of Non-PGM Catalysts Operando Raman and Theoretical Vibration Spectroscopy of Non-PGM Catalysts Presentation about...

  6. Efficient Theoretical Screening of Solid Sorbents for CO2 Capture...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Efficient Theoretical Screening of Solid Sorbents for CO2 Capture Applications* Citation Details In-Document Search Title: Efficient Theoretical Screening of Solid...

  7. Theoretical Synthesis of Mixed Materials for CO2 Capture Applications...

    Office of Scientific and Technical Information (OSTI)

    Theoretical Synthesis of Mixed Materials for CO2 Capture Applications Citation Details In-Document Search Title: Theoretical Synthesis of Mixed Materials for CO2 Capture...

  8. Student's algorithm solves real-world problem

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Supercomputing Challenge: student's algorithm solves real-world problem. Students learn how to use powerful computers to analyze, model, and solve real-world problems. April 3, 2012. Jordon Medlock of Albuquerque's Manzano High School won the 2012 Lab-sponsored Supercomputing Challenge by creating a computer algorithm that automates the process of

  9. Java implementation of Class Association Rule algorithms

    Energy Science and Technology Software Center (OSTI)

    2007-08-30

Java implementation of three Class Association Rule mining algorithms: NETCAR, CARapriori, and clustering-based rule mining. The NETCAR algorithm is a novel algorithm developed by Makio Tamura. The algorithm is discussed in a paper, UCRL-JRNL-232466-DRAFT, and would be published in a peer-reviewed scientific journal. The software is used to extract combinations of genes relevant to a phenotype from a phylogenetic profile and a phenotype profile. The phylogenetic profile is represented by a binary matrix and a phenotype profile is represented by a binary vector. The present application of this software will be in genome analysis; however, it could be applied more generally.
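The mining task described above, relating binary gene-presence columns to a binary phenotype vector, can be illustrated with a toy single-gene rule scorer (support/confidence thresholds are illustrative, and this is not the NETCAR algorithm, whose details are in UCRL-JRNL-232466):

```python
import numpy as np

def single_gene_rules(profiles, phenotype, min_support=0.3, min_conf=0.8):
    """Score rules 'gene present -> phenotype present' by support and
    confidence.  profiles: (n_organisms, n_genes) binary matrix;
    phenotype: (n_organisms,) binary vector.
    """
    n = profiles.shape[0]
    rules = []
    for g in range(profiles.shape[1]):
        both = np.sum((profiles[:, g] == 1) & (phenotype == 1))
        has_gene = np.sum(profiles[:, g] == 1)
        support = both / n
        confidence = both / has_gene if has_gene else 0.0
        if support >= min_support and confidence >= min_conf:
            rules.append((g, support, confidence))
    return rules

# Four organisms, two genes: gene 0 co-occurs perfectly with the phenotype.
profiles = np.array([[1, 0], [1, 1], [1, 0], [0, 1]])
phenotype = np.array([1, 1, 1, 0])
rules = single_gene_rules(profiles, phenotype)
```

Class association rule miners such as CARapriori extend this idea to conjunctions of genes, pruning the combinatorial search with the same support threshold.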

  10. Solar Position Algorithm (SPA) - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Thermal Solar Thermal Energy Analysis Energy Analysis Find More Like This Return to Search Solar Position Algorithm (SPA) National Renewable Energy Laboratory Contact NREL About ...

  11. A new paradigm for the molecular basis of rubber elasticity

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Hanson, David E.; Barber, John L.

    2015-02-19

The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly-symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumption that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains, that completely span the network along the strain axis, emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain.
Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.

  12. ORISE: The Medical Basis for Radiation-Accident Preparedness: Medical

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Management (Published by REAC/TS). The Medical Basis for Radiation-Accident Preparedness: Medical Management. Proceedings of the Fifth International REAC/TS Symposium on the Medical Basis for Radiation-Accident Preparedness and the Biodosimetry Workshop. As part of its mission to provide continuing education for personnel responsible for treating radiation injuries, REAC/TS hosted the Fifth International REAC/TS Symposium on the Medical Basis for Radiation-Accident Preparedness and

  13. Assessing Beyond Design Basis Seismic Events and Implications...

    Office of Environmental Management (EM)

    Defense Nuclear Facilities Safety Board Topics Covered: Department of Energy Approach to Natural Phenomena Hazards Analysis and Design (Seismic) Design Basis and Beyond Design...

  14. Enterprise Assessments Targeted Review of the Safety Basis at...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Targeted Review of the Safety Basis at the Savannah River Site F-Area Central Laboratory ... 5.7 Federal Review and Approval ...

  15. CRAD, Review of Safety Basis Development- January 31, 2013

    Broader source: Energy.gov [DOE]

    Review of Safety Basis Development for the Savannah River Site Salt Waste Processing Facility - Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-57, Rev. 0)

  16. Technical Planning Basis - DOE Directives, Delegations, and Requiremen...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

2, Technical Planning Basis by David Freshwater. Functional areas: Defense Nuclear Facility Safety and Health Requirement, Safety and Security. The Guide assists DOE/NNSA field...

  17. Protocol for Enhanced Evaluations of Beyond Design Basis Events...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 Protocol for Enhanced Evaluations of Beyond Design...

  18. Online Monitoring Technical Basis and Analysis Framework for...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Online Monitoring Technical Basis and Analysis Framework for Emergency Diesel Generators-Interim Report for FY 2013 The Light Water Reactor Sustainability Program is a research, ...

  19. SRS FTF Section 3116 Basis for Determination | Department of Energy

    Energy Savers [EERE]

SRS FTF Section 3116 Basis for Determination. Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site. In accordance with NDAA Section 3116, certain waste from reprocessing of spent nuclear fuel is not high-level waste if the Secretary of Energy, in consultation with the NRC, determines that the criteria in NDAA Section 3116(a) are met. This FTF 3116 Basis Document shows that those criteria are satisfied, to support a

  20. Petascale algorithms for reactor hydrodynamics.

    SciTech Connect (OSTI)

    Fischer, P.; Lottes, J.; Pointer, W. D.; Siegel, A.

    2008-01-01

We describe recent algorithmic developments that have enabled large eddy simulations of reactor flows on up to P = 65,000 processors on the IBM BG/P at the Argonne Leadership Computing Facility. Petascale computing is expected to play a pivotal role in the design and analysis of next-generation nuclear reactors. Argonne's SHARP project is focused on advanced reactor simulation, with a current emphasis on modeling coupled neutronics and thermal-hydraulics (TH). The TH modeling comprises a hierarchy of computational fluid dynamics approaches ranging from detailed turbulence computations, using DNS (direct numerical simulation) and LES (large eddy simulation), to full core analysis based on RANS (Reynolds-averaged Navier-Stokes) and subchannel models. Our initial study is focused on LES of sodium-cooled fast reactor cores. The aim is to leverage petascale platforms at DOE's Leadership Computing Facilities (LCFs) to provide detailed information about heat transfer within the core and to provide baseline data for less expensive RANS and subchannel models.

  1. Final Report: Sublinear Algorithms for In-situ and In-transit Data Analysis at Exascale.

    SciTech Connect (OSTI)

    Bennett, Janine Camille; Pinar, Ali; Seshadhri, C.; Thompson, David; Salloum, Maher; Bhagatwala, Ankit; Chen, Jacqueline H.

    2015-09-01

Post-Moore's law scaling is creating a disruptive shift in simulation workflows, as saving the entirety of raw data to persistent storage becomes expensive. We are moving away from a post-process centric data analysis paradigm towards a concurrent analysis framework, in which raw simulation data is processed as it is computed. Algorithms must adapt to machines with extreme concurrency, low communication bandwidth, and high memory latency, while operating within the time constraints prescribed by the simulation. Furthermore, input parameters are often data dependent and cannot always be prescribed. The study of sublinear algorithms is a recent development in theoretical computer science and discrete mathematics that has significant potential to provide solutions for these challenges. The approaches of sublinear algorithms address the fundamental mathematical problem of understanding global features of a data set using limited resources. These theoretical ideas align with practical challenges of in-situ and in-transit computation where vast amounts of data must be processed under severe communication and memory constraints. This report details key advancements made in applying sublinear algorithms in-situ to identify features of interest and to enable adaptive workflows over the course of a three year LDRD. Prior to this LDRD, there was no precedent in applying sublinear techniques to large-scale, physics based simulations. This project has definitively demonstrated their efficacy at mitigating high performance computing challenges and highlighted the rich potential for follow-on research opportunities in this space.
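As one concrete example of the sublinear flavor of algorithm the report refers to (a generic streaming technique, not necessarily one the project used), reservoir sampling maintains a uniform random sample of a data stream in O(k) memory, independent of the stream's length:

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform random sample of k items from a stream of unknown
    length using O(k) memory: item i replaces a random reservoir slot
    with probability k / (i + 1)."""
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)   # uniform over positions 0..i
            if j < k:
                reservoir[j] = item
    return reservoir

# Sample 5 items from a million-element stream without storing it.
sample = reservoir_sample(range(10**6), 5)
```

The same one-pass, bounded-memory discipline is what makes such techniques attractive for in-situ analysis, where the simulation's raw output cannot be revisited.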

  2. Initial borehole acoustic televiewer data processing algorithms

    SciTech Connect (OSTI)

    Moore, T.K.

    1988-06-01

    With the development of a new digital televiewer, several algorithms have been developed in support of off-line data processing. This report describes the initial set of utilities developed to support data handling as well as data display. Functional descriptions, implementation details, and instructions for use of the seven algorithms are provided. 5 refs., 33 figs., 1 tab.

  3. Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Department of Energy. DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation (lm001_das_2011_o.pdf). More Documents & Publications: Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus; Multi-Material Joining: Challenges and Opportunities

  4. Theoretical, Methodological, and Empirical Approaches to Cost Savings: A Compendium

    SciTech Connect (OSTI)

    M Weimar

    1998-12-10

This publication summarizes and contains the original documentation for understanding why the U.S. Department of Energy's (DOE's) privatization approach provides cost savings and the different approaches that could be used in calculating cost savings for the Tank Waste Remediation System (TWRS) Phase I contract. The initial section summarizes the approaches in the different papers. The appendices are the individual source papers which have been reviewed by individuals outside of the Pacific Northwest National Laboratory and the TWRS Program. Appendix A provides a theoretical basis for and estimate of the level of savings that can be obtained from a fixed-price contract with performance risk maintained by the contractor. Appendix B provides the methodology for determining cost savings when comparing a fixed-price contractor with a Management and Operations (M&O) contractor (cost-plus contractor). Appendix C summarizes the economic model used to calculate cost savings and provides hypothetical output from preliminary calculations. Appendix D provides the summary of the approach for the DOE-Richland Operations Office (RL) estimate of the M&O contractor to perform the same work as BNFL Inc. Appendix E contains information on cost growth and per metric ton of glass costs for high-level waste at two other DOE sites, West Valley and Savannah River. Appendix F addresses a risk allocation analysis of the BNFL proposal that indicates that the current approach is still better than the alternative.

  5. The double-beta decay: Theoretical challenges

    SciTech Connect (OSTI)

    Horoi, Mihai

    2012-11-20

Neutrinoless double beta decay is a unique process that could reveal physics beyond the Standard Model of particle physics: namely, if observed, it would prove that neutrinos are Majorana particles. In addition, it could provide information regarding the neutrino masses and their hierarchy, provided that reliable nuclear matrix elements can be obtained. The two-neutrino double beta decay is an associated process that is allowed by the Standard Model, and it has been observed for about ten nuclei. The present contribution gives a brief review of the theoretical challenges associated with these two processes, emphasizing the reliable calculation of the associated nuclear matrix elements.

  6. Theoretical Screening of Mixed Solid Sorbent for

    Office of Scientific and Technical Information (OSTI)

Extended Abstract of 2014 AIChE Spring Meeting, New Orleans, LA, Mar. 30-Apr. 02, 2014. Theoretical Screening of Mixed Solid Sorbent for Applications to CO2 Capture Technology. Yuhua Duan, National Energy Technology Laboratory, United States Department of Energy, Pittsburgh, Pennsylvania 15236, USA. Abstract: Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for development of new materials

  7. Structural basis for the antibody neutralization of Herpes simplex virus

    Office of Scientific and Technical Information (OSTI)

Structural basis for the antibody neutralization of Herpes simplex virus (Journal Article) | SciTech Connect. The gD-E317-Fab complex crystal revealed the conformational epitope of human mAb E317 on HSV gD, providing a molecular basis for understanding the viral neutralization mechanism. Glycoprotein D (gD) of Herpes simplex virus (HSV) binds to a host cell surface receptor,

  8. THEORETICAL STUDIES OF HADRONS AND NUCLEI

    SciTech Connect (OSTI)

Stephen R. Cotanch

    2007-03-20

This report details final research results obtained during the 9-year period from June 1, 1997 through July 15, 2006. The research project, entitled "Theoretical Studies of Hadrons and Nuclei", was supported by grant DE-FG02-97ER41048 between North Carolina State University [NCSU] and the U.S. Department of Energy [DOE]. In compliance with grant requirements the Principal Investigator [PI], Professor Stephen R. Cotanch, conducted a theoretical research program investigating hadrons and nuclei and devoted to this program 50% of his time during the academic year and 100% of his time in the summer. Highlights of new, significant research results are briefly summarized in the following three sections corresponding to the respective sub-programs of this project (hadron structure, probing hadrons and hadron systems electromagnetically, and many-body studies). Recent progress is also discussed in a recent renewal/supplemental grant proposal submitted to DOE. Finally, full detailed descriptions of completed work can be found in the publications listed at the end of this report.

  9. Advanced Imaging Algorithms for Radiation Imaging Systems

    SciTech Connect (OSTI)

    Marleau, Peter

    2015-10-01

    The intent of the proposed work, in collaboration with the University of Michigan, is to develop the algorithms that will bring the analysis from qualitative images to quantitative attributes of objects containing SNM. The first step to achieving this is to develop an in-depth understanding of the intrinsic errors associated with the deconvolution and MLEM algorithms. A significant new effort will be undertaken to relate the image data to a posited three-dimensional model of geometric primitives that can be adjusted to get the best fit. In this way, parameters of the model such as sizes, shapes, and masses can be extracted for both radioactive and non-radioactive materials. This model-based algorithm will need the integrated response of a hypothesized configuration of material to be calculated many times. As such, both the MLEM and the model-based algorithm require significant increases in calculation speed in order to converge to solutions in practical amounts of time.
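
    The MLEM step referred to above can be sketched in its standard textbook form. This is a generic illustration under assumed Poisson counting statistics, not the Sandia/Michigan implementation; the system matrix `A`, data `y`, and iteration count are made up for the example.

```python
import numpy as np

def mlem(y, A, n_iter=200):
    """Maximum-likelihood expectation-maximization reconstruction.
    y: measured counts per detector bin (m,); A: system matrix (m, n)
    mapping source voxels to expected detector counts."""
    x = np.ones(A.shape[1])                  # flat nonnegative start
    sens = A.sum(axis=0)                     # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x                         # forward projection
        ratio = y / np.maximum(proj, 1e-12)  # measured / predicted
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

    The multiplicative update preserves nonnegativity, which is one reason MLEM is popular for emission imaging; its cost per iteration is dominated by the two projections, which is where the speedups discussed above matter.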

  10. Advanced CHP Control Algorithms: Scope Specification

    SciTech Connect (OSTI)

    Katipamula, Srinivas; Brambley, Michael R.

    2006-04-28

    The primary objective of this multiyear project is to develop algorithms for combined heat and power systems to ensure optimal performance, increase reliability, and lead to the goal of clean, efficient, reliable and affordable next generation energy systems.

  11. Tracking Algorithm for Multi- Dimensional Array Transposition

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Yun (Helen) He, SC2002: MPI and OpenMP Paradigms on Clusters of SMP Architectures: the Vacancy Tracking Algorithm for Multi-Dimensional Array Transposition...

  12. Drainage Algorithm for Geospatial Knowledge

    Energy Science and Technology Software Center (OSTI)

    2006-08-15

    The Pacific Northwest National Laboratory (PNNL) has developed a prototype stream extraction algorithm that semi-automatically extracts and characterizes streams using a variety of multisensor imagery and digital terrain elevation data (DTED). The system is currently optimized for three types of single-band imagery: radar, visible, and thermal. Method of solution: DRAGON (1) classifies pixels into clumps of water objects based on the classification of water pixels by spectral signatures and neighborhood relationships, (2) uses the morphology operations (erosion and dilation) to separate out large lakes (or embayments), isolated lakes, ponds, wide rivers, and narrow rivers, and (3) translates the river objects into vector objects. In detail, the process can be broken down into the following steps.
    A. Water pixels are initially identified based on the extent range and slope values (if an optional DEM file is available).
    B. Erode to the distance that defines a large water body and then dilate back. The resulting mask can be used to identify large lake and embayment objects, which are then removed from the image. Since this operation can be time consuming, it is only performed if a simple test (i.e., a large box containing only water pixels can be found somewhere in the image) indicates a large water body is present.
    C. All water pixels are 'clumped' (in Imagine terminology, clumping connects touching pixels of a common classification), and clumps that do not contain pure water pixels (e.g., dark cloud shadows) are removed.
    D. The resulting true water pixels are clumped, and water objects that are too small (e.g., ponds) or isolated lakes (i.e., isolated objects with a small compactness ratio) are removed. Note that at this point lakes have been identified as a byproduct of the filtering process and can be output as vector layers if needed.
    E. At this point only river pixels are left in the image. To separate out wide rivers, all objects in the image are eroded by the half width of narrow rivers. This removes all narrow rivers and leaves only the core of wide rivers. This core is dilated out by the same distance to create a mask that is used with the original river image to split the rivers into two separate images of narrow rivers and wide rivers.
    F. If the image that contains wide rivers has small, isolated, short segments (less than 300 meters if NGA criteria are used), these segments are transferred to the narrow-river file in order to be treated as parts of single-line rivers.
    G. The narrow-river file is optionally dilated and eroded. This 'closing' removes small islands, fills small gaps, and smooths the outline.
    H. The user also has the option of 'closing' objects in the wide-river file; this depends on the degree to which the user wants to remove small islands in the large rivers.
    I. To make the translation from raster to single vector easier, the objects in the narrow-river image are reduced to a single center line (i.e., thinned) with binary morphology operations.
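
    The erode-then-dilate separation in step E can be sketched with standard binary morphology. This is a generic illustration using scipy.ndimage, not the DRAGON code; the function name, mask, and half-width are invented for the example.

```python
import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

def split_rivers(water, half_width=2):
    """Erode by the narrow-river half width (removing narrow rivers entirely),
    dilate the surviving cores back out, and use the result as a mask to split
    the binary water image into wide-river and narrow-river images."""
    structure = np.ones((3, 3), dtype=bool)
    core = binary_erosion(water, structure, iterations=half_width)
    wide = binary_dilation(core, structure, iterations=half_width) & water
    narrow = water & ~wide
    return wide, narrow
```

    Features thinner than twice the erosion distance vanish under erosion and never come back under dilation, which is exactly why the dilated core isolates only the wide rivers.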

  13. CRAD, Review of Safety Basis Development- October 11, 2012

    Broader source: Energy.gov [DOE]

    Review of Safety Basis Development for the Y-12 National Security Complex Uranium Processing Facility Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-55, Rev. 0)

  14. Basis for Section 3116 Determination for Salt Waste Disposal...

    Office of Environmental Management (EM)

    WD-2005-001, January 2006: Basis for Section 3116 Determination for Salt Waste Disposal at ... 4.0 THE WASTE DOES NOT REQUIRE PERMANENT ISOLATION IN A ...

  15. Structural basis for the antibody neutralization of Herpes simplex...

    Office of Scientific and Technical Information (OSTI)

    of Herpes simplex virus Citation Details In-Document Search Title: Structural basis for the antibody neutralization of Herpes simplex virus The gD-E317-Fab complex ...

  16. General Engineer/Physical Scientist (Safety Basis Engineer/Scientist)

    Broader source: Energy.gov [DOE]

    A successful candidate in this position will serve as an authority in the safety basis functional area. The incumbent is responsible for managing, coordinating, and authorizing work in the context...

  17. Advanced Test Reactor Design Basis Reconstitution Project Issue Resolution Process

    SciTech Connect (OSTI)

    Steven D. Winter; Gregg L. Sharp; William E. Kohn; Richard T. McCracken

    2007-05-01

    The Advanced Test Reactor (ATR) Design Basis Reconstitution Program (DBRP) is a structured assessment and reconstitution of the design basis for the ATR. The DBRP is designed to establish and document the ties between the Documented Safety Analysis (DSA), design basis, and actual system configurations. Where the DBRP assessment team cannot establish a link between these three major elements, a gap is identified. Resolutions to identified gaps represent configuration management and design basis recovery actions. This paper discusses the process being applied to define, evaluate, report, and address gaps that are identified through the ATR DBRP. Design basis verification may be performed or required for a nuclear facility safety basis on various levels. The process is applicable to large-scale design basis reconstitution efforts, such as the ATR DBRP, or may be scaled for application on smaller projects. The concepts are applicable to long-term maintenance of a nuclear facility safety basis and recovery of degraded safety basis components. The ATR DBRP assessment team has observed numerous examples where a clear and accurate link between the DSA, design basis, and actual system configuration was not immediately identifiable in supporting documentation. As a result, a systematic approach to effectively document, prioritize, and evaluate each observation is required. The DBRP issue resolution process provides direction for consistent identification, documentation, categorization, and evaluation, and, where applicable, entry into the determination process for a potential inadequacy in the safety analysis (PISA). The issue resolution process is a key element for execution of the DBRP. Application of the process facilitates collection, assessment, and reporting of issues identified by the DBRP team, and results in an organized database of safety basis gaps and prioritized corrective action planning and resolution.
    The DBRP team follows the ATR DBRP issue resolution process, which provides a method for the team to promptly sort and prioritize questions and issues between those that can be addressed as a normal part of the reconstitution project and those that are to be handled as PISAs. Presentation of the DBRP issue resolution process provides an example for similar activities that may be required at other facilities within the Department of Energy complex.

  18. Complex basis functions revisited: Implementation with applications to

    Office of Scientific and Technical Information (OSTI)

    carbon tetrafluoride and aromatic N-containing heterocycles within the static-exchange approximation (Journal Article) | DOE PAGES Complex basis functions revisited: Implementation with applications to carbon tetrafluoride and aromatic N-containing heterocycles within the static-exchange approximation Title: Complex basis functions revisited: Implementation with applications to carbon tetrafluoride and aromatic N-containing heterocycles within the static-exchange

  19. Structural basis for Tetrahymena telomerase processivity factor Teb1

    Office of Scientific and Technical Information (OSTI)

    binding to single-stranded telomeric-repeat DNA (Journal Article) | SciTech Connect SciTech Connect Search Results Journal Article: Structural basis for Tetrahymena telomerase processivity factor Teb1 binding to single-stranded telomeric-repeat DNA Citation Details In-Document Search Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 binding to single-stranded telomeric-repeat DNA Authors: Zeng, Zhixiong ; Min, Bosun ; Huang, Jing ; Hong, Kyungah ; Yang, Yuting ;

  20. Structural basis for substrate specificity in the Escherichia coli maltose

    Office of Scientific and Technical Information (OSTI)

    transport system (Journal Article) | SciTech Connect Structural basis for substrate specificity in the Escherichia coli maltose transport system Citation Details In-Document Search Title: Structural basis for substrate specificity in the Escherichia coli maltose transport system Authors: Oldham, Michael L. ; Chen, Shanshuang ; Chen, Jue (Purdue; HHMI). Publication Date: 2013-11-11 OSTI Identifier: 1105053 Resource Type: Journal Article Resource

  1. Atomic substitution reveals the structural basis for substrate adenine

    Office of Scientific and Technical Information (OSTI)

    recognition and removal by adenine DNA glycosylase (Journal Article) | SciTech Connect Atomic substitution reveals the structural basis for substrate adenine recognition and removal by adenine DNA glycosylase Citation Details In-Document Search Title: Atomic substitution reveals the structural basis for substrate adenine recognition and removal by adenine DNA glycosylase Adenine DNA glycosylase catalyzes the glycolytic removal of adenine from the promutagenic A {center_dot} oxoG base pair in

  2. Structural and Functional Basis for Inhibition of Erythrocyte Invasion by

    Office of Scientific and Technical Information (OSTI)

    Antibodies that Target Plasmodium falciparum EBA-175 (Journal Article) | SciTech Connect Journal Article: Structural and Functional Basis for Inhibition of Erythrocyte Invasion by Antibodies that Target Plasmodium falciparum EBA-175 Citation Details In-Document Search Title: Structural and Functional Basis for Inhibition of Erythrocyte Invasion by Antibodies that Target Plasmodium falciparum EBA-175 Authors: Chen, Edwin ; Paing, May M. ; Salinas, Nichole ; Sim, B. Kim Lee ; Tolia, Niraj H.

  3. Structural basis for biomolecular recognition in overlapping binding sites

    Office of Scientific and Technical Information (OSTI)

    in a diiron enzyme system (Journal Article) | SciTech Connect Structural basis for biomolecular recognition in overlapping binding sites in a diiron enzyme system Citation Details In-Document Search Title: Structural basis for biomolecular recognition in overlapping binding sites in a diiron enzyme system Authors: Acheson, Justin F. ; Bailey, Lucas J. ; Elsen, Nathaniel L. ; Fox, Brian G. (UW). Publication Date: 2016-01-22 OSTI Identifier: 1229904 Resource Type:

  4. Structural basis of JAZ repression of MYC transcription factors in

    Office of Scientific and Technical Information (OSTI)

    jasmonate signalling (Journal Article) | SciTech Connect Structural basis of JAZ repression of MYC transcription factors in jasmonate signalling Citation Details In-Document Search Title: Structural basis of JAZ repression of MYC transcription factors in jasmonate signalling Authors: Zhang, Feng ; Yao, Jian ; Ke, Jiyuan ; Zhang, Li ; Lam, Vinh Q. ; Xin, Xiu-Fang ; Zhou, X. Edward ; Chen, Jian ; Brunzelle, Joseph ; Griffin, Patrick R. ; Zhou, Mingguo ; Xu, H. Eric ; Melcher, Karsten ; He ,

  5. Beyond Design Basis Event Pilot Evaluations | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Beyond Design Basis Event Pilot Evaluations Beyond Design Basis Event Pilot Evaluations May 13, 2013 In the six months after the March 2011 Fukushima Daiichi nuclear power plant accident in Japan, the U.S. Department of Energy (DOE) took several actions to review the safety of its nuclear facilities and identify situations where near-term improvements could be made. These actions and recommendations were addressed in an August 2011 report to the Secretary of Energy, Review of Requirements and

  6. Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy 2 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting PDF icon lm001_das_2012_o.pdf More Documents & Publications Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Polymer Composites Research in the LM Materials Program Overview

  7. Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy 0 DOE Vehicle Technologies and Hydrogen Programs Annual Merit Review and Peer Evaluation Meeting, June 7-11, 2010 -- Washington D.C. PDF icon lm001_das_2010_o.pdf More Documents & Publications Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Life Cycle Modeling of Propulsion Materials

  8. New Design Methods and Algorithms for Multi-component Distillation...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Design Methods and Algorithms for Multi-component Distillation Processes New Design Methods and Algorithms for Multi-component Distillation Processes PDF icon multicomponent.pdf ...

  9. ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Conditions, March 2000 | Department of Energy Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 PDF icon theoretical_minimum_energies.pdf More Documents & Publications Ironmaking Process Alternatives Screening Study ITP Steel: Steel Industry Marginal Opportunity Study September 2005 ITP Steel: Steel Industry Energy Bandwidth Stu

  10. Theoretical crystallography with the Advanced Visualization System

    SciTech Connect (OSTI)

    Younkin, C.R.; Thornton, E.N.; Nicholas, J.B.; Jones, D.R.; Hess, A.C.

    1993-05-01

    Space is an Application Visualization System (AVS) graphics module designed for crystallographic and molecular research. The program can handle molecules, two-dimensional periodic systems, and three-dimensional periodic systems, all referred to in the paper as models. Using several methods, the user can select atoms, groups of atoms, or entire molecules. Selections can be moved, copied, deleted, and merged. An important feature of Space is the crystallography component. The program allows the user to generate the unit cell from the asymmetric unit, manipulate the unit cell, and replicate it in three dimensions. Space includes the Buerger reduction algorithm which determines the asymmetric unit and the space group of highest symmetry of an input unit cell. Space also allows the user to display planes in the lattice based on Miller indices, and to cleave the crystal to expose the surface. The user can display important precalculated volumetric data in Space, such as electron densities and electrostatic surfaces. With a variety of methods, Space can compute the electrostatic potential of any chemical system based on input point charges.

  11. Bootstrap performance profiles in stochastic algorithms assessment

    SciTech Connect (OSTI)

    Costa, Lino; Espírito Santo, Isabel A.C.P.; Oliveira, Pedro

    2015-03-10

    Optimization with stochastic algorithms has become a relevant research field. Due to its stochastic nature, its assessment is not straightforward and involves integrating accuracy and precision. Performance profiles for the mean do not show the trade-off between accuracy and precision, and parametric stochastic profiles require strong distributional assumptions and are limited to the mean performance for a large number of runs. In this work, bootstrap performance profiles are used to compare stochastic algorithms for different statistics. This technique allows the estimation of the sampling distribution of almost any statistic even with small samples. Multiple comparison profiles are presented for more than two algorithms. The advantages and drawbacks of each assessment methodology are discussed.
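
    The core resampling idea behind such profiles can be sketched in a few lines. This is a generic bootstrap of a statistic over repeated solver runs, not the authors' code; the run data, solver names, and sample sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_stat(samples, stat=np.median, n_boot=2000):
    """Estimate the sampling distribution of `stat` by resampling a small
    sample of stochastic-solver results with replacement."""
    samples = np.asarray(samples, float)
    idx = rng.integers(0, len(samples), size=(n_boot, len(samples)))
    return stat(samples[idx], axis=1)

# Hypothetical best-objective values from 10 runs of two stochastic solvers
runs_a = rng.normal(1.0, 0.1, 10)
runs_b = rng.normal(1.2, 0.3, 10)
dist_a = bootstrap_stat(runs_a)
dist_b = bootstrap_stat(runs_b)
# Fraction of bootstrap replicates in which solver A's median beats B's
p_better = float(np.mean(dist_a < dist_b))
```

    Because the bootstrap approximates the sampling distribution of almost any statistic, the same machinery works for medians, quantiles, or trimmed means, which is what lets the profiles show accuracy and precision together rather than just mean performance.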

  12. Nonlinear Global Optimization Using Curdling Algorithm

    Energy Science and Technology Software Center (OSTI)

    1996-03-01

    An algorithm for performing curdling optimization, which is a derivative-free, grid-refinement approach to nonlinear optimization, was developed and implemented in software. This approach overcomes a number of deficiencies in existing approaches. Most notably, it finds extremal regions rather than only single extremal points. The program is interactive and collects information on control parameters and constraints using menus. For up to four dimensions, function convergence is displayed graphically. Because the algorithm does not compute derivatives, gradients, or vectors, it is numerically stable. It can find all the roots of a polynomial in one pass. It is an inherently parallel algorithm. Constraints are handled as being initially fuzzy, but become tighter with each iteration.
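
    The general shape of a derivative-free grid-refinement minimizer can be sketched as follows. This is a generic illustration in the spirit of the approach, not the Sandia implementation; the function name, refinement schedule, and parameters are assumptions.

```python
import numpy as np

def grid_refine_min(f, lo, hi, depth=6, keep=3, per_dim=5):
    """Derivative-free grid-refinement minimization: evaluate f on a coarse
    grid in each surviving box, shrink a new box around each box's best
    point, and keep the `keep` most promising boxes per level."""
    boxes = [(np.asarray(lo, float), np.asarray(hi, float))]
    best_val, best_pt = np.inf, None
    for _ in range(depth):
        scored = []
        for lo_b, hi_b in boxes:
            axes = [np.linspace(l, h, per_dim) for l, h in zip(lo_b, hi_b)]
            grids = np.meshgrid(*axes, indexing="ij")
            pts = np.stack([g.ravel() for g in grids], axis=1)
            vals = np.array([f(p) for p in pts])
            i = int(np.argmin(vals))
            if vals[i] < best_val:
                best_val, best_pt = vals[i], pts[i]
            w = (hi_b - lo_b) / per_dim       # new half-width per dimension
            scored.append((vals[i], pts[i] - w, pts[i] + w))
        scored.sort(key=lambda t: t[0])
        boxes = [(l, h) for _, l, h in scored[:keep]]
    return best_val, best_pt
```

    Keeping several boxes per level, rather than a single best point, is what lets this family of methods return extremal regions; grid evaluations within a level are independent, which is the source of the inherent parallelism noted above.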

  13. Parallelism of the SANDstorm hash algorithm.

    SciTech Connect (OSTI)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree

    2009-09-01

    Mainstream cryptographic hashing algorithms are not parallelizable. This limits their speed, and they are not able to take advantage of the current trend of being run on multi-core platforms. Being limited in speed limits their usefulness as an authentication mechanism in secure communications. Sandia researchers have created a new cryptographic hashing algorithm, SANDstorm, which was specifically designed to take advantage of multi-core processing and be parallelizable on a wide range of platforms. This report describes a late-start LDRD effort to verify the parallelizability claims of the SANDstorm designers. We have shown, with operating code and bench testing, that the SANDstorm algorithm may be trivially parallelized on a wide range of hardware platforms. Implementations using OpenMP demonstrate a linear speedup with multiple cores. We have also shown significant performance gains with optimized C code and the use of assembly instructions to exploit particular platform capabilities.

  14. Theoretical priors on modified growth parametrisations

    SciTech Connect (OSTI)

    Song, Yong-Seon; Hollenstein, Lukas; Caldera-Cabral, Gabriela; Koyama, Kazuya E-mail: Lukas.Hollenstein@unige.ch E-mail: Kazuya.Koyama@port.ac.uk

    2010-04-01

    Next generation surveys will observe the large-scale structure of the Universe with unprecedented accuracy. This will enable us to test the relationships between matter over-densities, the curvature perturbation and the Newtonian potential. Any large-distance modification of gravity or exotic nature of dark energy modifies these relationships as compared to those predicted in the standard smooth dark energy model based on General Relativity. In the linear theory of structure growth, such modifications are often parameterised by two functions of space and time that enter the relation of the curvature perturbation to, first, the matter over-density, and second, the Newtonian potential. We investigate the predictions for these functions in Brans-Dicke theory, clustering dark energy models and interacting dark energy models. We find that each theory has a distinct path in the parameter space of modified growth. Understanding these theoretical priors on the parameterisations of modified growth is essential to reveal the nature of cosmic acceleration with the help of upcoming observations of structure formation.

  15. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    SciTech Connect (OSTI)

    Craig G. Rieger

    2014-08-01

    Resilience describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and a metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish proper operation and impact. A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  16. Berkeley Algorithms Help Researchers Understand Dark Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Berkeley Algorithms Help Researchers Understand Dark Energy Berkeley Algorithms Help Researchers Understand Dark Energy November 24, 2014 Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov Scientists believe that dark energy-the mysterious force that is accelerating cosmic expansion-makes up about 70 percent of the mass and energy of the universe. But because they don't know what it is, they cannot observe it directly. To unlock the mystery of dark energy and its influence on the universe,

  17. Graph algorithms in the titan toolkit.

    SciTech Connect (OSTI)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  18. Theoretical Model for Nanoporous Carbon Supercapacitors

    SciTech Connect (OSTI)

    Sumpter, Bobby G; Meunier, Vincent; Huang, Jingsong

    2008-01-01

    The unprecedented anomalous increase in capacitance of nanoporous carbon supercapacitors at pore sizes smaller than 1 nm [Science 2006, 313, 1760] challenges the long-held presumption that pores smaller than the size of solvated electrolyte ions do not contribute to energy storage. We propose a heuristic model to replace the commonly used model for an electric double-layer capacitor (EDLC) on the basis of an electric double-cylinder capacitor (EDCC) for mesopores (2-50 nm pore size), which becomes an electric wire-in-cylinder capacitor (EWCC) for micropores (< 2 nm pore size). Our analysis of the available experimental data in the micropore regime is confirmed by first-principles density functional theory calculations and reveals significant curvature effects for carbon capacitance. The EDCC (and/or EWCC) model allows the supercapacitor properties to be correlated with pore size, specific surface area, Debye length, electrolyte concentration and dielectric constant, and solute ion size. The new model not only explains the experimental data, but also offers a practical direction for the optimization of the properties of carbon supercapacitors through experiments.
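
    The planar/cylindrical crossover described above can be summarized by the area-normalized capacitances of the three geometries. These are my reconstruction of the forms commonly quoted for such models, not formulas taken from this abstract; here b is the pore radius, d the electric double-layer thickness, and a_0 the effective ion radius.

```latex
% EDLC (planar), EDCC (mesopore), EWCC (micropore)
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{d},
\qquad
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln\!\bigl(b/(b-d)\bigr)},
\qquad
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln\!\bigl(b/a_0\bigr)}
```

    In the limit of large pore radius, b ln(b/(b-d)) tends to d, so the EDCC expression smoothly recovers the planar EDLC result, which is the curvature effect the abstract refers to.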

  19. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support ''HNF-3553, Spent Nuclear Fuel Project Final Safety, Analysis Report, Annex A,'' ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  20. Solar Power Tower Design Basis Document, Revision 0

    SciTech Connect (OSTI)

    ZAVOICO,ALEXIS B.

    2001-07-01

    This report contains the design basis for a generic molten-salt solar power tower. A solar power tower uses a field of tracking mirrors (heliostats) that redirect sunlight onto a centrally located receiver mounted on top of a tower, which absorbs the concentrated sunlight. Molten nitrate salt, pumped from a tank at ground level, absorbs the sunlight, heating it up to 565 °C. The heated salt flows back to ground level into another tank where it is stored, then pumped through a steam generator to produce steam and make electricity. This report establishes a set of criteria upon which the next generation of solar power towers will be designed. The report contains detailed criteria for each of the major systems: Collector System, Receiver System, Thermal Storage System, Steam Generator System, Master Control System, and Electric Heat Tracing System. The Electric Power Generation System and Balance of Plant discussions are limited to interface requirements. This design basis builds on the extensive experience gained from the Solar Two project and includes potential design innovations that will improve reliability and lower technical risk. This design basis document is a living document and contains several areas that require trade studies and design analysis to fully complete the design basis. Project- and site-specific conditions and requirements will also resolve open To Be Determined issues.

  1. CRAD, Safety Basis- Idaho MF-628 Drum Treatment Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a May 2007 readiness assessment of the Safety Basis at the Advanced Mixed Waste Treatment Project.

  2. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  3. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provided the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  4. Canister storage building design basis accident analysis documentation

    SciTech Connect (OSTI)

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  5. CRAD, Safety Basis- Idaho Accelerated Retrieval Project Phase II

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2006 Commencement of Operations assessment of the Safety Basis at the Idaho Accelerated Retrieval Project Phase II.

  6. Gamma-ray Spectral Analysis Algorithm Library

    Energy Science and Technology Software Center (OSTI)

    1997-09-25

    The routines of the Gauss Algorithm library are used to implement special purpose products that need to analyze gamma-ray spectra from GE semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.
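
    The peak-search capability described above can be illustrated generically. This is a minimal sketch of a significance-based peak search over a gamma-ray spectrum, not the Gauss Algorithms library API; the function names, window size, and threshold are assumptions.

```python
import numpy as np

def gaussian(x, amp, mu, sigma):
    """Gaussian peak shape, the usual model for a Ge-detector photopeak."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def find_peak(counts, threshold=5.0):
    """Crude peak search: flag the channel that exceeds a moving-average
    background estimate by more than `threshold` Poisson standard deviations."""
    counts = np.asarray(counts, float)
    bg = np.convolve(counts, np.ones(11) / 11, mode="same")
    signif = (counts - bg) / np.sqrt(np.maximum(bg, 1.0))
    signif[:5] = 0.0          # edges: the moving average is biased there
    signif[-5:] = 0.0
    i = int(np.argmax(signif))
    return i if signif[i] > threshold else None
```

    A production library would follow the search with a region fit of one or more Gaussians plus a background term to extract centroid, width, and area; this sketch only locates the candidate channel.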

  7. Gamma-ray spectral analysis algorithm library

    Energy Science and Technology Software Center (OSTI)

    2013-05-06

    The routines of the Gauss Algorithms library are used to implement special purpose products that need to analyze gamma-ray spectra from Ge semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.

  8. Control algorithms for autonomous robot navigation

    SciTech Connect (OSTI)

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  9. Experimental and Theoretical Investigation of Lubricant and Additive

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Effects on Engine Friction | Department of Energy Theoretical Investigation of Lubricant and Additive Effects on Engine Friction Experimental and Theoretical Investigation of Lubricant and Additive Effects on Engine Friction Combining data from motored engine friction, a theoretical engine model, a line friction contact rig, and a fired engine can provide better insight to lube oil and additive performance. PDF icon p-02_rohr.pdf More Documents & Publications Validation of a Small Engine

  10. Theoretical Predictions of the Thermodynamic Properties of Solid Sorbents

    Office of Scientific and Technical Information (OSTI)

    Conference: Theoretical Predictions of the Thermodynamic Properties of Solid Sorbents for CO{sub 2} Capture Applications. We are establishing a theoretical procedure to identify the most promising candidate CO{sub 2} solid sorbents from a large solid material databank to meet the DOE programmatic goal for ...

  11. Theoretical Description of the Fission Process

    SciTech Connect (OSTI)

    Witold Nazarewicz

    2009-10-25

    Advanced theoretical methods and high-performance computers may finally unlock the secrets of nuclear fission, a fundamental nuclear decay that is of great relevance to society. In this work, we studied the phenomenon of spontaneous fission using the symmetry-unrestricted nuclear density functional theory (DFT). Our results show that many observed properties of fissioning nuclei can be explained in terms of pathways in multidimensional collective space corresponding to different geometries of fission products. From the calculated collective potential and collective mass, we estimated spontaneous fission half-lives, and good agreement with experimental data was found. We also predicted a new phenomenon of trimodal spontaneous fission for some transfermium isotopes. Our calculations demonstrate that fission barriers of excited superheavy nuclei vary rapidly with particle number, pointing to the importance of shell effects even at large excitation energies. The results are consistent with recent experiments where superheavy elements were created by bombarding an actinide target with calcium-48; yet even at high excitation energies, sizable fission barriers remained. Not only does this reveal clues about the conditions for creating new elements, but it also provides a wider context for understanding other types of fission. Understanding of the fission process is crucial for many areas of science and technology. Fission governs the existence of many transuranium elements, including the predicted long-lived superheavy species. In nuclear astrophysics, fission influences the formation of heavy elements in the final stages of the r-process in a very high neutron density environment. Fission applications are numerous. Improved understanding of the fission process will enable scientists to enhance the safety and reliability of the nation's nuclear stockpile and nuclear reactors.
The deployment of a fleet of safe and efficient advanced reactors, which will also minimize radiotoxic waste and be proliferation-resistant, is a goal for the advanced nuclear fuel cycles program. While in the past the design, construction, and operation of reactors were supported through empirical trials, this new phase in nuclear energy production is expected to heavily rely on advanced modeling and simulation capabilities.

  12. Theoretical Studies of Hydrogen Storage Alloys.

    SciTech Connect (OSTI)

    Jonsson, Hannes

    2012-03-22

    Theoretical calculations were carried out to search for lightweight alloys that can be used to reversibly store hydrogen in mobile applications, such as automobiles. Our primary focus was on magnesium based alloys. While MgH{sub 2} is in many respects a promising hydrogen storage material, there are two serious problems which need to be solved in order to make it useful: (i) the binding energy of the hydrogen atoms in the hydride is too large, causing the release temperature to be too high, and (ii) the diffusion of hydrogen through the hydride is so slow that loading of hydrogen into the metal takes much too long. In the first year of the project, we found that the addition of ca. 15% of aluminum decreases the hydrogen binding energy to the target value of 0.25 eV, which corresponds to release of 1 bar hydrogen gas at 100 degrees C. Also, the addition of ca. 15% of transition metal atoms, such as Ti or V, reduces the formation energy of interstitial H-atoms, making the diffusion of H-atoms through the hydride more than ten orders of magnitude faster at room temperature. In the second year of the project, several calculations of alloys of magnesium with various other transition metals were carried out, and systematic trends in stability, hydrogen binding energy, and diffusivity were established. Some calculations of ternary alloys and their hydrides were also carried out, for example of Mg{sub 6}AlTiH{sub 16}. It was found that the binding energy reduction due to the addition of aluminum and the increased diffusivity due to the addition of a transition metal are both effective at the same time. This material would in principle work well for hydrogen storage, but it is, unfortunately, unstable with respect to phase separation. A search was made for a ternary alloy of this type where both the alloy and the corresponding hydride are stable. Promising results were obtained by including Zn in the alloy.
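The quoted correspondence between a 0.25 eV binding energy and 1 bar hydrogen release at roughly 100 degrees C follows from the van't Hoff relation, T(1 bar) = dH/dS. A quick check, assuming (as is typical for metal hydrides, though not stated in the abstract) that the standard entropy of H{sub 2} release, about 130 J/(mol K), dominates dS:

```python
# Van't Hoff estimate: equilibrium pressure reaches 1 bar at T = dH / dS.
EV_TO_KJ_PER_MOL = 96.485   # 1 eV per particle expressed in kJ/mol
DS = 130.0                  # J/(mol H2 * K), typical hydride release entropy

binding_ev_per_atom = 0.25
# Two H atoms per H2 molecule, converted to J per mol H2.
dH = 2 * binding_ev_per_atom * EV_TO_KJ_PER_MOL * 1000.0
T_release = dH / DS
print(f"{T_release - 273.15:.0f} C")  # roughly 100 C, as quoted
```

The constants are rounded illustrative values; the abstract's figure is consistent with them to within a few degrees.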

  13. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville), ...

  14. Theoretical/Computational Tools for Energy-Relevant Catalysis...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical/Computational Tools for Energy-Relevant Catalysis FWP. Project Description: Project Leader(s): James Evans, Mark Gordon. Principal Investigators: James Evans, Mark Gordon ...

  15. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect (OSTI)

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade studies, which require a reference cost basis to support adequate analytical rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), the “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. This report supports the application of the reference cost data in the cost and econometric systems analysis models. The methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market, both domestically and internationally, and impacts on AFCI facility deployment; uranium resource modeling to inform the front-end fuel cycle costs; facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities; cost tradeoffs to meet nuclear non-proliferation requirements; and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle.
Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from weaning the U.S. from energy imports (e.g., measures of energy self-sufficiency), and minimization of future high level waste (HLW) repositories world-wide.

  16. Semi-Implicit Reversible Algorithms for Rigid Body Rotational Dynamics

    SciTech Connect (OSTI)

    Nukala, Phani K; Shelton Jr, William Allison

    2006-09-01

    This paper presents two semi-implicit algorithms based on splitting methodology for rigid body rotational dynamics. The first algorithm is a variation of partitioned Runge-Kutta (PRK) methodology that can be formulated as a splitting method. The second algorithm is akin to a multiple time stepping scheme and is based on modified Crouch-Grossman (MCG) methodology, which can also be expressed as a splitting algorithm. These algorithms are second-order accurate and time-reversible; however, they are not Poisson integrators, i.e., they are non-symplectic. The algorithms conserve some of the first integrals of motion but not others; the fluctuations in the non-conserved invariants, however, remain bounded over exponentially long time intervals. These algorithms exhibit excellent long-term behavior because of their reversibility and their (approximate) Poisson structure preserving property. The numerical results indicate that the proposed algorithms exhibit superior performance compared to some of the currently well known algorithms such as the Simo-Wong algorithm, the Newmark algorithm, the discrete Moser-Veselov algorithm, the Lewis-Simo algorithm, and the LIEMID[EA] algorithm.
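The splitting idea can be illustrated on the free rigid body: each principal-axis term of the rotational kinetic energy generates an exact rotation of the body angular momentum about that axis, and a symmetric (Strang) composition of these sub-flows gives a second-order, time-reversible step. This is a classical axis-splitting sketch, not the paper's PRK or MCG schemes:

```python
import math

def rot(L, axis, h, I):
    """Exact flow of the sub-energy L_a^2/(2 I_a): the body angular
    momentum L rotates about principal axis a at rate L_a / I_a."""
    a = axis
    b, c = (a + 1) % 3, (a + 2) % 3
    theta = h * L[a] / I[a]
    ct, st = math.cos(theta), math.sin(theta)
    out = L[:]
    out[b] = ct * L[b] + st * L[c]
    out[c] = -st * L[b] + ct * L[c]
    return out

def strang_step(L, h, I):
    """Symmetric composition: half steps on axes 0 and 1 wrapped
    around a full step on axis 2; second order and time-reversible."""
    for axis, frac in [(0, 0.5), (1, 0.5), (2, 1.0), (1, 0.5), (0, 0.5)]:
        L = rot(L, axis, frac * h, I)
    return L

I = [1.0, 2.0, 3.0]          # principal moments of inertia
L = [1.0, 0.5, -0.2]         # body angular momentum
E0 = sum(L[k] ** 2 / (2 * I[k]) for k in range(3))
for _ in range(1000):
    L = strang_step(L, 0.01, I)
E1 = sum(L[k] ** 2 / (2 * I[k]) for k in range(3))
print(abs(E1 - E0))  # small: the energy error stays bounded
```

Each sub-rotation preserves |L| exactly, so the Casimir |L|^2 is conserved to round-off, while the energy oscillates within an O(h^2) band, the bounded-invariant behavior the abstract describes.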

  17. Design-Load Basis for LANL Structures, Systems, and Components

    SciTech Connect (OSTI)

    I. Cuesta

    2004-09-01

    This document supports the recommendations in the Los Alamos National Laboratory (LANL) Engineering Standard Manual (ESM), Chapter 5 (Structural), providing the basis for the loads, analysis procedures, and codes to be used in the ESM. It also provides the justification for eliminating certain loads from consideration in design, and evidence that the design basis loads are appropriate and consistent with the graded approach required by the Department of Energy (DOE) nuclear safety management regulations (10 CFR Part 830). This document focuses on (1) the primary and secondary natural phenomena hazards listed in DOE-G-420.1-2, Appendix C, (2) additional loads not related to natural phenomena hazards, and (3) the design loads on structures during construction.

  18. Basis for NGNP Reactor Design Down-Selection

    SciTech Connect (OSTI)

    L.E. Demick

    2011-11-01

    The purpose of this paper is to identify the extent of technology development, design and licensing maturity anticipated to be required to credibly identify differences that could make a technical choice practical between the prismatic and pebble bed reactor designs. This paper does not address a business decision based on the economics, business model and resulting business case since these will vary based on the reactor application. The selection of the type of reactor, the module ratings, the number of modules, the configuration of the balance of plant and other design selections will be made on the basis of optimizing the Business Case for the application. These are not decisions that can be made on a generic basis.

  19. WIPP - Passive Institutional Controls (PICs) Technical and Conceptual Basis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Technical Basis Materials Analysis - evaluation of permanent marker conceptual design elements Monument Survey - examination of regional stone and ancient petroglyphs Ancient Cementitious Materials - literature review of sustainable manmade structures Conceptual design, plans, and description - as submitted to U.S. EPA as part of the application to open WIPP 10,000 years of Solitude - independent analysis - 1990 Expert Judgment on Inadvertent Intrusion into WIPP - 1991 Expert Judgment on Markers

  20. Interim Safety Basis for Fuel Supply Shutdown Facility

    SciTech Connect (OSTI)

    BENECKE, M.W.

    2000-09-07

    This ISB, in conjunction with the IOSR, provides the required basis for interim operation or restrictions on interim operations, and administrative controls for the facility until a SAR is prepared in accordance with the new requirements or the facility is shut down. It is concluded that the risks associated with the current and anticipated modes of the facility, uranium disposition, cleanup, and transition activities required for permanent closure, are within risk guidelines.

  1. Online Monitoring Technical Basis and Analysis Framework for Large Power

    Energy Savers [EERE]

    Online Monitoring Technical Basis and Analysis Framework for Large Power Transformers; Interim Report for FY 2012. The Light Water Reactor Sustainability Program is a research, development, and deployment program sponsored by the U.S. Department of Energy Office of Nuclear Energy. The program is operated in collaboration with the Electric Power Research Institute's (EPRI's) ...

  2. A garbage collection algorithm for shared memory parallel processors

    SciTech Connect (OSTI)

    Crammond, J.

    1988-12-01

    This paper describes a technique for adapting the Morris sliding garbage collection algorithm to execute on parallel machines with shared memory. The algorithm is described within the framework of an implementation of the parallel logic language Parlog. However, the algorithm is a general one and can easily be adapted to parallel Prolog systems and to other languages. The performance of the algorithm executing a few simple Parlog benchmarks is analyzed. Finally, it is shown how the technique for parallelizing the sequential algorithm can be adapted for a semi-space copying algorithm.
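The sliding (order-preserving) compaction that the Morris algorithm parallelizes can be sketched in a simple stop-the-world, non-parallel form. This toy model uses a list-of-cells heap and an explicit address map, rather than Morris's in-place pointer threading:

```python
def sliding_compact(heap, roots):
    """Minimal sliding (order-preserving) mark-compact sketch.
    heap: list where each cell is either None (free/garbage) or a
    list of heap indices it references. roots: list of root indices."""
    # Mark phase: find cells reachable from the roots.
    live, stack = set(), list(roots)
    while stack:
        i = stack.pop()
        if i in live:
            continue
        live.add(i)
        stack.extend(heap[i])
    # Compute new addresses, preserving allocation order (the "slide").
    new_addr, next_free = {}, 0
    for i in range(len(heap)):
        if i in live:
            new_addr[i] = next_free
            next_free += 1
    # Move live cells down and rewrite their references.
    new_heap = [None] * len(heap)
    for i in live:
        new_heap[new_addr[i]] = [new_addr[j] for j in heap[i]]
    new_roots = [new_addr[r] for r in roots]
    return new_heap, new_roots

heap = [[2], None, [4], None, []]   # cells 1 and 3 are garbage
print(sliding_compact(heap, roots=[0]))  # live cells slide to 0, 1, 2
```

The paper's problem is harder: multiple processors must cooperate on the mark and slide phases over a shared heap without this global serial pass.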

  3. Theoretical minimum energies to produce steel for selected conditions

    SciTech Connect (OSTI)

    Fruehan, R. J.; Fortini, O.; Paxton, H. W.; Brindle, R.

    2000-03-01

    An ITP study has determined the theoretical minimum energy requirements for producing steel from ore, scrap, and direct reduced iron. Dr. Richard Fruehan's report, Theoretical Minimum Energies to Produce Steel for Selected Conditions (PDF, 459 KB), provides insight into the potential energy savings (and associated reductions in carbon dioxide emissions) for ironmaking, steelmaking, and rolling processes.
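The flavor of such a theoretical-minimum calculation can be shown for the simplest case, melting scrap iron: sensible heat to the melting point plus the latent heat of fusion. The property values below are rounded textbook figures used for illustration, not the report's numbers:

```python
# Back-of-envelope theoretical minimum to melt scrap iron from 25 C.
CP_IRON = 0.70         # kJ/(kg K), rough average over 25-1536 C
T_MELT = 1536.0        # C, melting point of pure iron
LATENT_FUSION = 247.0  # kJ/kg, heat of fusion of iron

sensible = CP_IRON * (T_MELT - 25.0)   # heat to reach the melting point
total = sensible + LATENT_FUSION       # plus the heat of melting
print(f"{total:.0f} kJ/kg  (~{total / 1000:.1f} GJ/tonne)")
```

This lands near 1.3 GJ/tonne, the right order of magnitude for scrap melting; producing steel from ore adds the (much larger) chemical energy of reducing iron oxide, which is why the route chosen dominates the theoretical minimum.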

  4. new wind-turbine controls algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    New wind-turbine controls algorithms - Sandia Energy.

  5. CRAD, Safety Basis Upgrade Review (DOE-STD-3009-2014) - May 15...

    Office of Environmental Management (EM)

    1) provides objectives, criteria, and approaches for establishing and maintaining the safety basis at nuclear facilities. CRAD, Safety Basis Upgrade Review (DOE-STD-3009-2014)...

  6. Scaling Up Coordinate Descent Algorithms for Large L1 Regularization Problems

    SciTech Connect (OSTI)

    Scherrer, Chad; Halappanavar, Mahantesh; Tewari, Ambuj; Haglin, David J.

    2012-07-03

    We present a generic framework for parallel coordinate descent (CD) algorithms that has as special cases the original sequential algorithms of Cyclic CD and Stochastic CD, as well as the recent parallel Shotgun algorithm of Bradley et al. We introduce two novel parallel algorithms that are also special cases (Thread-Greedy CD and Coloring-Based CD) and give performance measurements for an OpenMP implementation of these.
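For the L1-regularized least-squares problems targeted here, the sequential Cyclic CD special case reduces each coordinate update to a soft-thresholding step; a minimal sketch of that baseline (not the Shotgun or Thread-Greedy parallel variants):

```python
def soft_threshold(z, t):
    """Shrink z toward zero by t; the proximal operator of t*|x|."""
    return (z - t) if z > t else (z + t) if z < -t else 0.0

def cyclic_cd_lasso(A, b, lam, iters=200):
    """Cyclic coordinate descent for 0.5*||Ax - b||^2 + lam*||x||_1.
    A is a list of rows; b a list of targets."""
    n, d = len(A), len(A[0])
    x = [0.0] * d
    col_sq = [sum(A[i][j] ** 2 for i in range(n)) for j in range(d)]
    for _ in range(iters):
        for j in range(d):
            # Residual with coordinate j's contribution removed.
            r = [b[i] - sum(A[i][k] * x[k] for k in range(d) if k != j)
                 for i in range(n)]
            rho = sum(A[i][j] * r[i] for i in range(n))
            x[j] = soft_threshold(rho, lam) / col_sq[j] if col_sq[j] else 0.0
    return x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 2.0, 3.0]
x = cyclic_cd_lasso(A, b, lam=0.1)
print(x)  # close to the least-squares fit, shrunk slightly toward zero
```

The parallel question the paper studies is which of these coordinate updates can safely run concurrently, since updates to correlated coordinates interfere.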

  7. RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT

    SciTech Connect (OSTI)

    KOZLOWSKI, S.D.

    2007-05-30

    This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA) and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.

  8. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect (OSTI)

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, T{sub -m,n} = 2{sup -n} {Sigma}{sub k=0}{sup n} (n choose k) q{sup k} p{sup -m} q{sup n-k}, where q and p are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line R for positive integers n and m. We show, by explicit construction of a dense domain, that the operators T{sub -m,n} are densely defined operators in the Hilbert space L{sup 2}(R).

  9. Guidance For Preparation of Basis For Interim Operation (BIO) Documents

    Energy Savers [EERE]

    DOE-STD-3011-2002, December 2002, superseding DOE-STD-3011-94, November 1994. DOE Standard: Guidance for Preparation of Basis for Interim Operation (BIO) Documents. U.S. Department of Energy, Washington, D.C. 20585. Area SAFT. Distribution Statement A: approved for public release; distribution is unlimited.

  10. Preparation of Safety Basis Documents for Transuranic (TRU) Waste Facilities

    Energy Savers [EERE]

    DOE-STD-5506-2007, April 2007. DOE Standard: Preparation of Safety Basis Documents for Transuranic (TRU) Waste Facilities. U.S. Department of Energy, Washington, D.C. 20585. Area SAFT. Distribution Statement A: approved for public release; distribution is unlimited. Available on the Department of Energy Technical Standards Program web site at http://tis.eh.doe.gov/techstds/. Foreword: This Standard provides analytical assumptions and methods, as well as hazard controls ...

  11. Modeling and Algorithmic Approaches to Constitutively-Complex, Micro-structured Fluids

    SciTech Connect (OSTI)

    Forest, Mark Gregory (University of North Carolina at Chapel Hill)

    2014-05-06

    The team for this Project made significant progress on modeling and algorithmic approaches to hydrodynamics of fluids with complex microstructure. Our advances are broken down into modeling and algorithmic approaches. In experiments a driven magnetic bead in a complex fluid accelerates out of the Stokes regime and settles into another apparent linear response regime. The modeling explains the take-off as a deformation of entanglements, and the longtime behavior is a nonlinear, far-from-equilibrium property. Furthermore, the model has predictive value, as we can tune microstructural properties relative to the magnetic force applied to the bead to exhibit all possible behaviors. Wave-theoretic probes of complex fluids have been extended in two significant directions, to small volumes and the nonlinear regime. Heterogeneous stress and strain features that lie beyond experimental capability were studied. It was shown that nonlinear penetration of boundary stress in confined viscoelastic fluids is not monotone, indicating the possibility of interlacing layers of linear and nonlinear behavior, and thus layers of variable viscosity. Models, algorithms, and codes were developed and simulations performed leading to phase diagrams of nanorod dispersion hydrodynamics in parallel shear cells and confined cavities representative of film and membrane processing conditions. Hydrodynamic codes for polymeric fluids are extended to include coupling between microscopic and macroscopic models, and to the strongly nonlinear regime.

  12. A Faster Parallel Algorithm and Efficient Multithreaded Implementations for Evaluating Betweenness Centrality on Massive Datasets

    SciTech Connect (OSTI)

    Madduri, Kamesh; Ediger, David; Jiang, Karl; Bader, David A.; Chavarria-Miranda, Daniel

    2009-02-15

    We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in HPCS SSCA#2, a benchmark extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the Threadstorm processor, and a single-socket Sun multicore server with the UltraSPARC T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
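The underlying kernel is Brandes' algorithm: a breadth-first search from each source followed by a reverse accumulation of pair dependencies. A sequential sketch for unweighted, undirected graphs (the lock-free parallel version partitions this outer loop over sources):

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted betweenness centrality.
    adj: dict node -> list of neighbors (undirected graph)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, preds = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:                                     # BFS phase
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                 # dependency accumulation
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # Each unordered pair is counted twice in an undirected graph.
    return {v: c / 2.0 for v, c in bc.items()}

path = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
bc = betweenness(path)
print(bc)  # interior nodes 2 and 3 each sit on two shortest paths
```

The parallel challenge addressed in the paper is that the per-source searches share the bc accumulator, which naive locking makes a bottleneck.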

  13. US-VISIT Identity Matching Algorithm Evaluation Program: ADIS Algorithm Evaluation Project Plan Update

    SciTech Connect (OSTI)

    Grant, C W; Lenderman, J S; Gansemer, J D

    2011-02-24

    This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect modified deliverables reflecting delays in obtaining a database refresh. This document describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).

  14. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect (OSTI)

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  15. Algorithms for Contact in a Mulitphysics Environment

    Energy Science and Technology Software Center (OSTI)

    2001-12-19

    Many codes require either a contact capability or a need to determine geometric proximity of non-connected topological entities (which is a subset of what contact requires). ACME is a library to provide services to determine contact forces and/or geometric proximity interactions. This includes generic capabilities such as determining points in Cartesian volumes, finding faces in Cartesian volumes, etc. ACME can be run in single or multi-processor mode (the basic algorithms have been tested up to 4500 processors).

  16. A Monte Carlo algorithm for degenerate plasmas

    SciTech Connect (OSTI)

    Turrell, A.E.; Sherlock, M.; Rose, S.J.

    2013-09-15

    A procedure for performing Monte Carlo calculations of plasmas with an arbitrary level of degeneracy is outlined. It has possible applications in inertial confinement fusion and astrophysics. Degenerate particles are initialised according to the Fermi-Dirac distribution function, and scattering is via a Pauli-blocked binary collision approximation. The algorithm is tested against degenerate electron-ion equilibration, and the degenerate resistivity transport coefficient from unmagnetised first order transport theory. The code is applied to the cold fuel shell and alpha particle equilibration problem of inertial confinement fusion.
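Initialising degenerate particles according to the Fermi-Dirac distribution can be done by rejection sampling; a sketch for nonrelativistic kinetic energies (the uniform envelope and energy cutoff below are illustrative choices, not the paper's method):

```python
import math
import random

def sample_fermi_dirac(mu, T, n, e_max=None, seed=1):
    """Rejection-sample kinetic energies E with density
    g(E) ~ sqrt(E) / (exp((E - mu)/T) + 1)   (nonrelativistic DOS),
    using a uniform envelope over [0, e_max]. Energies in units of T."""
    rng = random.Random(seed)
    if e_max is None:
        e_max = mu + 20.0 * T          # occupation is negligible beyond this
    def g(e):
        return math.sqrt(e) / (math.exp((e - mu) / T) + 1.0)
    # Envelope height from a coarse scan, with a small safety margin.
    g_max = 1.05 * max(g(e_max * k / 1000.0) for k in range(1, 1001))
    out = []
    while len(out) < n:
        e = rng.uniform(0.0, e_max)
        if rng.uniform(0.0, g_max) < g(e):
            out.append(e)
    return out

# Strongly degenerate case: mu = 10 (in units of T). Most sampled
# energies should fall below the Fermi level.
samples = sample_fermi_dirac(mu=10.0, T=1.0, n=2000)
frac_below = sum(e < 10.0 for e in samples) / len(samples)
print(round(frac_below, 2))
```

In the degenerate limit the distribution approaches a filled Fermi sphere, so the fraction below mu approaches one; the Pauli-blocked collisions in the paper are what keep the population there.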

  17. Algorithmic crystal chemistry: A cellular automata approach

    SciTech Connect (OSTI)

    Krivovichev, S. V.

    2012-01-15

    Atomic-molecular mechanisms of crystal growth can be modeled based on crystallochemical information using cellular automata (a particular case of finite deterministic automata). In particular, the formation of heteropolyhedral layered complexes in uranyl selenates can be modeled applying a one-dimensional three-colored cellular automaton. The use of the theory of calculations (in particular, the theory of automata) in crystallography allows one to interpret crystal growth as a computational process (the realization of an algorithm or program with a finite number of steps).
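A one-dimensional three-colored cellular automaton of the kind described is simple to state: each cell takes one of three states and updates synchronously from its left/center/right neighborhood. A sketch with an invented totalistic rule (the uranyl-selenate model uses a specific crystallochemical rule table, not this one):

```python
def step(row, rule):
    """One synchronous update of a 1D three-state cellular automaton.
    rule maps each (left, center, right) neighborhood, states in
    {0, 1, 2}, to the next state; boundaries wrap around."""
    n = len(row)
    return [rule[(row[(i - 1) % n], row[i], row[(i + 1) % n])]
            for i in range(n)]

# A totalistic example rule: next state = (left + center + right) mod 3.
rule = {(a, b, c): (a + b + c) % 3
        for a in range(3) for b in range(3) for c in range(3)}

row = [0, 0, 1, 0, 0]   # a single seed cell
for _ in range(3):
    row = step(row, rule)
print(row)
```

Reading successive rows as successive layers is exactly the sense in which the paper treats crystal growth as the run of a finite automaton.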

  18. A Unified Differential Evolution Algorithm for Global Optimization

    SciTech Connect (OSTI)

    Qiang, Ji; Mitchell, Chad

    2014-06-24

    In this paper, we propose a new unified differential evolution (uDE) algorithm for single objective global optimization. Instead of selecting among multiple mutation strategies as in the conventional differential evolution algorithm, this algorithm employs a single equation as the mutation strategy. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of different mutation strategies. Numerical tests using twelve basic unimodal and multimodal functions show promising performance of the proposed algorithm in comparison to conventional differential evolution algorithms.
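The overall structure can be sketched with a differential evolution loop whose mutation blends the classic DE/rand/1 and DE/best/1 strategies through a single weight. This is an illustrative stand-in for the idea of one parameterized mutation equation, not the paper's exact unified expression:

```python
import random

def de_minimize(f, bounds, pop_size=20, gens=150, F=0.7, CR=0.9,
                w=0.5, seed=2):
    """Differential evolution whose mutation base interpolates between
    a random individual (w=0) and the current best (w=1)."""
    rng = random.Random(seed)
    d = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        best = pop[min(range(pop_size), key=fit.__getitem__)]
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(d)          # guarantees one mutated gene
            trial = pop[i][:]
            for j in range(d):
                if rng.random() < CR or j == jrand:
                    base = (1 - w) * pop[r1][j] + w * best[j]
                    trial[j] = base + F * (pop[r2][j] - pop[r3][j])
            ft = f(trial)
            if ft <= fit[i]:                  # greedy selection
                pop[i], fit[i] = trial, ft
    return min(zip(fit, pop))

sphere = lambda x: sum(v * v for v in x)
best_f, best_x = de_minimize(sphere, [(-5.0, 5.0)] * 3)
print(best_f)
```

Sweeping the weight w recovers the behavior of different conventional strategies from one equation, which is the flexibility the abstract emphasizes.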

  19. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  20. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    SciTech Connect (OSTI)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
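The point-wise high/low-uncertainty classification can be illustrated with the simplest possible learner, a one-feature threshold classifier (decision stump) trained on invented marker features; the paper itself uses SVMs, Adaboost trees, and random forests on physics-based features:

```python
def train_stump(features, labels):
    """Fit a one-feature threshold classifier by exhaustive search.
    features: list of feature vectors; labels: 0/1 (low/high uncertainty).
    Returns (accuracy, feature_index, threshold, sense)."""
    best = None
    n_feat = len(features[0])
    for j in range(n_feat):
        for x in features:
            t = x[j]
            for sense in (1, 0):
                # sense=1 predicts 1 when feature > threshold, else inverted.
                pred = [int((f[j] > t) == bool(sense)) for f in features]
                acc = sum(p == y for p, y in zip(pred, labels)) / len(labels)
                if best is None or acc > best[0]:
                    best = (acc, j, t, sense)
    return best

# Toy data: feature 0 plays the role of a RANS-assumption "marker";
# points where it is large are labeled high-uncertainty (1).
X = [[0.1, 5.0], [0.2, 3.0], [0.9, 4.0], [1.5, 2.0], [1.1, 6.0]]
y = [0, 0, 1, 1, 1]
acc, feat, thresh, sense = train_stump(X, y)
print(acc, feat)
```

Tree ensembles such as random forests are, in essence, many such stumps and splits combined; the generalization question the paper studies is whether markers learned on canonical flows transfer to unseen configurations.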

  1. Neutron-Antineutron Oscillations: Theoretical Status and Experimental Prospects

    SciTech Connect (OSTI)

    Phillips, D. G.; Snow, W. M.; Babu, K.; Banerjee, S.; Baxter, D. V.; Berezhiani, Z.; Bergevin, M.; Bhattacharya, S.; Brooijmans, G.; Castellanos, L.; et al.,

    2014-10-04

    This paper summarizes the relevant theoretical developments, outlines some ideas to improve experimental searches for free neutron-antineutron oscillations, and suggests avenues for future improvement in the experimental sensitivity.

  2. Theoretical/best practice energy use in metalcasting operations

    SciTech Connect (OSTI)

    Schifo, J. F.; Radia, J. T.

    2004-05-01

    This study determined the theoretical minimum energy requirements for melting processes for all ferrous and nonferrous engineering alloys. The report also details the Best Practice energy consumption for the industry.

  3. An Adaptive Unified Differential Evolution Algorithm for Global Optimization

    SciTech Connect (OSTI)

    Qiang, Ji; Mitchell, Chad

    2014-11-03

    In this paper, we propose a new adaptive unified differential evolution algorithm for single-objective global optimization. Instead of the multiple mutation strategies proposed in conventional differential evolution algorithms, this algorithm employs a single equation unifying multiple strategies into one expression. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of the space of mutation operators. By making all control parameters in the proposed algorithm self-adaptively evolve during the process of optimization, it frees the application users from the burden of choosing appropriate control parameters and also improves the performance of the algorithm. In numerical tests using thirteen basic unimodal and multimodal functions, the proposed adaptive unified algorithm shows promising performance in comparison to several conventional differential evolution algorithms.
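
As a rough illustration of a single mutation expression combining multiple strategies, the sketch below mixes a best-directed term with random-difference terms in one equation. The weights F1-F3, the specific form of the expression, and the fixed (non-self-adaptive) control parameters are simplifications; the paper's actual unified equation and its self-adaptation scheme are not reproduced here.

```python
import random

def unified_de(f, bounds, pop_size=20, gens=100, F1=0.5, F2=0.3, F3=0.5, CR=0.9, seed=1):
    """Minimal differential evolution with one unified mutation expression:
    v_i = x_i + F1*(x_best - x_i) + F2*(x_r1 - x_i) + F3*(x_r2 - x_r3)
    (a sketch only; not the paper's exact equation)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        best = pop[min(range(pop_size), key=lambda i: fit[i])]
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = []
            for d in range(dim):
                v = (pop[i][d] + F1 * (best[d] - pop[i][d])
                     + F2 * (pop[r1][d] - pop[i][d])
                     + F3 * (pop[r2][d] - pop[r3][d]))
                # binomial crossover with the parent vector
                trial.append(v if rng.random() < CR else pop[i][d])
            fv = f(trial)
            if fv <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, fv
    return min(fit)

sphere = lambda x: sum(v * v for v in x)
print(unified_de(sphere, [(-5, 5)] * 3))  # converges to near zero on the sphere function
```

Setting F2 = F3 = 0 recovers a purely best-directed search, while F1 = 0 gives a rand-based strategy, which is the sense in which one expression spans several conventional mutation strategies.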

  4. Theoretical Study on Catalysis by Protein Enzymes and Ribozyme

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2000 NERSC Annual Report. The energetics were determined for three mechanisms proposed for TIM-catalyzed reactions. Results from reaction path calculations suggest that the two mechanisms that involve an enediol intermediate are likely to occur, while the direct intra-substrate proton transfer mechanism is energetically unfavorable due to the

  5. Modified Theoretical Minimum Emittance Lattice for an Electron Storage Ring with Extreme-Low Emittance (Journal Article)

    Office of Scientific and Technical Information (OSTI)

    Authors: Jiao, Yi; Cai, Yunhai; Chao, Alexander Wu (SLAC). Publication Date: 2013-06-04. OSTI Identifier: 1082826. Report Number(s): SLAC-REPRINT-2013-081.

  6. Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling (Technical Report)

    Office of Scientific and Technical Information (OSTI)

    Authors: Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville, TN 37996, USA); Prinja, Anil (University of New Mexico, USA); Danon, Yaron (Rensselaer Polytechnic Institute, USA)

  7. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling (Technical Report)

    Office of Scientific and Technical Information (OSTI)

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop

  8. The Geometry Of Disorder: Theoretical Investigations Of Quasicrystals And Frustrated Magnets: Quasi-Crystals And Quasi-Equivalence: Symmetries And Energies In Alloys And Biological Materials (Technical Report)

    Office of Scientific and Technical Information (OSTI)

  9. Theoretical atomic physics code development I: CATS: Cowan Atomic Structure Code (Technical Report)

    Office of Scientific and Technical Information (OSTI)

  10. Are There Practical Approaches for Achieving the Theoretical Maximum Engine Efficiency?

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2004 Diesel Engine Emissions Reduction (DEER) Conference Presentation: University of Wisconsin, Madison. PDF: 2004_deer_foster.pdf

  11. Experimental and theoretical investigations of non-centrosymmetric 8-hydroxyquinolinium dibenzoyl-(L)-tartrate methanol monohydrate single crystal (Journal Article)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Graphical abstract: ORTEP diagram of HQDBT.

  12. EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY UNDER RESERVOIR CONDITIONS (Technical Report)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  13. Research in theoretical nuclear and neutrino physics. Final report (Technical Report)

    Office of Scientific and Technical Information (OSTI)

    The main focus of the research supported by the nuclear theory grant DE-FG02-04ER41319 was on studying parton dynamics in high-energy heavy ion collisions, perturbative approach to charm production and its contribution to atmospheric neutrinos, application of

  14. Theoretical Synthesis of Mixed Materials for CO2 Capture Applications (Conference)

    Office of Scientific and Technical Information (OSTI)

    These pages provide an example of the layout and style required for the preparation of four-page papers for the TechConnect World 2015 technical proceedings. Documents must be submitted in electronic (Adobe PDF file) format. Please study the enclosed materials before beginning the

  15. Theoretical and experimental studies of electrified interfaces relevant to energy storage (Technical Report)

    Office of Scientific and Technical Information (OSTI)

    Advances in technology for electrochemical energy storage require increased understanding of electrolyte/electrode interfaces, including the electric double layer structure, and processes involved in

  16. Theoretical and experimental studies of electrified interfaces relevant to energy storage (Technical Report)

    Office of Scientific and Technical Information (OSTI)

  17. Theoretical calculating the thermodynamic properties of solid sorbents for CO{sub 2} capture applications (Technical Report)

    Office of Scientific and Technical Information (OSTI)

    Since current technologies for capturing CO{sub 2} to fight global climate change are still too energy intensive, there is a critical need for development

  18. Theoretical study on electromagnetically induced transparency in molecular aggregate models using quantum Liouville equation method (Journal Article)

    Office of Scientific and Technical Information (OSTI)

    Electromagnetically induced transparency (EIT), which is known as an efficient control method of

  19. Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) (Conference)

    Office of Scientific and Technical Information (OSTI)

    Catalysts have been used to speed up chemical reactions for as long as yeast has been used to make bread rise. Today, catalysts are used everywhere from home kitchens to industrial chemical factories. In the near future, new catalysts being

  20. Catalysis by Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx Treatment

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Presentation given at DEER 2006, August 20-24, 2006, Detroit, Michigan. Sponsored by the U.S. DOE's EERE FreedomCar and Fuel Partnership and 21st Century Truck Programs. PDF: 2006_deer_narula.pdf

  1. Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Emission Treatment Catalyst

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Poster presented at the 16th Directions in Engine-Efficiency and Emissions Research (DEER) Conference in Detroit, MI, September 27-30, 2010. PDF: p-08_narula.pdf

  2. Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Oxidation Catalyst for Diesel Engine Emission Treatment

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    The overlap among theory, structure, and fully formed catalysts forms the foundation of this study. PDF: deer09_narula.pdf

  3. Electronic structure basis for the titanic magnetoresistance in WTe2

    SciTech Connect (OSTI)

    Pletikosic, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe2 was identified.
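
The role of carrier compensation can be illustrated with the standard two-carrier (two-band) model, which is textbook material rather than the analysis in this paper: inverting the 2x2 conductivity tensor shows that for perfectly compensated electron and hole densities the magnetoresistance grows as mu_e*mu_h*B^2 without saturating, while any density imbalance makes it saturate at a much smaller value. The parameter values below are illustrative, not fitted to WTe2.

```python
def rho_xx(B, n, p, mu_e, mu_h, e=1.602e-19):
    """Longitudinal resistivity of a two-carrier (electron + hole) conductor
    in magnetic field B, from inverting the 2x2 conductivity tensor."""
    sxx = n * e * mu_e / (1 + (mu_e * B) ** 2) + p * e * mu_h / (1 + (mu_h * B) ** 2)
    sxy = (-n * e * mu_e ** 2 * B / (1 + (mu_e * B) ** 2)
           + p * e * mu_h ** 2 * B / (1 + (mu_h * B) ** 2))
    return sxx / (sxx ** 2 + sxy ** 2)

def mr(B, n, p, mu_e, mu_h):
    """Magnetoresistance ratio rho(B)/rho(0) - 1."""
    return rho_xx(B, n, p, mu_e, mu_h) / rho_xx(0.0, n, p, mu_e, mu_h) - 1.0

# Illustrative parameters: equal densities (compensated) vs. 2:1 imbalance
print(mr(10.0, 1e25, 1e25, 1.0, 1.0))  # compensated: MR = mu_e*mu_h*B^2 = 100
print(mr(10.0, 1e25, 2e25, 1.0, 1.0))  # uncompensated: MR saturates, far smaller
```

For equal electron and hole densities the Hall conductivity cancels exactly, so the quadratic growth never saturates, which is the qualitative argument for why near-perfect compensation can produce the extremely large magnetoresistance reported here.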

  4. Electronic structure basis for the titanic magnetoresistance in WTe2

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pletikosic, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe2 was identified.

  5. Draft Geologic Disposal Requirements Basis for STAD Specification

    SciTech Connect (OSTI)

    Ilgen, Anastasia G.; Bryan, Charles R.; Hardin, Ernest

    2015-03-25

    This document provides the basis for requirements in the current version of Performance Specification for Standardized Transportation, Aging, and Disposal Canister Systems, (FCRD-NFST-2014-0000579) that are driven by storage and geologic disposal considerations. Performance requirements for the Standardized Transportation, Aging, and Disposal (STAD) canister are given in Section 3.1 of that report. Here, the requirements are reviewed and the rationale for each provided. Note that, while FCRD-NFST-2014-0000579 provides performance specifications for other components of the STAD storage system (e.g. storage overpack, transfer and transportation casks, and others), these have no impact on the canister performance during disposal, and are not discussed here.

  6. Electronic structure basis for the extraordinary magnetoresistance in WTe2

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pletikosić, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe2 was identified.

  7. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2009-08-28

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.

  8. Daylighting simulation: methods, algorithms, and resources

    SciTech Connect (OSTI)

    Carroll, William L.

    1999-12-01

    This document presents work conducted as part of Subtask C, ''Daylighting Design Tools'', Subgroup C2, ''New Daylight Algorithms'', of the IEA SHC Task 21 and the ECBCS Program Annex 29 ''Daylight in Buildings''. The search for and collection of daylighting analysis methods and algorithms led to two important observations. First, there is a wide range of needs for different types of methods to produce a complete analysis tool. These include: geometry; light modeling; characterization of the natural illumination resource; materials and components properties and representations; and usability issues (interfaces, interoperability, representation of analysis results, etc.). Second, very advantageously, there have been rapid advances in many basic methods in these areas, due to other forces. They are in part driven by: the commercial computer graphics community (commerce, entertainment); the lighting industry; architectural rendering and visualization for projects; and academia (course materials, research). This has led to a very rich set of information resources that have direct applicability to the small daylighting analysis community. Furthermore, much of this information is in fact available online. Because much of the information about methods and algorithms is now online, an innovative reporting strategy was used: the core formats are electronic, and are used to produce a printed form only secondarily. The electronic forms include both online WWW pages and a downloadable .PDF file with the same appearance and content. Both electronic forms include live primary and indirect links to actual information sources on the WWW. In most cases, little additional commentary is provided regarding the information links or citations that are provided. This in turn allows the report to be very concise. The links are expected to speak for themselves. The report consists of only about 10+ pages, with about 100+ primary links, but with potentially thousands of indirect links.
    For purposes of the printed version, a list of the links is explicitly provided. This document exists in HTML form at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.html. An equivalent downloadable PDF version, also with live links, is available at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.pdf. A printed report can be derived directly from either of the electronic versions by simply printing either of them. In addition to the live links in the electronic forms, all report forms, electronic and paper, also have explicitly listed link addresses so that they can be followed up or referenced manually.

  9. Automated DNA Base Pair Calling Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-07-07

    The procedure solves the problem of calling the DNA base pair sequence from two channel electropherogram separations in an automated fashion. The core of the program involves a peak picking algorithm based upon first, second, and third derivative spectra for each electropherogram channel, signal levels as a function of time, peak spacing, base pair signal to noise sequence patterns, frequency vs ratio of the two channel histograms, and confidence levels generated during the run. The ratios of the two channels at peak centers can be used to accurately and reproducibly determine the base pair sequence. A further enhancement is a novel Gaussian deconvolution used to determine the peak heights used in generating the ratio.
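
A much-simplified sketch of two of the ideas named above: peak picking from a local-maximum criterion, and base calling from the two-channel ratio at peak centers. The real program uses first/second/third derivatives, spacing and signal-to-noise patterns, and Gaussian deconvolution, none of which are reproduced here; the two-class ratio cut and the toy data are hypothetical.

```python
def find_peaks(y, min_height=0.1):
    """Peak centers via a local-maximum test (a much-simplified stand-in
    for the program's derivative-based peak picking)."""
    peaks = []
    for i in range(1, len(y) - 1):
        if y[i - 1] < y[i] >= y[i + 1] and y[i] >= min_height:
            peaks.append(i)
    return peaks

def call_bases(ch1, ch2, ratio_cut=1.0):
    """Call one of two base classes per peak from the two-channel ratio,
    mirroring the idea that channel ratios at peak centers identify the base."""
    calls = []
    for i in find_peaks([a + b for a, b in zip(ch1, ch2)]):
        calls.append('A' if ch1[i] / max(ch2[i], 1e-12) > ratio_cut else 'C')
    return calls

# Toy two-channel electropherogram with three well-separated peaks
ch1 = [0, 0, 1.0, 0, 0, 0.1, 0, 0, 0.9, 0, 0]
ch2 = [0, 0, 0.2, 0, 0, 0.8, 0, 0, 0.1, 0, 0]
print(call_bases(ch1, ch2))  # → ['A', 'C', 'A']
```

The real system distinguishes all four bases from only two channels because each base produces a characteristic, reproducible ratio; a multi-bin lookup on the ratio would replace the single threshold used here.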

  10. Neurons to algorithms LDRD final report.

    SciTech Connect (OSTI)

    Rothganger, Fredrick H.; Aimone, James Bradley; Warrender, Christina E.; Trumbo, Derek

    2013-09-01

    Over the last three years the Neurons to Algorithms (N2A) LDRD project team has built infrastructure to discover computational structures in the brain. This consists of a modeling language, a tool that enables model development and simulation in that language, and initial connections with the Neuroinformatics community, a group working toward similar goals. The approach of N2A is to express large complex systems like the brain as populations of discrete part types that have specific structural relationships with each other, along with internal and structural dynamics. Such an evolving mathematical system may be able to capture the essence of neural processing, and ultimately of thought itself. This final report is a cover for the actual products of the project: the N2A Language Specification, the N2A Application, and a journal paper summarizing our methods.

  11. Component evaluation testing and analysis algorithms.

    SciTech Connect (OSTI)

    Hart, Darren M.; Merchant, Bion John

    2011-10-01

    The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up Seismic and Infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), Seismic Sensor, and Infrasound Sensor. In order to guarantee consistency, traceability, and visibility into the results of the testing process, it is necessary to document the test and analysis procedures that are in place. Other reports document the testing procedures that are in place (Kromer, 2007). This document serves to provide a comprehensive overview of the analysis and the algorithms that are applied to the Component Evaluation testing. A brief summary of each test is included to provide the context for the analysis that is to be performed.

  12. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2010-04-01

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at the U.S. Department of Energy (DOE) Hanford site. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with requirements of 10 CFR 835, the DOE Laboratory Accreditation Program, the DOE Richland Operations Office, DOE Office of River Protection, DOE Pacific Northwest Office of Science, and Hanford's DOE contractors. The dosimetry system is operated by the Pacific Northwest National Laboratory (PNNL) Hanford External Dosimetry Program, which provides dosimetry services to PNNL and all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee, which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document.
Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2.

  13. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2011-04-04

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at the U.S. Department of Energy (DOE) Hanford site. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with requirements of 10 CFR 835, the DOE Laboratory Accreditation Program, the DOE Richland Operations Office, DOE Office of River Protection, DOE Pacific Northwest Office of Science, and Hanford's DOE contractors. The dosimetry system is operated by the Pacific Northwest National Laboratory (PNNL) Hanford External Dosimetry Program, which provides dosimetry services to PNNL and all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee, which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document.
Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2.

  14. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2007-03-12

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database. Revision numbers that are whole numbers reflect major revisions typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Revision Log: Rev. 0 (2/25/2005) Major revision and expansion. Rev. 
0.1 (3/12/2007) Minor revision. Updated Chapters 5, 6 and 9 to reflect change in default ring calibration factor used in HEDP dose calculation software. Factor changed from 1.5 to 2.0 beginning January 1, 2007. Pages on which changes were made are as follows: 5.23, 5.69, 5.78, 5.80, 5.82, 6.3, 6.5, 6.29, 9.2.

  15. Solar and Moon Position Algorithm (SAMPA) - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    National Renewable Energy Laboratory. Technology Marketing Summary: This algorithm calculates the solar and lunar zenith and azimuth angles in the period from the year -2000 to 6000, with uncertainties of +/- 0.0003 degrees for the Sun and +/- 0.003 degrees for the Moon, based on the date, time, and location on Earth. Description: The algorithm can be
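SAMPA itself is NREL's high-accuracy implementation; as a rough illustration of the kind of calculation involved, the sketch below estimates the solar zenith angle from date, time, and location using textbook low-accuracy formulas (Cooper's declination and a simple hour-angle model). This is an assumption-laden stand-in accurate only to roughly a degree, nowhere near SAMPA's +/- 0.0003 degrees, and is not the SAMPA algorithm.

```python
import math

def approx_solar_zenith(day_of_year, hour_utc, lat_deg, lon_deg):
    """Crude solar zenith angle estimate (degrees). Illustrative only."""
    # Solar declination (Cooper's formula), in degrees
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: 15 degrees per hour away from local solar noon
    solar_time = hour_utc + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    cos_zenith = (math.sin(lat) * math.sin(d)
                  + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_zenith))))
```

For example, near the March equinox at the equator the Sun should be close to overhead at solar noon (zenith near 0 degrees).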

  16. Development of engineering technology basis for industrialization of pyrometallurgical reprocessing

    SciTech Connect (OSTI)

    Koyama, Tadafumi; Hijikata, Takatoshi; Yokoo, Takeshi; Inoue, Tadashi

    2007-07-01

    Development of the engineering technology basis of pyrometallurgical reprocessing is a key issue for industrialization. To develop transport technologies for molten salt and liquid cadmium at around 500 deg. C, a salt transport test rig and a metal transport test rig were installed in an Ar glove box. The functions of a centrifugal pump and 1/2-in. declined tubing were confirmed with LiCl-KCl molten salt. The transport behavior of the molten salt was found to follow that of water. The functions of a centrifugal pump, vacuum sucking, and 1/2-in. declined tubing were confirmed with liquid Cd. Employing these transport technologies, an industrialization-applicable electro-refiner was newly designed, and an engineering-scale model was fabricated in an Ar glove box. The electro-refiner has a semi-continuous liquid Cd cathode instead of the conventional one used in small-scale tests. Using actinide-simulating elements, demonstration of industrial-scale throughput will be carried out in this electro-refiner for a more precise evaluation of the industrialization potential of pyrometallurgical reprocessing. (authors)

  17. Hanford Technical Basis for Multiple Dosimetry Effective Dose Methodology

    SciTech Connect (OSTI)

    Hill, Robin L.; Rathbone, Bruce A.

    2010-08-01

    The current method at Hanford for dealing with the results from multiple dosimeters worn during non-uniform irradiation is to use a compartmentalization method to calculate the effective dose (E). The method, as documented in the current version of Section 6.9.3 of the 'Hanford External Dosimetry Technical Basis Manual, PNL-MA-842,' is based on the compartmentalization method presented in the 1997 ANSI/HPS N13.41 standard, 'Criteria for Performing Multiple Dosimetry.' With the adoption of the ICRP 60 methodology in the 2007 revision to 10 CFR 835 came changes that have a direct effect on the compartmentalization method described in the 1997 ANSI/HPS N13.41 standard and, thus, on the method used at Hanford. The ANSI/HPS N13.41 standard committee is in the process of updating the standard, but the changes to the standard have not yet been approved, and the drafts of the revised standard tend to align more with ICRP 60 than with the changes specified in the 2007 revision to 10 CFR 835. Therefore, a revised method for calculating effective dose from non-uniform external irradiation using a compartmental method was developed using the tissue weighting factors and remainder organs specified in 10 CFR 835 (2007).
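As a minimal sketch of the compartmental idea (E computed as a tissue-weighted sum, with each tissue's dose taken from the dosimeter covering its body compartment), the following uses ICRP-60-style tissue weighting factors of the form adopted by the 2007 revision of 10 CFR 835; consult the regulation itself for the authoritative values. The compartment-to-tissue mapping here is hypothetical and purely illustrative.

```python
# ICRP-60-style tissue weighting factors (illustrative; verify against
# 10 CFR 835 for the authoritative list). They sum to 1.0.
TISSUE_WEIGHTS = {
    "gonads": 0.20, "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
    "stomach": 0.12, "bladder": 0.05, "breast": 0.05, "liver": 0.05,
    "esophagus": 0.05, "thyroid": 0.05, "skin": 0.01, "bone_surface": 0.01,
    "remainder": 0.05,
}

# Hypothetical assignment of tissues to body compartments; a real
# compartmentalization method assigns each dosimeter to a compartment.
COMPARTMENT_TISSUES = {
    "head_neck": ["thyroid", "esophagus"],
    "torso": ["gonads", "red_bone_marrow", "colon", "lung", "stomach",
              "bladder", "breast", "liver", "skin", "bone_surface",
              "remainder"],
}

def effective_dose(compartment_hp10):
    """E = sum over tissues of w_T * H_T, where H_T is taken from the
    Hp(10) reading of the dosimeter covering that tissue's compartment."""
    e = 0.0
    for compartment, hp10 in compartment_hp10.items():
        for tissue in COMPARTMENT_TISSUES[compartment]:
            e += TISSUE_WEIGHTS[tissue] * hp10
    return e
```

A useful sanity check on any such weighting scheme: under uniform irradiation (all dosimeters reading the same dose), E must equal the common reading, because the weights sum to 1.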

  18. Climate Change: The Physical Basis and Latest Results

    ScienceCinema (OSTI)

    None

    2011-10-06

    The 2007 Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) concludes: "Warming in the climate system is unequivocal." Without the contribution of Physics to climate science over many decades, such a statement would not have been possible. Experimental physics enables us to read climate archives such as polar ice cores and so provides the context for the current changes. For example, today the concentration of CO2 in the atmosphere, the second most important greenhouse gas, is 28% higher than any time during the last 800,000 years. Classical fluid mechanics and numerical mathematics are the basis of climate models from which estimates of future climate change are obtained. But major instabilities and surprises in the Earth System are still unknown. These are also to be considered when the climatic consequences of proposals for geo-engineering are estimated. Only Physics will permit us to further improve our understanding in order to provide the foundation for policy decisions facing the global climate change challenge.

  19. A practical and theoretical definition of very small field size for radiotherapy output factor measurements

    SciTech Connect (OSTI)

    Charles, P. H.; Crowe, S. B.; Langton, C. M.; Trapp, J. V.; Cranmer-Sargison, G.; Thwaites, D. I.; Kairn, T.; Knight, R. T.; Kenny, J.

    2014-04-15

    Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including the measurement of dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs and setting the acceptable uncertainty in OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom, and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 to 100 mm, using a nominal photon energy of 6 MV. Results: According to the practical definition established in this project, field sizes ≤15 mm were considered to be very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties were 0.5 mm, field sizes ≤12 mm were considered to be very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes ≤12 mm. Source occlusion also caused a large change in OPF for field sizes ≤8 mm. Based on the results of this study, field sizes ≤12 mm were considered to be theoretically very small for 6 MV beams. Conclusions: Extremely careful experimental methodology, including the measurement of dosimetric field size at the same time as the output factor measurement for each field size setting, and also very precise detector alignment, is required at field sizes ≤12 mm, and more conservatively ≤15 mm, for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.
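The practical definition above can be sketched as a simple scan: find the largest field size at which a +/- 1 mm field-size error still changes the OPF by more than 1%. The OPF curve used below is purely hypothetical (no real beam data); only the scanning logic is the point.

```python
import math

def very_small_threshold(opf, field_sizes_mm, size_err_mm=1.0, tol=0.01):
    """Largest field size (mm) at which a +/-size_err_mm field-size error
    changes the OPF by more than tol -- the 'practical definition' scan.
    opf is any callable OPF(s); no real measured data is used here."""
    threshold = None
    for s in field_sizes_mm:  # ascending field sizes
        rel_change = abs(opf(s + size_err_mm) - opf(s - size_err_mm)) / (2.0 * opf(s))
        if rel_change > tol:
            threshold = s  # remember the largest size still exceeding tol
    return threshold

# Hypothetical OPF curve that saturates with field size (illustration only)
hypothetical_opf = lambda s: 1.0 - math.exp(-s / 4.0)
```

Because OPF sensitivity to field size falls off as fields grow, the scan returns a single cutoff below which the 1 mm tolerance is violated.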

  20. Evaluation of Monte Carlo Electron-Transport Algorithms in the...

    Office of Scientific and Technical Information (OSTI)

    Evaluation of Monte Carlo Electron-Transport Algorithms in the Integrated Tiger Series Codes for Stochastic-Media Simulations.

  1. Development of an Outdoor Temperature-Based Control Algorithm...

    Office of Scientific and Technical Information (OSTI)

    Development of an Outdoor Temperature-Based Control Algorithm for Residential Mechanical Ventilation Control.

  2. Solar Position Algorithm for Solar Radiation Applications (Revised...

    Office of Scientific and Technical Information (OSTI)

    Revised January 2008. NREL/TP-560-34302. Solar Position Algorithm for Solar Radiation Applications. Ibrahim Reda and Afshin Andreas, National Renewable Energy Laboratory, 1617 Cole

  3. Algorithm for Finding Similar Shapes in Large Molecular Structures Libraries

    Energy Science and Technology Software Center (OSTI)

    1994-10-19

    The SHAPES software consists of methods and algorithms for representing and rapidly comparing molecular shapes. Molecular shapes algorithms are a class of algorithms for recognizing when two three-dimensional shapes share common features. They proceed from the notion that the shapes to be compared are regions in three-dimensional space. The algorithms allow recognition of when localized subregions from two or more different shapes could never be superimposed by any rigid-body motion. Rigid-body motions are arbitrary combinations of translations and rotations.
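One classical necessary condition behind such rigid-body screens (a generic invariant-based test, not necessarily the SHAPES method itself) is that rigid motions preserve all pairwise distances, so two point sets whose sorted distance multisets differ can never be superimposed:

```python
import itertools
import math

def pairwise_distances(points):
    """Sorted multiset of pairwise distances -- invariant under any
    rigid-body motion (rotation plus translation)."""
    return sorted(math.dist(p, q) for p, q in itertools.combinations(points, 2))

def could_be_superimposable(shape_a, shape_b, tol=1e-6):
    """Necessary (not sufficient) test: if the distance multisets differ,
    no rigid-body motion can superimpose the two point sets."""
    da, db = pairwise_distances(shape_a), pairwise_distances(shape_b)
    return len(da) == len(db) and all(abs(x - y) <= tol for x, y in zip(da, db))
```

A translated or rotated copy of a shape passes the test, while a scaled copy (which no rigid motion can produce) fails it.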

  4. NREL: Awards and Honors - Current Interrupt Charging Algorithm...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Current Interrupt Charging Algorithm for Lead-Acid Batteries Developers: Matthew A. Keyser, Ahmad A. Pesaran, and Mark M. Mihalic, National Renewable Energy Laboratory; Robert F....

  5. EnPI V4.0 Tool Algorithm

    Broader source: Energy.gov [DOE]

    This document provides background information and detail about the algorithms and calculations that drive the Energy Performance Indicator (EnPI) Tool.

  6. Extending van Leer's Algorithm to Multiple Dimensions. (Conference...

    Office of Scientific and Technical Information (OSTI)

    Title: Extending van Leer's Algorithm to Multiple Dimensions. Abstract not provided. Authors: Mosso, Stewart John; Voth, Thomas Eugene; Drake, Richard R. Publication Date: ...

  7. Theoretical solution of the minimum charge problem for gaseous detonations

    SciTech Connect (OSTI)

    Ostensen, R.W.

    1990-12-01

    A theoretical model was developed for the minimum charge to trigger a gaseous detonation in spherical geometry as a generalization of the Zeldovich model. Careful comparisons were made between the theoretical predictions and experimental data on the minimum charge to trigger detonations in propane-air mixtures. The predictions are an order of magnitude too high, and there is no apparent resolution to the discrepancy. A dynamic model, which takes into account the experimentally observed oscillations in the detonation zone, may be necessary for reliable predictions. 27 refs., 9 figs.

  8. Theoretical investigations of two Si-based spintronic materials

    Office of Scientific and Technical Information (OSTI)

    Two Si-based spintronic materials, a Mn-Si digital ferromagnetic heterostructure (a δ-layer of Mn doped in Si) with defects and a dilutely doped Mn_xSi_(1-x) alloy, are investigated using a density-functional-based approach. We model the heterostructure and alloy with a

  9. Theoretical and experimental investigation of heat pipe solar collector

    SciTech Connect (OSTI)

    Azad, E.

    2008-09-15

    A heat pipe solar collector was designed and constructed at IROST, and its performance was measured on an outdoor test facility. The thermal behavior of a gravity-assisted heat pipe solar collector was investigated theoretically and experimentally. A theoretical model based on the effectiveness-NTU method was developed for evaluating the thermal efficiency of the collector, the inlet and outlet water temperatures, and the heat pipe temperature. The optimum value of the evaporator-to-condenser length ratio is also determined. The model predictions were validated using experimental data, which show good agreement between measured and predicted results. (author)
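For the constant-temperature limit of the effectiveness-NTU method mentioned above (a near-isothermal condenser surface, so C_min/C_max -> 0 and eps = 1 - exp(-NTU)), a minimal sketch with hypothetical parameter names, not the authors' model:

```python
import math

def outlet_temperature(t_in, t_source, ua, m_dot, cp):
    """Effectiveness-NTU sketch for water heated by a near-isothermal
    condenser section: NTU = UA / (m_dot * cp), eps = 1 - exp(-NTU),
    T_out = T_in + eps * (T_source - T_in). Names are illustrative."""
    ntu = ua / (m_dot * cp)
    eff = 1.0 - math.exp(-ntu)
    return t_in + eff * (t_source - t_in)
```

The outlet temperature always lies between the inlet and the source temperature, and approaches the source temperature as UA grows.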

  10. Can we derive Tully's surface-hopping algorithm from the semiclassical quantum Liouville equation? Almost, but only with decoherence

    SciTech Connect (OSTI)

    Subotnik, Joseph E. Ouyang, Wenjun; Landry, Brian R.

    2013-12-07

    In this article, we demonstrate that Tully's fewest-switches surface hopping (FSSH) algorithm approximately obeys the mixed quantum-classical Liouville equation (QCLE), provided that several conditions, some major and some minor, are satisfied. The major conditions are: (1) nuclei must be moving quickly with large momenta; (2) there cannot be explicit recoherences or interference effects between nuclear wave packets; (3) force-based decoherence must be added to the FSSH algorithm, and the trajectories can no longer rigorously be independent (though approximations for independent trajectories are possible). We furthermore expect that FSSH (with decoherence) will be most robust when nonadiabatic transitions in an adiabatic basis are dictated primarily by derivative couplings that are presumably localized to crossing regions, rather than by small but pervasive off-diagonal force matrix elements. In the end, our results emphasize the strengths of and possibilities for the FSSH algorithm when decoherence is included, while also demonstrating the limitations of the FSSH algorithm and its inherent inability to follow the QCLE exactly.

  11. Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4...

    Office of Scientific and Technical Information (OSTI)

    Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4. Authors: Zeng, Zhixiong; Wang, Wei; Yang, ...

  12. Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2...

    Office of Scientific and Technical Information (OSTI)

    Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex.

  13. Structural basis for Notch1 engagement of Delta-like 4 (Journal...

    Office of Scientific and Technical Information (OSTI)

    Structural basis for Notch1 engagement of Delta-like 4. Authors: Luca, Vincent C.; ...

  14. Safety evaluation of MHTGR licensing basis accident scenarios

    SciTech Connect (OSTI)

    Kroeger, P.G.

    1989-04-01

    The safety potential of the Modular High-Temperature Gas Reactor (MHTGR) was evaluated, based on the Preliminary Safety Information Document (PSID) as submitted by the US Department of Energy to the US Nuclear Regulatory Commission. The relevant reactor safety codes were extended for this purpose and applied to this new reactor concept, searching primarily for potential accident scenarios that might lead to fuel failures due to excessive core temperatures and/or to vessel damage due to excessive vessel temperatures. The design basis accident scenario leading to the highest vessel temperatures is the depressurized core heatup scenario without any forced cooling and with decay heat rejection to the passive Reactor Cavity Cooling System (RCCS). This scenario was evaluated, including numerous parametric variations of input parameters like material properties and decay heat. It was found that significant safety margins exist, but that high confidence levels in the core effective thermal conductivity, the reactor vessel and RCCS thermal emissivities, and the decay heat function are required to maintain this safety margin. Severe accident extensions of this depressurized core heatup scenario included cases of complete RCCS failure, cases of massive air ingress, core heatup without scram, and cases of degraded RCCS performance due to absorbing gases in the reactor cavity. Except for no-scram scenarios extending beyond 100 hr, the fuel never reached the limiting temperature of 1600 deg. C, below which measurable fuel failures are not expected. In some of the scenarios, excessive vessel and concrete temperatures could lead to investment losses but are not expected to lead to any source term beyond that from the circulating inventory. 19 refs., 56 figs., 11 tabs.

  15. Theoretical investigations of defects in a Si-based digital ferromagne...

    Office of Scientific and Technical Information (OSTI)

    Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material.

  16. Particle Communication and Domain Neighbor Coupling: Scalable Domain Decomposed Algorithms for Monte Carlo Particle Transport

    SciTech Connect (OSTI)

    O'Brien, M J; Brantley, P S

    2015-01-20

    In order to run Monte Carlo particle transport calculations on new supercomputers with hundreds of thousands or millions of processors, care must be taken to implement scalable algorithms. This means that the algorithms must continue to perform well as the processor count increases. In this paper, we examine the scalability of: (1) globally resolving the particle locations on the correct processor, (2) deciding that particle streaming communication has finished, and (3) efficiently coupling neighbor domains together with different replication levels. We have run domain decomposed Monte Carlo particle transport on up to 2^21 = 2,097,152 MPI processes on the IBM BG/Q Sequoia supercomputer and observed scalable results that agree with our theoretical predictions. These calculations were carefully constructed to have the same amount of work on every processor, i.e. the calculation is already load balanced. We also examine load imbalanced calculations where each domain's replication level is proportional to its particle workload. In this case we show how to efficiently couple together adjacent domains to maintain within-workgroup load balance and minimize memory usage.
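Point (2), deciding that particle streaming has finished, can be caricatured serially: tracking terminates when the global count of absorbed particles equals the count of particles created, no matter how many are still streaming between domains. This toy replaces MPI messages with in-memory queues and an invented 30% streaming probability; it is a sketch of the termination idea, not the authors' code.

```python
import random
from collections import deque

def run_domains(num_domains=4, particles_per_domain=100, seed=1):
    """Toy serial emulation of domain-decomposed particle streaming."""
    rng = random.Random(seed)
    queues = [deque(range(particles_per_domain)) for _ in range(num_domains)]
    created = num_domains * particles_per_domain
    completed = 0
    while completed < created:           # global termination test
        for d in range(num_domains):
            if not queues[d]:
                continue
            queues[d].popleft()          # track one particle in domain d
            if rng.random() < 0.3:       # it streams to the neighbor domain
                queues[(d + 1) % num_domains].append(0)
            else:                        # it is absorbed in this domain
                completed += 1
    return completed, created
```

In a real distributed run the `completed == created` test itself requires communication (e.g. a reduction or a termination-detection protocol), which is precisely the scalability concern the paper examines.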

  17. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect (OSTI)

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for high-efficiency particulate air (HEPA) filtration for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) to support limits defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical basis and provides clarifications, as necessary, to technical basis limit revisions or justifications. This document provides an updated technical basis for tank farm ventilation systems related to Operating Specification Documents (OSDs) for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  18. Establishing the Technical Basis for Disposal of Heat-generating Waste in Salt

    Office of Environmental Management (EM)

    The report summarizes available historic tests and the developed technical basis for disposal of heat-generating waste in salt, and the means by which a safety case for disposal of heat-generating waste at a generic salt site can be initiated from the existing technical basis. Though the basis for a salt

  19. A brief comparison between grid based real space algorithms and spectrum algorithms for electronic structure calculations

    SciTech Connect (OSTI)

    Wang, Lin-Wang

    2006-12-01

    Quantum mechanical ab initio calculation constitutes the biggest portion of the computer time in materials science and chemical science simulations. For a computer center like NERSC to better serve these communities, it is very useful to have a prediction of the future trends of ab initio calculations in these areas. Such a prediction can help us decide what future computer architecture will be most useful for these communities and what should be emphasized in future supercomputer procurements. As the size of the computer and the size of the simulated physical systems increase, there is a renewed interest in using the real-space grid method in electronic structure calculations. This is fueled by two factors. First, it is generally assumed that the real-space grid method is more suitable for parallel computation because of its limited communication requirement, compared with spectral methods, where a global FFT is required. Second, as the size N of the calculated system increases together with the computer power, O(N) scaling approaches become more favorable than the traditional direct O(N^3) scaling methods. These O(N) methods are usually based on localized orbitals in real space, which can be described more naturally by a real-space basis. In this report, the author compares the real-space methods with the traditional plane-wave (PW) spectral methods, covering their technical pros and cons and possible future trends. For the real-space methods, the author focuses on the regular-grid finite-difference (FD) method and the finite-element (FE) method. These are the methods used mostly in materials science simulation. As for chemical science, the predominant method is still the Gaussian basis method, and sometimes the atomic-orbital basis method. These two basis sets are localized in real space, and there is no indication that their roles in quantum chemical simulation will change anytime soon. The author focuses on density functional theory (DFT), which is the most used method for quantum mechanical materials science simulation.
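The regular-grid FD method mentioned above rests on stencils like the 3-point second-difference, whose O(h^2) truncation error is easy to verify numerically; a minimal sketch:

```python
import math

def fd_second_derivative(f, x, h):
    """3-point central difference for f''(x):
    (f(x-h) - 2 f(x) + f(x+h)) / h^2, the regular-grid FD stencil.
    Truncation error is O(h^2)."""
    return (f(x - h) - 2.0 * f(x) + f(x + h)) / (h * h)

def err(h):
    """Error against the exact second derivative of sin, which is -sin."""
    return abs(fd_second_derivative(math.sin, 1.0, h) - (-math.sin(1.0)))
```

Because the error is second order, halving the grid spacing h should reduce the error by roughly a factor of four.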

  20. Theoretical evaluation of the optimal performance of a thermoacoustic refrigerator

    SciTech Connect (OSTI)

    Minner, B.L.; Braun, J.E.; Mongeau, L.G.

    1997-12-31

    Theoretical models were integrated with a design optimization tool to allow estimates of the maximum coefficient of performance for thermoacoustic cooling systems. The system model was validated using experimental results for a well-documented prototype. The optimization tool was then applied to this prototype to demonstrate the benefits of systematic optimization. A twofold increase in performance was predicted through the variation of component dimensions alone, while a threefold improvement was estimated when the working fluid parameters were also considered. Devices with a similar configuration were optimized for operating requirements representative of a home refrigerator. The results indicate that the coefficients of performance are comparable to those of existing vapor-compression equipment for this application. In addition to the choice of working fluid, the heat exchanger configuration was found to be a critical design factor affecting performance. Further experimental work is needed to confirm the theoretical predictions presented in this paper.

  1. Theoretical model for plasma expansion generated by hypervelocity impact

    SciTech Connect (OSTI)

    Ju, Yuanyuan; Zhang, Qingming; Zhang, Dongjiang; Long, Renrong; Chen, Li; Huang, Fenglei; Gong, Zizheng

    2014-09-15

    Hypervelocity impact experiments of spherical LY12 aluminum projectiles (6.4 mm in diameter) on LY12 aluminum targets (23 mm thick) have been conducted using a two-stage light gas gun, at impact velocities of 5.2, 5.7, and 6.3 km/s. The experimental results show that the plasma phase transition appears under the current experimental conditions, and that the plasma expansion consists of accumulation, equilibrium, and attenuation stages. The plasma characteristic parameters decrease as the plasma expands outward and are proportional to the third power of the impact velocity, i.e., (T_e, n_e) ∝ v_p^3. Based on the experimental results, a theoretical model of the plasma expansion is developed, and the theoretical results are consistent with the experimental data.
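The reported cubic scaling law can be turned into a one-line predictor for scaling a measured T_e or n_e from one impact velocity to another (illustrative only; the constant of proportionality cancels in the ratio):

```python
def cubic_scaling(value_ref, v_ref, v):
    """Scale a plasma parameter (T_e or n_e) between impact velocities
    using the reported law: value proportional to v_p**3."""
    return value_ref * (v / v_ref) ** 3
```

For example, going from 5.2 to 6.3 km/s multiplies the parameter by (6.3/5.2)^3, about 1.78.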

  2. Los Alamos National Laboratory theoretical biologists Bette Korber, Will Fischer, Sydeaka

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Mosaic vaccines show promise in reducing the spread of a deadly virus. LOS ALAMOS, New Mexico, March 3, 2010: Two teams of researchers, including Los Alamos National Laboratory theoretical biologists Bette Korber, Will Fischer, Sydeaka Watson, and James Szinger, have announced an HIV vaccination strategy that has been shown to expand the breadth and depth of immune responses in rhesus monkeys. Rhesus monkeys provide the best animal model currently

  3. COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum Dynamics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Princeton Plasma Physics Lab, March 25, 2015, 4:15 pm to 5:30 pm, MBG Auditorium. Professor Herschel Rabitz, Princeton University. Abstract (PDF: COLL.03.25.15.pdf): Controlling quantum dynamics phenomena spans a wide range of applications and potential technologies. Although some experiments are far more demanding than others, the experiments are collectively proving to be remarkably successful considering

  4. Materials for electrochemical capacitors: Theoretical and experimental constraints

    SciTech Connect (OSTI)

    Sarangapani, S.; Tilak, B.V.; Chen, C.P.

    1996-11-01

    Electrochemical capacitors, also called supercapacitors, are unique devices exhibiting 20 to 200 times greater capacitance than conventional capacitors. The large capacitance exhibited by these systems has been demonstrated to arise from a combination of the double-layer capacitance and the pseudocapacitance associated with surface redox-type reactions. The purpose of this review is to survey the published data on available electrode materials possessing high specific double-layer capacitance or pseudocapacitance and to examine their reported performance in relation to theoretical expectations.

  5. Genetic Algorithm Based Neural Networks for Nonlinear Optimization

    Energy Science and Technology Software Center (OSTI)

    1994-09-28

    This software develops a novel approach to nonlinear optimization using genetic algorithm based neural networks. To the best of our knowledge, this approach represents the first attempt at applying both neural network and genetic algorithm techniques to solve a nonlinear optimization problem. The approach constructs a neural network structure and an appropriately shaped energy surface whose minima correspond to optimal solutions of the problem. A genetic algorithm is employed to perform a parallel and powerful search of the energy surface.
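A toy version of the GA-over-energy-surface idea (truncation selection, uniform crossover, Gaussian mutation; all of these operator choices are hypothetical illustrations, not the archived code) might look like:

```python
import random

def genetic_minimize(energy, dim=2, pop_size=30, gens=60, mutation=0.1, seed=0):
    """Minimal real-valued genetic algorithm searching an energy surface."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=energy)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [a[i] if rng.random() < 0.5 else b[i]  # uniform crossover
                     for i in range(dim)]
            children.append([g + rng.gauss(0.0, mutation) for g in child])
        pop = parents + children                 # elitist: parents survive
    return min(pop, key=energy)

# Example energy surface: paraboloid with its minimum at (1, -2)
best = genetic_minimize(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2)
```

Keeping the unmutated parents each generation makes the search elitist, so the best solution found never degrades.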

  6. CRITICALITY SAFETY CONTROLS AND THE SAFETY BASIS AT PFP

    SciTech Connect (OSTI)

    Kessler, S

    2009-04-21

    With the implementation of DOE Order 420.1B, Facility Safety, and DOE-STD-3007-2007, 'Guidelines for Preparing Criticality Safety Evaluations at Department of Energy Non-Reactor Nuclear Facilities', a new requirement was imposed that all criticality safety controls be evaluated for inclusion in the facility Documented Safety Analysis (DSA) and that the evaluation process be documented in the site Criticality Safety Program Description Document (CSPDD). At the Hanford site in Washington State the CSPDD, HNF-31695, 'General Description of the FH Criticality Safety Program', requires each facility develop a linking document called a Criticality Control Review (CCR) to document performance of these evaluations. Chapter 5, Appendix 5B of HNF-7098, Criticality Safety Program, provided an example of a format for a CCR that could be used in lieu of each facility developing its own CCR. Since the Plutonium Finishing Plant (PFP) is presently undergoing Deactivation and Decommissioning (D&D), new procedures are being developed for cleanout of equipment and systems that have not been operated in years. Existing Criticality Safety Evaluations (CSE) are revised, or new ones written, to develop the controls required to support D&D activities. Other Hanford facilities, including PFP, had difficulty using the basic CCR out of HNF-7098 when first implemented. Interpretation of the new guidelines indicated that many of the controls needed to be elevated to TSR level controls. Criterion 2 of the standard, requiring that the consequence of a criticality be examined for establishing the classification of a control, was not addressed. Upon in-depth review by PFP Criticality Safety staff, it was not clear that the programmatic interpretation of criterion 8C could be applied at PFP. Therefore, the PFP Criticality Safety staff decided to write their own CCR. 
    The PFP CCR provides additional guidance for the evaluation team to use by clarifying the evaluation criteria in DOE-STD-3007-2007. In reviewing documents used in classifying controls for Nuclear Safety, it was noted that DOE-HDBK-1188, 'Glossary of Environment, Health, and Safety Terms', defines an Administrative Control (AC) in terms different from those typically used in Criticality Safety. As part of this CCR, a new term, Criticality Administrative Control (CAC), was defined to clarify the difference between an AC used for criticality safety and an AC used for nuclear safety. In Nuclear Safety terms, an AC is a provision relating to organization and management, procedures, recordkeeping, assessment, and reporting necessary to ensure safe operation of a facility. A CAC was defined as an administrative control derived in a criticality safety analysis that is implemented to ensure double contingency. According to criterion 2 of Section IV, 'Linkage to the Documented Safety Analysis', of DOE-STD-3007-2007, the consequence of a criticality should be examined for the purposes of classifying the significance of a control or component. HNF-PRO-700, 'Safety Basis Development', provides control selection criteria based on consequence and risk that may be used in the development of a Criticality Safety Evaluation (CSE) to establish the classification of a component as a design feature, as safety class or safety significant, i.e., an Engineered Safety Feature (ESF), as equipment important to safety, or as merely providing defense-in-depth. Similar logic is applied to the CACs. Criterion 8C of DOE-STD-3007-2007, as written, added to the confusion of using the basic CCR from HNF-7098.
The PFP CCR attempts to clarify this criterion by revising it to say 'Programmatic commitments or general references to control philosophy (e.g., mass control or spacing control or concentration control as an overall control strategy for the process without specific quantification of individual limits) is included in the PFP DSA'. Table 1 shows the PFP methodology for evaluating CACs. This evaluation process has been in use since February of 2008 and has proven to be simple and effective. Each control identified i

  7. A divide-conquer-recombine algorithmic paradigm for large spatiotemporal quantum molecular dynamics simulations

    SciTech Connect (OSTI)

    Shimojo, Fuyuki; Hattori, Shinnosuke; Kalia, Rajiv K.; Mou, Weiwei; Nakano, Aiichiro; Nomura, Ken-ichi; Rajak, Pankaj; Vashishta, Priya; Kunaseth, Manaschai; Ohmura, Satoshi; Shimamura, Kohei (Collaboratory for Advanced Computing and Simulations, Department of Physics and Astronomy, Department of Computer Science, and Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, California 90089-0242, United States; with additional affiliations at Kumamoto University, Kyoto University, Kyushu University, and the National Nanotechnology Center, Pathumthani, Thailand)

    2014-05-14

    We introduce an extension of the divide-and-conquer (DC) algorithmic paradigm called divide-conquer-recombine (DCR) to perform large quantum molecular dynamics (QMD) simulations on massively parallel supercomputers, in which interatomic forces are computed quantum mechanically in the framework of density functional theory (DFT). In DCR, the DC phase constructs globally informed, overlapping local-domain solutions, which in the recombine phase are synthesized into a global solution encompassing large spatiotemporal scales. For the DC phase, we design a lean divide-and-conquer (LDC) DFT algorithm, which significantly reduces the prefactor of the O(N) computational cost for N electrons by applying a density-adaptive boundary condition at the peripheries of the DC domains. Our globally scalable and locally efficient solver is based on a hybrid real-reciprocal space approach that combines: (1) a highly scalable real-space multigrid to represent the global charge density; and (2) a numerically efficient plane-wave basis for local electronic wave functions and charge density within each domain. Hybrid space-band decomposition is used to implement the LDC-DFT algorithm on parallel computers. A benchmark test on an IBM Blue Gene/Q computer exhibits an isogranular parallel efficiency of 0.984 on 786,432 cores for a 50.3×10{sup 6}-atom SiC system. As a test of production runs, an LDC-DFT-based QMD simulation involving 16,661 atoms is performed on the Blue Gene/Q to study on-demand production of hydrogen gas from water using LiAl alloy particles. As an example of the recombine phase, LDC-DFT electronic structures are used as a basis set to describe global photoexcitation dynamics with nonadiabatic QMD (NAQMD) and kinetic Monte Carlo (KMC) methods. The NAQMD simulations are based on linear-response time-dependent density functional theory to describe electronic excited states and a surface-hopping approach to describe transitions between the excited states. A series of techniques is employed to efficiently calculate the long-range exact exchange correction and excited-state forces. The NAQMD trajectories are analyzed to extract the rates of various excitonic processes, which are then used in KMC simulations to study the dynamics of the global exciton flow network. This has allowed the study of large-scale photoexcitation dynamics in a 6400-atom amorphous molecular solid, reaching experimental time scales.
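    The recombine phase described above feeds NAQMD-derived rates into kinetic Monte Carlo. As an illustration only (the three-site network and the hopping rates below are invented, not taken from the paper), a minimal Gillespie-type KMC walk over an exciton hopping network could be sketched as:

```python
import math
import random

def kmc_hop_trajectory(rates, start, n_steps, seed=0):
    """Kinetic Monte Carlo (Gillespie) walk on an exciton hopping network.

    rates[i] is a dict {j: k_ij} of hopping rates out of site i.
    Returns the visited sites and the stochastic elapsed time.
    """
    rng = random.Random(seed)
    site, t = start, 0.0
    path = [start]
    for _ in range(n_steps):
        channels = list(rates[site].items())
        total = sum(k for _, k in channels)
        # Advance the clock by an exponentially distributed waiting time.
        t += -math.log(1.0 - rng.random()) / total
        # Pick the next hop with probability proportional to its rate.
        r = rng.random() * total
        acc = 0.0
        for j, k in channels:
            acc += k
            if r <= acc:
                site = j
                break
        path.append(site)
    return path, t

# Hypothetical 3-site network with asymmetric hopping rates (arbitrary units).
rates = {0: {1: 2.0}, 1: {0: 0.5, 2: 1.5}, 2: {1: 0.1}}
path, elapsed = kmc_hop_trajectory(rates, start=0, n_steps=1000)
```

    In a production setting, the per-process rates would come from the NAQMD trajectory analysis rather than being fixed by hand.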

  8. Generalizing the self-healing diffusion Monte Carlo approach to finite temperature: a path for the optimization of low-energy many-body basis expansions

    SciTech Connect (OSTI)

    Kim, Jeongnim; Reboredo, Fernando A

    2014-01-01

    The self-healing diffusion Monte Carlo method for complex functions [F. A. Reboredo, J. Chem. Phys. 136, 204101 (2012)] and some ideas of the correlation function Monte Carlo approach [D. M. Ceperley and B. Bernu, J. Chem. Phys. 89, 6316 (1988)] are blended to obtain a method for the calculation of thermodynamic properties of many-body systems at low temperatures. In order to allow the evolution in imaginary time to describe the density matrix, we remove the fixed-node restriction using complex antisymmetric trial wave functions. A statistical method is derived for the calculation of finite-temperature properties of many-body systems near the ground state. In the process we also obtain a parallel algorithm that optimizes the many-body basis of a small subspace of the many-body Hilbert space. This small subspace is optimized to have maximum overlap with the subspace spanned by the lower-energy eigenstates of a many-body Hamiltonian. We show in a model system that the Helmholtz free energy is minimized within this subspace as the iteration number increases. We show that the subspace spanned by the small basis systematically converges towards the subspace spanned by the lowest-energy eigenstates. Possible applications of this method to calculate the thermodynamic properties of many-body systems near the ground state are discussed. The resulting basis can also be used to accelerate the calculation of the ground or excited states with quantum Monte Carlo.

  9. Theoretical hot methane line lists up to T = 2000 K for astrophysical applications

    SciTech Connect (OSTI)

    Rey, M.; Tyuterev, Vl. G.; Nikitin, A. V.

    2014-07-01

    The paper describes the construction of complete sets of hot methane lines based on accurate ab initio potential and dipole moment surfaces and extensive first-principles calculations. Four line lists spanning the [0-5000] cm{sup -1} infrared region were built at T = 500, 1000, 1500, and 2000 K. For each of these four temperatures, we have constructed two versions of line lists: a version for high-resolution applications containing strong and medium lines, and a full version appropriate for low-resolution opacity calculations. A comparison with available empirical databases is discussed in detail for both cold and hot bands, showing very good agreement for line positions, typically <0.1-0.5 cm{sup -1}, and ~5% for intensities of strong lines. Together with numerical tests using various basis sets, this confirms the computational convergence of our results for the most important lines, which is the major issue for theoretical spectra predictions. We showed that transitions with lower-state energies up to 14,000 cm{sup -1} could give significant contributions to the methane opacity and have to be systematically taken into account. Our list at 2000 K calculated up to J = 50 contains 11.5 billion transitions for I > 10{sup -29} cm mol{sup -1}. These new lists are expected to be quantitatively accurate with respect to the precision of available and currently planned observations of astrophysical objects with improved spectral resolution.

  10. Sandia Energy - Genetic Algorithm for Innovative Device Designs...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Genetic Algorithm for Innovative Device Designs in High-Efficiency III-V Nitride Light-Emitting Diodes Home Energy Solid-State Lighting News Energy Efficiency News & Events Genetic...

  11. Gacs quantum algorithmic entropy in infinite dimensional Hilbert spaces

    SciTech Connect (OSTI)

    Benatti, Fabio, E-mail: benatti@ts.infn.it [Department of Physics, University of Trieste, Strada Costiera 11, I-34151 Trieste (Italy); Oskouei, Samad Khabbazi, E-mail: kh.oskuei@ut.ac.ir; Deh Abad, Ahmad Shafiei, E-mail: shafiei@khayam.ut.ac.ir [Department of Mathematics, School of Mathematics, Statistics and Computer Science, College of Science, University of Tehran, Tehran (Iran, Islamic Republic of)

    2014-08-15

    We extend the notion of Gacs quantum algorithmic entropy, originally formulated for finitely many qubits, to infinite dimensional quantum spin chains and investigate the relation of this extension with two quantum dynamical entropies that have been proposed in recent years.

  12. EnPI V4.0 Algorithm

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Advanced Manufacturing Office EnPI V4.0 Tool Algorithm, updated September 11th, 2014. Contents: Definition of Symbols; Facility Level Calculations; Calculation Methods when Actual Values are used

  13. Enterprise Assessments Targeted Review of the Safety Basis at the Savannah

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    River Site F-Area Central Laboratory Facility - January 2016 | Department of Energy Targeted Review of the Safety Basis at the Savannah River Site F-Area Central Laboratory Facility - January 2016 Enterprise Assessments Targeted Review of the Safety Basis at the Savannah River Site F-Area Central Laboratory Facility - January 2016 January 2016 Review of the Safety Basis F-Area Central Laboratory Facility at the Savannah River Site The Office of Nuclear Safety and Environmental Assessments,

  14. New Algorithm Enables Faster Simulations of Ultrafast Processes

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Algorithm Enables Faster Simulations of Ultrafast Processes New Algorithm Enables Faster Simulations of Ultrafast Processes Opens the Door for Real-Time Simulations in Atomic-Level Materials Research February 20, 2015 Contact: Rachel Berkowitz, 510-486-7254, rberkowitz@lbl.gov Model of ion (Cl) collision with atomically thin semiconductor (MoSe2). Collision region is shown in blue and zoomed in; red points show initial positions of Cl. The simulation calculates the

  15. Generation of Simulated Wind Data using an Intelligent Algorithm

    Office of Scientific and Technical Information (OSTI)

    (Conference) | SciTech Connect Conference: Generation of Simulated Wind Data using an Intelligent Algorithm Citation Details In-Document Search Title: Generation of Simulated Wind Data using an Intelligent Algorithm Authors: Weissbach, R. ; Wang, W. L. ; Hodge, B. M. ; Tang, M. H. ; Sonnenmeier, J. Publication Date: 2014-01-01 OSTI Identifier: 1176733 DOE Contract Number: AC36-08GO28308 Resource Type: Conference Resource Relation: Conference: Proceedings of the 2014 North American Power

  16. The differential algebra based multiple level fast multipole algorithm for

    Office of Scientific and Technical Information (OSTI)

    3D space charge field calculation and photoemission simulation (Journal Article) | SciTech Connect The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation Citation Details In-Document Search This content will become publicly available on September 28, 2016 Title: The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation Coulomb

  17. New Design Methods and Algorithms for Multi-component Distillation

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Processes | Department of Energy Design Methods and Algorithms for Multi-component Distillation Processes New Design Methods and Algorithms for Multi-component Distillation Processes PDF icon multicomponent.pdf More Documents & Publications CX-100137 Categorical Exclusion Determination ITP Chemicals: Hybripd Separations/Distillation Technology. Research Opportunities for Energy and Emissions Reduction ITP Energy Intensive Processes: Energy-Intensive Processes Portfolio: Addressing Key

  18. MEMORANDUM OF UNDERSTANDING Between The Numerical Algorithms Group Ltd

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Between The Numerical Algorithms Group Ltd and The University of California, as Management and Operating Contractor for Lawrence Berkeley National Laboratory on a Visitor Exchange Program This Memorandum of Understanding (MOU) is by and between the Numerical Algorithms Group Ltd (NAG) with a registered address at: Wilkinson House, Jordan hill Road, Oxford, UK and the University of California, as Management and Operating Contractor for Lawrence Berkeley National Laboratory, including its

  19. Levenberg--Marquardt algorithm: implementation and theory (Conference) |

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    SciTech Connect Conference: Levenberg--Marquardt algorithm: implementation and theory Citation Details In-Document Search Title: Levenberg--Marquardt algorithm: implementation and theory A paper

  20. The differential algebra based multiple level fast multipole algorithm for

    Office of Scientific and Technical Information (OSTI)

    3D space charge field calculation and photoemission simulation (Journal Article) | DOE PAGES The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation This content will become publicly available on September 28, 2016 Title: The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation Coulomb interaction between charged particles inside a

  1. Visualizing and improving the robustness of phase retrieval algorithms

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Tripathi, Ashish; Leyffer, Sven; Munson, Todd; Wild, Stefan M.

    2015-06-01

    Coherent x-ray diffractive imaging is a novel imaging technique that utilizes phase retrieval and nonlinear optimization methods to image matter at nanometer scales. We explore the convergence properties of a popular phase retrieval algorithm, Fienup's HIO, by introducing a reduced-dimensionality problem that allows us to visualize and quantify convergence to local minima and to the globally optimal solution. We then introduce generalizations of HIO that improve upon the original algorithm's ability to converge to the globally optimal solution.
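    For context, Fienup's original HIO alternates a Fourier-magnitude projection with a feedback update outside the object support. The following is a minimal sketch of that classic iteration, not the authors' reduced-dimensionality analysis or their generalizations; the array sizes, support mask, and test object are made up:

```python
import numpy as np

def hio(magnitudes, support, n_iter=200, beta=0.9, seed=0):
    """Minimal sketch of Fienup's hybrid input-output (HIO) phase retrieval.

    magnitudes : measured Fourier-magnitude array |F|
    support    : boolean array, True where the object may be nonzero
    """
    rng = np.random.default_rng(seed)
    x = rng.random(magnitudes.shape) * support  # random start inside support
    for _ in range(n_iter):
        # Fourier-domain projection: keep phases, impose measured magnitudes.
        F = np.fft.fftn(x)
        F = magnitudes * np.exp(1j * np.angle(F))
        y = np.real(np.fft.ifftn(F))
        # Object-domain HIO update: accept y inside the support, and apply
        # negative feedback (parameter beta) to suppress values outside it.
        x = np.where(support, y, x - beta * y)
    return x

# Toy run: attempt recovery of a small object from its own Fourier magnitudes.
true = np.zeros((32, 32)); true[12:18, 10:20] = 1.0
mags = np.abs(np.fft.fftn(true))
supp = np.zeros((32, 32), dtype=bool); supp[10:20, 8:22] = True
rec = hio(mags, supp)
```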

  2. NSS 18.3 Verification of Authorization Basis Documentation 12/8/03

    Broader source: Energy.gov [DOE]

    The objective of this surveillance is for the Facility Representative to verify that the facility's configuration and operations remain consistent with the authorization basis. As defined in DOE...

  3. CRAD, Review of Safety Basis Development - May 6, 2013 | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Review of Safety Basis Development - May 6, 2013 CRAD, Review of Safety Basis Development - May 6, 2013 May 6, 2013 Review of Safety Basis Development for the Los Alamos National Laboratory Transuranic Waste Facility (HSS CRAD 45-59, Rev. 0) The review will consider selected aspects of the development of safety basis for the Transuranic Waste Facility (TWF) to assess the extent to which safety is integrated into the design of the TWF in accordance with DOE directives; in particular,

  4. Report to the Secretary of Energy on Beyond Design Basis Event...

    Broader source: Energy.gov (indexed) [DOE]

    BDBEReportfinal.pdf More Documents & Publications Report to the Secretary of Energy on Beyond Design Basis Event Pilot Evaluations, Results and Recommendations for Improvements...

  5. Technical Basis and Considerations for DOE M 435.1-1 (Appendix A)

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1999-07-09

    This appendix establishes the technical basis of the order revision process and of each of the requirements included in the revised radioactive waste management order.

  6. 5th International REAC/TS Symposium: The Medical Basis for Radiation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    5th International REAC/TS Symposium: The Medical Basis for Radiation Accident Preparedness Home...

  7. Impacts of Time Delays on Distributed Algorithms for Economic Dispatch

    SciTech Connect (OSTI)

    Yang, Tao; Wu, Di; Sun, Yannan; Lian, Jianming

    2015-07-26

    The economic dispatch problem (EDP) is an important problem in power systems. It can be formulated as an optimization problem with the objective of minimizing the total generation cost subject to the power balance constraint and generator capacity limits. Recently, several consensus-based algorithms have been proposed to solve the EDP in a distributed manner. However, the impacts of communication time delays on these distributed algorithms are not fully understood, especially for the case where the communication network is directed, i.e., the information exchange is unidirectional. This paper investigates communication time delay effects on a distributed algorithm for directed communication networks. The algorithm has been tested by applying time delays to different types of information exchange. Several case studies are carried out to evaluate the effectiveness and performance of the algorithm in the presence of time delays in communication networks. It is found that time delays negatively affect the convergence rate, and can even cause the algorithm to converge to an incorrect value or fail to converge at all.
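    The consensus-based algorithms studied here drive all generators toward a common incremental cost. For a convex quadratic cost model, that equal-incremental-cost optimum can be computed centrally by bisection on λ, which gives a reference point for what a correctly converging distributed algorithm should reach. This sketch is not the paper's distributed algorithm, and the cost coefficients below are invented:

```python
def economic_dispatch(a, b, pmin, pmax, demand, tol=1e-9):
    """Solve min sum_i a_i p_i^2 + b_i p_i subject to sum_i p_i = demand and
    pmin_i <= p_i <= pmax_i, by bisection on the common incremental cost lam
    (equal-incremental-cost criterion). Assumes a feasible demand."""
    def power_at(lam):
        # Each unit's cost-minimizing output at incremental cost lam,
        # clipped to its capacity limits.
        return [min(max((lam - bi) / (2.0 * ai), lo_i), hi_i)
                for ai, bi, lo_i, hi_i in zip(a, b, pmin, pmax)]
    lam_lo = min(b)  # below every unit's marginal cost at zero output
    lam_hi = max(2.0 * ai * hi_i + bi for ai, bi, hi_i in zip(a, b, pmax))
    while lam_hi - lam_lo > tol:
        mid = 0.5 * (lam_lo + lam_hi)
        if sum(power_at(mid)) < demand:
            lam_lo = mid
        else:
            lam_hi = mid
    return power_at(0.5 * (lam_lo + lam_hi))

# Two hypothetical generators with cost_i(p) = a_i p^2 + b_i p, 0-100 MW each.
p = economic_dispatch(a=[0.1, 0.125], b=[1.0, 1.5],
                      pmin=[0.0, 0.0], pmax=[100.0, 100.0], demand=100.0)
```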

  8. Incremental k-core decomposition: Algorithms and evaluation

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Sariyuce, Ahmet Erdem; Gedik, Bugra; Jacques-SIlva, Gabriela; Wu, Kun -Lung; Catalyurek, Umit V.

    2016-02-15

    A k-core of a graph is a maximal connected subgraph in which every vertex is connected to at least k vertices in the subgraph. k-core decomposition is often used in large-scale network analysis, such as community detection, protein function prediction, visualization, and solving NP-hard problems on real networks efficiently, like maximal clique finding. In many real-world applications, networks change over time. As a result, it is essential to develop efficient incremental algorithms for dynamic graph data. In this paper, we propose a suite of incremental k-core decomposition algorithms for dynamic graph data. These algorithms locate a small subgraph that is guaranteed to contain the list of vertices whose maximum k-core values have changed and efficiently process this subgraph to update the k-core decomposition. We present incremental algorithms for both insertion and deletion operations, and propose auxiliary vertex state maintenance techniques that can further accelerate these operations. Our results show a significant reduction in runtime compared to non-incremental alternatives. We illustrate the efficiency of our algorithms on different types of real and synthetic graphs, at varying scales. Furthermore, for a graph of 16 million vertices, we observe relative throughputs reaching a million times that of the non-incremental algorithms.
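    For reference, the quantity these incremental algorithms maintain is the core number produced by the classic batch peeling decomposition. A minimal non-incremental sketch follows; this is the baseline the paper improves upon, not the paper's incremental method, and production code would use a bucket queue instead of the O(n) minimum search:

```python
from collections import defaultdict

def core_numbers(edges):
    """Non-incremental k-core decomposition by repeated peeling: remove a
    minimum-degree vertex at each step; the running maximum of the removal
    degree is that vertex's core number."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {v: len(ns) for v, ns in adj.items()}
    core = {}
    k = 0
    while deg:
        v = min(deg, key=deg.get)   # peel a minimum-degree vertex
        k = max(k, deg[v])          # core numbers never decrease in peel order
        core[v] = k
        for w in adj[v]:
            if w in deg:
                deg[w] -= 1
        del deg[v]
    return core

# Triangle {0,1,2} plus a pendant vertex 3: the triangle forms a 2-core.
core = core_numbers([(0, 1), (1, 2), (0, 2), (2, 3)])
```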

  9. Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas; Jiang, Wei

    2008-03-31

    This document provides the algorithms for CHP system performance monitoring and commissioning verification (CxV). It starts by presenting system-level and component-level performance metrics, followed by descriptions of algorithms for performance monitoring and commissioning verification using the metrics presented earlier. Verification of commissioning is accomplished essentially by comparing actual measured performance to performance benchmarks provided by the system integrator and/or component manufacturers. The results of these comparisons are then automatically interpreted to provide conclusions regarding whether the CHP system and its components have been properly commissioned, and where problems are found, guidance is provided for corrections. A discussion of uncertainty handling is then provided, followed by a description of how simulation models can be used to generate data for testing the algorithms. A model is described for simulating a CHP system consisting of a micro-turbine, an exhaust-gas heat recovery unit that produces hot water, an absorption chiller, and a cooling tower. The process for using this model to generate data for testing the algorithms for a selected set of faults is described. The next section applies the algorithms to CHP laboratory and field data to illustrate their use. The report concludes with a discussion of the need for laboratory testing of the algorithms on physical CHP systems and identification of the recommended next steps.
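    The benchmark-comparison step can be illustrated with a toy check: each measured metric is compared against its benchmark, and a shortfall beyond a tolerance is flagged for follow-up. The metric names, values, and the 10% threshold below are placeholders, not values from the report:

```python
def verify_commissioning(measured, benchmark, rel_tol=0.10):
    """Flag any CHP performance metric whose measured value falls short of its
    benchmark by more than rel_tol (a placeholder threshold).
    Returns {metric: (passed, relative_shortfall)}."""
    results = {}
    for name, bench in benchmark.items():
        shortfall = (bench - measured[name]) / bench
        results[name] = (shortfall <= rel_tol, round(shortfall, 3))
    return results

# Hypothetical metrics for a micro-turbine CHP system.
benchmark = {"electrical_efficiency": 0.28, "heat_recovery_effectiveness": 0.80}
measured = {"electrical_efficiency": 0.26, "heat_recovery_effectiveness": 0.55}
report = verify_commissioning(measured, benchmark)
```

    A real CxV implementation would also propagate measurement uncertainty into the comparison, as the report's uncertainty-handling discussion suggests.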

  10. Visual Empirical Region of Influence (VERI) Pattern Recognition Algorithms

    Energy Science and Technology Software Center (OSTI)

    2002-05-01

    We developed new pattern recognition (PR) algorithms based on a human visual perception model. We named these algorithms Visual Empirical Region of Influence (VERI) algorithms. To compare the new algorithms' effectiveness against other PR algorithms, we benchmarked their clustering capabilities with a standard set of two-dimensional data that is well known in the PR community. The VERI algorithm succeeded in clustering all the data correctly. No existing algorithm had previously clustered all the patterns in the data set successfully. The commands to execute VERI algorithms are quite difficult to master when executed from a DOS command line, and the algorithm requires several parameters to operate correctly. From our own experience, we realized that if we wanted to provide a new data analysis tool to the PR community, we would have to make the tool powerful, yet easy and intuitive to use. That was our motivation for developing graphical user interfaces (GUIs) for the VERI algorithms. We developed GUIs to control the VERI algorithm in a single-pass mode and in an optimization mode. We also developed a visualization technique that allows users to graphically animate and visually inspect multi-dimensional data after it has been classified by the VERI algorithms. The visualization package is integrated into the single-pass interface. Both the single-pass interface and the optimization interface are part of the PR software package we have developed and make available to other users. The single-pass mode only finds PR results for the sets of features in the data set that are manually requested by the user. The optimization mode uses a brute-force method of searching through the combinations of features in a data set for the features that produce the best pattern recognition results. With a small number of features in a data set, an exact solution can be determined. However, the number of possible combinations increases exponentially with the number of features, so an alternate means of finding a solution must be used. We developed and implemented a technique for finding solutions in data sets with both small and large numbers of features. The VERI interface tools were written using the Tcl/Tk GUI programming language, version 8.1. Although Tcl/Tk packages are designed to run on multiple computer platforms, we have concentrated our efforts on a user interface for the ubiquitous DOS environment. The VERI algorithms are compiled, executable programs. The interfaces run the VERI algorithms in Leave-One-Out mode using the Euclidean metric.

  11. Visual Empirical Region of Influence (VERI) Pattern Recognition Algorithms

    SciTech Connect (OSTI)

    2002-05-01

    We developed new pattern recognition (PR) algorithms based on a human visual perception model. We named these algorithms Visual Empirical Region of Influence (VERI) algorithms. To compare the new algorithms' effectiveness against other PR algorithms, we benchmarked their clustering capabilities with a standard set of two-dimensional data that is well known in the PR community. The VERI algorithm succeeded in clustering all the data correctly. No existing algorithm had previously clustered all the patterns in the data set successfully. The commands to execute VERI algorithms are quite difficult to master when executed from a DOS command line, and the algorithm requires several parameters to operate correctly. From our own experience, we realized that if we wanted to provide a new data analysis tool to the PR community, we would have to make the tool powerful, yet easy and intuitive to use. That was our motivation for developing graphical user interfaces (GUIs) for the VERI algorithms. We developed GUIs to control the VERI algorithm in a single-pass mode and in an optimization mode. We also developed a visualization technique that allows users to graphically animate and visually inspect multi-dimensional data after it has been classified by the VERI algorithms. The visualization package is integrated into the single-pass interface. Both the single-pass interface and the optimization interface are part of the PR software package we have developed and make available to other users. The single-pass mode only finds PR results for the sets of features in the data set that are manually requested by the user. The optimization mode uses a brute-force method of searching through the combinations of features in a data set for the features that produce the best pattern recognition results. With a small number of features in a data set, an exact solution can be determined. However, the number of possible combinations increases exponentially with the number of features, so an alternate means of finding a solution must be used. We developed and implemented a technique for finding solutions in data sets with both small and large numbers of features. The VERI interface tools were written using the Tcl/Tk GUI programming language, version 8.1. Although Tcl/Tk packages are designed to run on multiple computer platforms, we have concentrated our efforts on a user interface for the ubiquitous DOS environment. The VERI algorithms are compiled, executable programs. The interfaces run the VERI algorithms in Leave-One-Out mode using the Euclidean metric.

  12. Theoretical and experimental research on multi-beam klystron

    SciTech Connect (OSTI)

    Ding Yaogen; Peng Jun; Zhu Yunshu; Shi Shaoming [Institute of Electronics, Chinese Academy of Sciences, Beijing 100080 (China)

    1999-05-07

    Theoretical and experimental research work on the multi-beam klystron (MBK) conducted at the Institute of Electronics, Chinese Academy of Sciences (IECAS) is described in this paper. Research progress on the interaction between multiple electron beams and the microwave electric field, multi-beam cavities, a filter-loaded double-gap cavity broadband output circuit, multi-beam electron guns, and periodic reversal focusing systems is presented. Performance and measurement results for five types of MBK are also given. The key technical problems for present MBKs are discussed.

  13. Algorithms and tools for high-throughput geometry-based analysis...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Algorithms and tools for high-throughput geometry-based analysis of crystalline porous materials Citation Details In-Document Search Title: Algorithms and tools ...

  14. The theoretical study of passive and active optical devices via planewave based transfer (scattering) matrix method and other approaches

    SciTech Connect (OSTI)

    Zhuo, Ye

    2011-05-15

    In this thesis, we theoretically study electromagnetic wave propagation in several passive and active optical components and devices, including 2-D photonic crystals, straight and curved waveguides, and organic light-emitting diodes (OLEDs). Several optical designs are also presented, such as organic photovoltaic (OPV) cells and solar concentrators. The first part of the thesis focuses on theoretical investigation. First, the plane-wave-based transfer (scattering) matrix method (TMM) is briefly described, with a short review of photonic crystals and other numerical methods to study them (Chapters 1 and 2). Next, the TMM itself is investigated in detail and further developed to deal with more complex optical systems. In Chapter 3, the TMM is extended to curvilinear coordinates to study curved nanoribbon waveguides. The problem of a curved structure is transformed into an equivalent one of a straight structure with spatially dependent tensors of dielectric constant and magnetic permeability. In Chapter 4, a new set of localized basis orbitals is introduced to locally represent the electromagnetic field in photonic crystals as an alternative to the plane-wave basis. The second part of the thesis focuses on the design of optical devices. First, two examples of TMM applications are given: the design of metal grating structures as replacements for ITO to enhance optical absorption in OPV cells (Chapter 6), and the design of the same structure to enhance light extraction from OLEDs (Chapter 7). Next, two design examples by the ray-tracing method are given: applying a microlens array to enhance light extraction from OLEDs (Chapter 5), and an all-angle, wide-wavelength design of a solar concentrator (Chapter 8). In summary, this dissertation has extended the TMM, making it capable of treating complex optical systems. Several optical designs by the TMM and the ray-tracing method are also given as a complement to this work.

  15. Theoretical Minimum Energies to Produce Steel for Selected Conditions

    SciTech Connect (OSTI)

    Fruehan, R.J.; Fortini, O.; Paxton, H.W.; Brindle, R.

    2000-05-01

    The energy used to produce liquid steel in today's integrated and electric arc furnace (EAF) facilities is significantly higher than the theoretical minimum energy requirements. This study presents the absolute minimum energy required to produce steel from ore and from mixtures of scrap and scrap alternatives. Additional cases, in which the assumptions are changed to more closely approximate actual operating conditions, are also analyzed. The results, summarized in Table E-1, should give insight into the theoretical and practical potentials for reducing steelmaking energy requirements. The energy values have also been converted to carbon dioxide (CO{sub 2}) emissions in order to indicate the potential for reduction in emissions of this greenhouse gas (Table E-2). The study showed that increasing scrap melting has the largest impact on energy consumption. However, scrap should be viewed as having "invested" energy, since at one time it was produced by reducing ore. Increasing scrap melting in the BOF may or may not decrease energy if the "invested" energy in scrap is considered.

  16. The magnetic flywheel flow meter: Theoretical and experimental contributions

    SciTech Connect (OSTI)

    Buchenau, D., E-mail: d.buchenau@hzdr.de; Galindo, V.; Eckert, S. [Helmholtz-Zentrum Dresden-Rossendorf, Institute of Fluid Dynamics, Bautzner Landstraße 400, 01328 Dresden (Germany)

    2014-06-02

    The development of contactless flow meters is an important issue for monitoring and controlling processes in various application fields, such as metallurgy, liquid metal casting, and cooling systems for nuclear reactors and transmutation machines. In his book The Theory of Electromagnetic Flow Measurement (Cambridge University Press, 1962), Shercliff described a simple and robust device for contactless measurement of liquid metal flow rates, known as the magnetic flywheel. The sensor consists of several permanent magnets attached to a rotatable soft iron plate. This arrangement is placed close to the liquid metal flow to be measured, so that the field of the permanent magnets penetrates into the fluid volume. The flywheel is accelerated by the Lorentz force arising from the interaction between the magnetic field and the moving liquid. The steady rotation rate of the flywheel can be taken as a measure of the mean flow rate inside the fluid channel. The present paper provides a detailed theoretical description of the sensor in order to give better insight into the functional principle of the magnetic flywheel. Theoretical predictions are confirmed by corresponding laboratory experiments. For that purpose, a laboratory model of such a flow meter was built and tested on a GaInSn loop under various test conditions.

  17. An Experimental and Theoretical High Energy Physics Program

    SciTech Connect (OSTI)

    Shipsey, Ian

    2012-07-31

    The Purdue High Energy Physics Group conducts research in experimental and theoretical elementary particle physics and experimental high energy astrophysics. Our goals, which we share with high energy physics colleagues around the world, are to understand at the most fundamental level the nature of matter, energy, space and time, and to explain the birth, evolution and fate of the Universe. The experiments in which we are currently involved are: CDF, CLEO-c, CMS, LSST, and VERITAS. We have been instrumental in establishing two major in-house facilities: the Purdue Particle Physics Microstructure Detector Facility (P3MD) in 1995 and the CMS Tier-2 center in 2005. The research efforts of the theory group span phenomenological and theoretical aspects of the Standard Model as well as many of its possible extensions. Recent work includes phenomenological consequences of supersymmetric models, string theory and applications of gauge/gravity duality, the cosmological implications of massive gravitons, and the physics of extra dimensions.

  18. Criteria Document for B-plant Surveillance and Maintenance Phase Safety Basis Document

    SciTech Connect (OSTI)

    SCHWEHR, B.A.

    1999-08-31

    This document is required by the Project Hanford Managing Contractor (PHMC) procedure HNF-PRO-705, Safety Basis Planning, Documentation, Review, and Approval. It specifies the criteria that shall be met in the B Plant surveillance and maintenance (S&M) phase safety basis in order to obtain DOE-RL approval. This criteria document (CD) covers the S&M phase safety basis for the deactivated Waste Fractionization Facility (B Plant) at the Hanford Site in Washington State, and describes: the document type and format that will be used for the S&M phase safety basis, the requirements documents that will be invoked for the document development, the deactivated condition of the B Plant facility, and the scope of issues to be addressed in the S&M phase safety basis document.

  19. 2D/3D registration algorithm for lung brachytherapy

    SciTech Connect (OSTI)

    Zvonarev, P. S.; Farrell, T. J.; Hunter, R.; Wierzbicki, M.; Hayward, J. E.; Sur, R. K.

    2013-02-15

    Purpose: A 2D/3D registration algorithm is proposed for registering orthogonal x-ray images with a diagnostic CT volume for high dose rate (HDR) lung brachytherapy. Methods: The algorithm utilizes a rigid registration model based on a pixel/voxel intensity matching approach. To achieve accurate registration, a robust similarity measure combining normalized mutual information, image gradient, and intensity difference was developed. The algorithm was validated using simple body and anthropomorphic phantoms. Transfer catheters were placed inside the phantoms to simulate the unique image features observed during treatment. The algorithm's sensitivity to various degrees of initial misregistration and to the presence of foreign objects, such as ECG leads, was evaluated. Results: The mean registration error was 2.2 and 1.9 mm for the simple body and anthropomorphic phantoms, respectively. The error was comparable to the interoperator catheter digitization error of 1.6 mm. Preliminary analysis of data acquired from four patients indicated a mean registration error of 4.2 mm. Conclusions: Results obtained using the proposed algorithm are clinically acceptable, especially considering the complications normally encountered when imaging during lung HDR brachytherapy.
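    The robust similarity measure above combines three terms. As an illustration of one of them, here is a minimal sketch of normalized mutual information (NMI) between two images; the function name, binning, and normalization are our own illustrative choices, not the authors' implementation:

    ```python
    import numpy as np

    def normalized_mutual_information(a, b, bins=32):
        """NMI(A, B) = (H(A) + H(B)) / H(A, B).
        Roughly 1.0 for independent images, rising to 2.0 for identical ones."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()          # joint intensity distribution
        px = pxy.sum(axis=1)               # marginal of image A
        py = pxy.sum(axis=0)               # marginal of image B

        def entropy(p):
            p = p[p > 0]                   # 0 * log 0 := 0
            return -np.sum(p * np.log(p))

        return (entropy(px) + entropy(py)) / entropy(pxy)
    ```

    A registration loop would maximize this quantity (together with the gradient and intensity-difference terms) over the rigid transform parameters.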

  20. Mesh Algorithms for PDE with Sieve I: Mesh Distribution

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Knepley, Matthew G.; Karpeev, Dmitry A.

    2009-01-01

    We have developed a new programming framework, called Sieve, to support parallel numerical partial differential equation (PDE) algorithms operating over distributed meshes. We have also developed a reference implementation of Sieve in C++ as a library of generic algorithms operating on distributed containers conforming to the Sieve interface. Sieve makes instances of the incidence relation, or arrows, the conceptual first-class objects represented in the containers. Further, generic algorithms acting on this arrow container are systematically used to provide natural geometric operations on the topology and also, through duality, on the data. Finally, coverings and duality are used to encode not only individual meshes, but all types of hierarchies underlying PDE data structures, including multigrid and mesh partitions. In order to demonstrate the usefulness of the framework, we show how the mesh partition data can be represented and manipulated using the same fundamental mechanisms used to represent meshes. We present the complete description of an algorithm to encode a mesh partition and then distribute a mesh, which is independent of the mesh dimension, element shape, or embedding. Moreover, data associated with the mesh can be similarly distributed with exactly the same algorithm. The use of a high level of abstraction within the Sieve leads to several benefits in terms of code reuse, simplicity, and extensibility. We discuss these benefits and compare our approach to other existing mesh libraries.
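    To make the arrow-centric view concrete, here is a minimal, hypothetical sketch of a Sieve-like incidence relation in Python; the class and method names are ours for illustration, not the actual C++ Sieve API:

    ```python
    from collections import defaultdict

    class MiniSieve:
        """Toy incidence relation: an arrow (p, q) records that point p is in the
        cone of q, i.e. p lies on the boundary of q (a vertex of an edge, an edge
        of a cell, ...). Meshes, and also mesh partitions, are just sets of arrows."""

        def __init__(self):
            self._cone = defaultdict(set)     # q -> {p : arrow p -> q}
            self._support = defaultdict(set)  # p -> {q : arrow p -> q}

        def add_arrow(self, p, q):
            self._cone[q].add(p)
            self._support[p].add(q)

        def cone(self, q):
            return set(self._cone[q])

        def closure(self, q):
            """Transitive closure of the cone: q plus every point on its boundary."""
            seen, stack = {q}, [q]
            while stack:
                for p in self._cone[stack.pop()]:
                    if p not in seen:
                        seen.add(p)
                        stack.append(p)
            return seen
    ```

    A triangle cell covered by three edges, each covered by two vertices, then has all seven points in its closure; the same cone/closure operations, applied to a "partition point" covering a set of cells, express a mesh partition.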

  1. Theoretical analysis of sound transmission loss through graphene sheets

    SciTech Connect (OSTI)

    Natsuki, Toshiaki; Ni, Qing-Qing

    2014-11-17

    We examine the potential of using graphene sheets (GSs) as sound-insulating materials for nano-devices because of their small size and superior electronic and mechanical properties. In this study, a theoretical analysis is proposed to predict the sound transmission loss through multi-layered GSs, which are formed by stacking individual sheets bound together by van der Waals (vdW) forces between layers. The results show that resonant frequencies of the sound transmission loss occur in the multi-layered GSs and that their values are very high. Based on the present analytical solution, we predict the acoustic insulation properties for various numbers of layers under both a normally incident wave and an acoustic field from a randomly incident source. The scheme could be useful in vibration-absorption applications of nano-scale devices and materials.
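    For orientation, the classical single-panel mass law gives the baseline transmission loss of a thin sheet of areal mass density $m$ at angular frequency $\omega$ and incidence angle $\theta$; this is standard acoustics, not the paper's multi-layer vdW model:

    ```latex
    TL(\omega, \theta) = 10 \log_{10}\!\left[\, 1 + \left( \frac{\omega\, m \cos\theta}{2 \rho_0 c_0} \right)^{2} \right]
    ```

    with $\rho_0$ and $c_0$ the density and sound speed of air. Because a graphene sheet's areal mass density is extremely small, appreciable mass-law loss appears only at very high frequencies, which is consistent with the high resonant frequencies the authors report for the coupled layers.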

  2. Structural and Electronic Properties of Isolated Nanodiamonds: A Theoretical Perspective

    SciTech Connect (OSTI)

    Raty, J; Galli, G

    2004-09-09

    Nanometer sized diamond has been found in meteorites, proto-planetary nebulae and interstellar dusts, as well as in residues of detonation and in diamond films. Remarkably, the size distribution of diamond nanoparticles appears to be peaked around 2-5 nm, and to be largely independent of preparation conditions. Using ab-initio calculations, we have shown that in this size range nanodiamond has a fullerene-like surface and, unlike silicon and germanium, exhibits very weak quantum confinement effects. We called these carbon nanoparticles bucky-diamonds: their atomic structure, predicted by simulations, is consistent with many experimental findings. In addition, we carried out calculations of the stability of nanodiamond which provided a unifying explanation of its size distribution in extra-terrestrial samples and in ultrananocrystalline diamond films. Here we present a summary of our theoretical results and briefly outline work in progress on doping of nanodiamond with nitrogen.

  3. Experimental And Theoretical High Energy Physics Research At UCLA

    SciTech Connect (OSTI)

    Cousins, Robert D.

    2013-07-22

    This is the final report of the UCLA High Energy Physics DOE Grant No. DE-FG02-91ER40662. This report covers the last grant project period, namely the three years beginning January 15, 2010, plus extensions through April 30, 2013. The report describes the broad range of our experimental research spanning direct dark matter detection searches using both liquid xenon (XENON) and liquid argon (DARKSIDE); present (ICARUS) and R&D for future (LBNE) neutrino physics; ultra-high-energy neutrino and cosmic ray detection (ANITA); and the highest-energy accelerator-based physics with the CMS experiment and CERN's Large Hadron Collider. For our theory group, the report describes frontier activities including particle astrophysics and cosmology; neutrino physics; LHC interaction cross section calculations now feasible due to breakthroughs in theoretical techniques; and advances in the formal theory of supergravity.

  4. A theoretical analysis of rotating cavitation in inducers

    SciTech Connect (OSTI)

    Tsujimoto, Y.; Kamijo, K. (National Aerospace Lab., Miyagi, (Japan)); Yoshida, Y. (Osaka Univ., Toyonaka, (Japan). Engineering Science)

    1993-03-01

    Rotating cavitation was analyzed using an actuator disk method. Quasi-steady pressure performance of the impeller, mass flow gain factor, and cavitation compliance of the cavity were taken into account. Three types of destabilizing modes were predicted: rotating cavitation propagating faster than the rotational speed of the impeller, rotating cavitation propagating in the direction opposite that of the impeller, and rotating stall propagating slower than the rotational speed of the impeller. It was shown that both types of rotating cavitation were caused by the positive mass flow gain factor, while the rotating stall was caused by the positive slope of the pressure performance. Stability and propagation velocity maps are presented for the two types of rotating cavitation in the mass flow gain factor-cavitation compliance plane. The correlation between theoretical results and experimental observations is discussed.

  5. Modeling an Application's Theoretical Minimum and Average Transactional Response Times

    SciTech Connect (OSTI)

    Paiz, Mary Rose

    2015-04-01

    The theoretical minimum transactional response time of an application serves as a basis for the expected response time. The lower threshold for the minimum response time represents the minimum amount of time that the application should take to complete a transaction. Knowing the lower threshold is beneficial in detecting anomalies that result from unsuccessful transactions. Conversely, when an application's response time rises above an upper threshold, there is likely an anomaly in the application that is causing unusual performance issues in the transaction. This report explains how the non-stationary Generalized Extreme Value distribution is used to estimate the lower threshold of an application's daily minimum transactional response time. It also explains how the seasonal Autoregressive Integrated Moving Average time series model is used to estimate the upper threshold for an application's average transactional response time.
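    The report's GEV fit is non-stationary; as a stationary simplification, a lower threshold for daily minimum response times can be sketched by fitting a GEV to the negated minima (block minima of X are block maxima of -X). The function name, the negation trick, and the 1% tail level are our choices, not the report's procedure:

    ```python
    import numpy as np
    from scipy.stats import genextreme

    def lower_threshold(daily_minima, tail_prob=0.01):
        """Fit a GEV to the negated daily minima and return the level below which
        a day's minimum response time should fall with probability ~tail_prob."""
        neg = -np.asarray(daily_minima)
        shape, loc, scale = genextreme.fit(neg)
        # High quantile of the negated minima = low quantile of the minima.
        return -genextreme.ppf(1.0 - tail_prob, shape, loc=loc, scale=scale)
    ```

    Observations below this threshold would be flagged as candidate anomalies, e.g. transactions that returned without doing their work.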

  6. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    SciTech Connect (OSTI)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct "graph mining" on RDF standard data sets. We address that need through implementations of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and we call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like 'MATLAB' for 'Linked Data.'

  7. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    Energy Science and Technology Software Center (OSTI)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct "graph mining" on RDF standard data sets. We address that need through implementations of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and we call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like 'MATLAB' for 'Linked Data.'
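    As a sketch of what one such iterative graph-mining primitive computes, here is triangle counting over a plain undirected edge list; EAGLE itself expresses this as a SPARQL query over RDF triples, so this standalone Python version is ours, not the tool's code:

    ```python
    def triangle_count(edges):
        """Count triangles in an undirected simple graph given as (u, v) pairs,
        with each edge listed once and no self-loops."""
        adj = {}
        for u, v in edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        total = 0
        for u, v in edges:
            total += len(adj[u] & adj[v])  # each common neighbour closes a triangle
        return total // 3                  # every triangle is found once per edge
    ```

    The SPARQL formulation would instead match the triple pattern of three mutually connected nodes and divide the match count by the symmetries.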

  8. Nonlinear Global Optimization Using Curdling Algorithm in Mathematica Environment

    Energy Science and Technology Software Center (OSTI)

    1997-08-05

    An algorithm for performing optimization, a derivative-free, grid-refinement approach to nonlinear optimization, was developed and implemented in software as OPTIMIZE. This approach overcomes a number of deficiencies in existing approaches; most notably, it finds extremal regions rather than only single extremal points. The program is interactive and collects information on control parameters and constraints using menus. For up to two (and potentially three) dimensions, function convergence is displayed graphically. Because the algorithm does not compute derivatives, gradients, or vectors, it is numerically stable. It can find all the roots of a polynomial in one pass, and it is an inherently parallel algorithm. OPTIMIZE-M is a modification of OPTIMIZE designed for use within the Mathematica environment created by Wolfram Research.
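    OPTIMIZE's curdling approach is multi-dimensional and tracks extremal regions; the core grid-refinement idea can be sketched in one dimension as follows (the function name, grid size, and refinement rule here are illustrative, not the actual program):

    ```python
    def grid_refine_minimize(f, lo, hi, levels=8, pts=11):
        """Derivative-free grid refinement: evaluate f on a coarse grid, then
        repeatedly zoom the grid into the neighbourhood of the best point."""
        for _ in range(levels):
            step = (hi - lo) / (pts - 1)
            xs = [lo + step * i for i in range(pts)]
            best = min(xs, key=f)          # no derivatives or gradients needed
            lo, hi = best - step, best + step
        return best
    ```

    Tracking all grid cells whose values fall below a threshold, instead of only the single best cell, is what lets the full algorithm return extremal regions and multiple optima in one pass.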

  9. Analysis of the Multi-Phase Copying Garbage Collection Algorithm

    SciTech Connect (OSTI)

    Podhorszki, Norbert

    2009-01-01

    The multi-phase copying garbage collection algorithm was designed to avoid the large amount of reserved memory usually required by copying garbage collection algorithms. The collection is performed in multiple phases using the available free memory. This paper proves that the number of phases depends on the size of the reserved memory and on the ratio of garbage to accessible objects. The performance of the implemented algorithm is tested in a fine-grained parallel Prolog system. We find that reserving only 10% of memory for garbage collection is sufficient for good performance in practice. Additionally, an improvement of the generic algorithm specific to the tested parallel Prolog system is described.
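    The dependence described above can be illustrated with a simple counting model (ours, not the paper's formal analysis): each phase can evacuate at most as much live data as there is free reserve, so the phase count is roughly the live volume divided by the reserve.

    ```python
    import math

    def copying_phases(heap_size, reserve_fraction, live_fraction):
        """Phases needed when only `reserve_fraction` of the heap is kept free
        and `live_fraction` of it holds accessible (non-garbage) objects."""
        live = heap_size * live_fraction
        reserve = heap_size * reserve_fraction
        return math.ceil(live / reserve)
    ```

    Under this model a classic semi-space collector (50% reserve) always finishes in one phase, while a 10% reserve on a half-live heap needs five phases, trading collection passes for a much smaller memory overhead.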

  10. A general higher-order remap algorithm for ALE calculations ...

    Office of Scientific and Technical Information (OSTI)

    The three phases of the Arbitrary Lagrangian Eulerian (ALE) methodology are outlined: the ... A donor cell method from the SALE code forms the basis of the remap step, but unlike SALE ...

  11. Theoretical Research in Cosmology, High-Energy Physics and String Theory

    SciTech Connect (OSTI)

    Ng, Y Jack; Dolan, Louise; Mersini-Houghton, Laura; Frampton, Paul

    2013-07-29

    The research was in the area of Theoretical Physics: Cosmology, High-Energy Physics and String Theory

  12. Safety basis academy summary of project implementation from 2007-2009

    SciTech Connect (OSTI)

    Johnston, Julie A

    2009-01-01

    During fiscal years 2007 through 2009, in accordance with Performance Based Incentives with DOE/NNSA Los Alamos Site Office, Los Alamos National Security (LANS) implemented and operated a Safety Basis Academy (SBA) to facilitate uniformity in technical qualifications of safety basis professionals across the nuclear weapons complex. The implementation phase of the Safety Basis Academy required developing, delivering, and finalizing a set of 23 courses. The courses developed are capable of supporting qualification efforts for both federal and contractor personnel throughout the DOE/NNSA Complex. The LANS Associate Director for Nuclear and High Hazard Operations (AD-NHHO) delegated project responsibility to the Safety Basis Division. The project was assigned to the Safety Basis Technical Services (SB-TS) Group at Los Alamos National Laboratory (LANL). The main tasks were project needs analysis, design, development, implementation of instructional delivery, and evaluation of SBA courses. DOE/NNSA responsibility for oversight of the SBA project was assigned to the Chief of Defense for Nuclear Safety, and delegated to the Authorization Basis Senior Advisor, Continuous Learning Chair (CDNS-ABSA/CLC). Through a memorandum of agreement initiated by NNSA with LANS AD-NHHO, the DOE National Training Center (NTC) will maintain the set of Safety Basis Academy courses and is able to facilitate course delivery throughout the DOE Complex.

  13. ARM: 10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  14. ARM: 2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  15. ARM: 10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1998-03-01

    10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  16. ARM: 1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  17. ARM: 1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  18. ARM: 10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Newsom, Rob; Goldsmith, John

    1998-03-01

    10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  19. ARM: SIRS: derived, correction of downwelling shortwave diffuse hemispheric measurements using Dutton and full algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Laura Riihimaki

    1997-03-21

    SIRS: derived, correction of downwelling shortwave diffuse hemispheric measurements using Dutton and full algorithm

  20. ARM: 10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  1. ARM: 10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  2. ARM: 10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  3. ARM: 2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  4. ARM: 1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  5. ARM: 1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  6. COLLOQUIUM: Introduction to Quantum Algorithms | Princeton Plasma Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Lab December 9, 2015, 4:15pm to 5:30pm MBG AUDITORIUM COLLOQUIUM: Introduction to Quantum Algorithms Dr. Nadya Shirokova University of Santa Clara Quantum computers are not an abstraction anymore - Google, NASA and USRA recently announced formation of the Quantum Artificial Intelligence Lab equipped with 1,000-qubit quantum computer. In this talk we will focus on quantum algorithms such as Deutsch's, Shor's, and Grover's, and will discuss why they are faster than the classical ones. We will also

  7. Microsoft Word - CR-091 Primary Basis of Cost Savings and Cost Savings Amount Custom Fields

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    CR-091 Primary Basis of Cost Savings and Cost Savings Amount Custom Fields. Background: On August 29, 2013, the PSWG approved Change Request 091, which adds two new custom fields to STRIPES. The names of the fields are Primary Basis of Cost Savings and Cost Savings Amount. The purpose of these fields is to allow DOE to capture Cost Savings documented at time of award and the subsequent reporting capability of this data via IDW.

  8. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect (OSTI)

    Brouns, Thomas M.; Rohay, Alan C.; Reidel, Steve; Gardner, Martin G.

    2007-02-27

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis.

  9. Non-homogeneous solutions of a Coulomb Schrödinger equation as basis set for scattering problems

    SciTech Connect (OSTI)

    Del Punta, J. A.; Ambrosio, M. J.; Gasaneo, G.; Zaytsev, S. A.; Ancarani, L. U.

    2014-05-15

    We introduce and study two-body Quasi Sturmian functions which are proposed as basis functions for applications in three-body scattering problems. They are solutions of a two-body non-homogeneous Schrödinger equation. We present different analytic expressions, including asymptotic behaviors, for the pure Coulomb potential with a driven term involving either Slater-type or Laguerre-type orbitals. The efficiency of Quasi Sturmian functions as a basis set is numerically illustrated through a two-body scattering problem.

  10. Los Alamos National Laboratory fission basis (Conference) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    National Laboratory fission basis Citation Details In-Document Search Title: Los Alamos National Laboratory fission basis Authors: Keksis, August L [1] ; Chadwick, Mark B [1] + Show Author Affiliations Los Alamos National Laboratory Publication Date: 2011-05-06 OSTI Identifier: 1063939 Report Number(s): LA-UR-11-02744; LA-UR-11-2744 DOE Contract Number: AC52-06NA25396 Resource Type: Conference Resource Relation: Conference: 14th International Symposium on Reactor Dosimetry ; May 22, 2011 ;

  11. Crystal structure of a BRAF kinase domain monomer explains basis for

    Office of Scientific and Technical Information (OSTI)

    allosteric regulation (Journal Article) | SciTech Connect Crystal structure of a BRAF kinase domain monomer explains basis for allosteric regulation Citation Details In-Document Search Title: Crystal structure of a BRAF kinase domain monomer explains basis for allosteric regulation Authors: Thevakumaran, Neroshan ; Lavoie, Hugo ; Critton, David A. ; Tebben, Andrew ; Marinier, Anne ; Sicheri, Frank ; Therrien, Marc [1] ; Montreal) [2] ; BMS) [2] + Show Author Affiliations

  12. Structural Basis for Specificity and Flexibility in a Plant 4-Coumarate:CoA

    Office of Scientific and Technical Information (OSTI)

    Ligase (Journal Article) | SciTech Connect Structural Basis for Specificity and Flexibility in a Plant 4-Coumarate:CoA Ligase Citation Details In-Document Search Title: Structural Basis for Specificity and Flexibility in a Plant 4-Coumarate:CoA Ligase Authors: Li, Zhi ; Nair, Satish K. [1] + Show Author Affiliations UIUC Publication Date: 2015-12-04 OSTI Identifier: 1227510 Resource Type: Journal Article Resource Relation: Journal Name: Structure; Journal Volume: 23; Journal Issue: (11) ;

  13. Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 (Journal

    Office of Scientific and Technical Information (OSTI)

    Article) | SciTech Connect Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 Citation Details In-Document Search Title: Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 Authors: Zeng, Zhixiong ; Wang, Wei ; Yang, Yuting ; Chen, Yong ; Yang, Xiaomei ; Diehl, J. Alan ; Liu, Xuedong ; Lei, Ming Publication Date: 2010-02-01 OSTI Identifier: 1198117 Grant/Contract Number: AC02-06CH11357 Type: Published Article Journal Name: Developmental Cell Additional Journal

  14. Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex

    Office of Scientific and Technical Information (OSTI)

    (Journal Article) | SciTech Connect Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Citation Details In-Document Search Title: Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Ultraviolet (UV) light-induced pyrimidine photodimers are repaired by the nucleotide excision repair pathway. Photolesions have biophysical parameters closely resembling undamaged DNA, impeding discovery through damage surveillance proteins. The DDB1-DDB2 complex serves in

  15. Structural basis for the prion-like MAVS filaments in antiviral innate

    Office of Scientific and Technical Information (OSTI)

    immunity (Journal Article) | SciTech Connect Structural basis for the prion-like MAVS filaments in antiviral innate immunity Citation Details In-Document Search Title: Structural basis for the prion-like MAVS filaments in antiviral innate immunity Authors: Xu, Hui ; He, Xiaojing ; Zheng, Hui ; Huang, Lily J ; Hou, Fajian ; Yu, Zhiheng ; de la Cruz, Michael Jason ; Borkowski, Brian ; Zhang, Xuewu ; Chen, Zhijian J ; Jiang, Qiu-Xing [1] ; HHMI) [2] + Show Author Affiliations (UTSMC) (

  16. The Three-Dimensional Structural Basis of Type II Hyperprolinemia (Journal

    Office of Scientific and Technical Information (OSTI)

    Article) | SciTech Connect The Three-Dimensional Structural Basis of Type II Hyperprolinemia Citation Details In-Document Search Title: The Three-Dimensional Structural Basis of Type II Hyperprolinemia Type II hyperprolinemia is an autosomal recessive disorder caused by a deficiency in {Delta}{sup 1}-pyrroline-5-carboxylate dehydrogenase (P5CDH; also known as ALDH4A1), the aldehyde dehydrogenase that catalyzes the oxidation of glutamate semialdehyde to glutamate. Here, we report the first

  17. Basis for Section 3116 Determination for Salt Waste Disposal at the

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Savannah River Site | Department of Energy Basis for Section 3116 Determination for Salt Waste Disposal at the Savannah River Site Basis for Section 3116 Determination for Salt Waste Disposal at the Savannah River Site The Secretary of Energy is making this 3116 Determination pursuant to Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA) [1]. This 3116 Determination concerns the disposal of separated, solidified low-activity radioactive salt

  18. Report to the Secretary of Energy on Beyond Design Basis Event Pilot

    Energy Savers [EERE]

    Evaluations, Results and Recommendations for Improvements to Enhance Nuclear Safety at DOE Nuclear Facilities | Department of Energy. In

  19. DOE's Safety Bulletin No. 2011-01, Events Beyond Design Safety Basis

    Office of Environmental Management (EM)

    Analysis, March 2011 | Department of Energy. PURPOSE: This Safety Alert provides information on a safety concern related to the identification and mitigation of events that may fall outside those analyzed in the documented safety analysis. BACKGROUND: On March 11, 2011, the Fukushima Daiichi nuclear power station in

  20. Numerical Optimization Algorithms and Software for Systems Biology

    SciTech Connect (OSTI)

    Saunders, Michael

    2013-02-02

    The basic aims of this work are: to develop reliable algorithms for solving optimization problems involving large stoichiometric matrices; to investigate cyclic dependency between metabolic and macromolecular biosynthetic networks; and to quantify the significance of thermodynamic constraints on prokaryotic metabolism.

  1. Finding the needle in the haystack: Algorithms for conformational optimization

    SciTech Connect (OSTI)

    Andricioaei, I.; Straub, J.E.

    1996-09-01

    Algorithms are given for conformational optimization of proteins. The protein folding problem is regarded as a problem of global energy minimization. Since proteins have hundreds of atoms, finding the lowest-energy conformation in a many-dimensional configuration space becomes a computationally demanding problem. © American Institute of Physics.

  2. A Parallel Ghosting Algorithm for The Flexible Distributed Mesh Database

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Mubarak, Misbah; Seol, Seegyoung; Lu, Qiukai; Shephard, Mark S.

    2013-01-01

    Critical to the scalability of parallel adaptive simulations are parallel control functions including load balancing, reduced inter-process communication and optimal data decomposition. In distributed meshes, many mesh-based applications frequently access neighborhood information for computational purposes, which must be transmitted efficiently to avoid parallel performance degradation when the neighbors are on different processors. This article presents a parallel algorithm of creating and deleting data copies, referred to as ghost copies, which localize neighborhood data for computation purposes while minimizing inter-process communication. The key characteristics of the algorithm are: (1) It can create ghost copies of any permissible topological order in a 1D, 2D or 3D mesh based on selected adjacencies. (2) It exploits neighborhood communication patterns during the ghost creation process, thus eliminating all-to-all communication. (3) For applications that need neighbors of neighbors, the algorithm can create n ghost layers, up to the point where the whole partitioned mesh is ghosted. Strong and weak scaling results are presented for the IBM BG/P and Cray XE6 architectures up to a core count of 32,768 processors. The algorithm also leads to scalable results when used in a parallel super-convergent patch recovery error estimator, an application that frequently accesses neighborhood data to carry out computation.

  3. Researcher, Los Alamos National Laboratory - Methods and Algorithms Group |

    National Nuclear Security Administration (NNSA)


  4. Genetic algorithms and their use in Geophysical Problems

    SciTech Connect (OSTI)

    Parker, Paul B.

    1999-04-01

    Genetic algorithms (GAs), global optimization methods that mimic Darwinian evolution, are well suited to the nonlinear inverse problems of geophysics. A standard genetic algorithm selects the best or "fittest" models from a "population" and then applies operators such as crossover and mutation in order to combine the most successful characteristics of each model and produce fitter models. More sophisticated operators have been developed, but the standard GA usually provides a robust and efficient search. Although the choice of parameter settings such as crossover and mutation rate may depend largely on the type of problem being solved, numerous results show that certain parameter settings produce optimal performance for a wide range of problems and difficulties. In particular, a low mutation rate (about half of the inverse of the population size) is crucial for optimal results, but the choice of crossover method and rate do not seem to affect performance appreciably. Optimal efficiency is usually achieved with smaller (< 50) populations. Lastly, tournament selection appears to be the best choice of selection methods due to its simplicity and its autoscaling properties. However, if a proportional selection method such as roulette wheel selection is used, fitness scaling is a necessity, and a high scaling factor (> 2.0) should be used for the best performance. Three case studies are presented in which genetic algorithms are used to invert for crustal parameters. The first is an inversion for basement depth at Yucca Mountain using gravity data, the second an inversion for velocity structure in the crust of the South Island of New Zealand using receiver functions derived from teleseismic events, and the third is a similar receiver function inversion for crustal velocities beneath the Mendocino Triple Junction region of Northern California.
The inversions demonstrate that genetic algorithms are effective in solving problems with reasonably large numbers of free parameters and with computationally expensive objective function calculations. More sophisticated techniques are presented for special problems. Niching and island model algorithms are introduced as methods to find multiple, distinct solutions to the nonunique problems that are typically seen in geophysics. Finally, hybrid algorithms are investigated as a way to improve the efficiency of the standard genetic algorithm.
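    The parameter guidance in this abstract (tournament selection, small populations, a mutation rate near half the inverse of the population size) can be sketched as a minimal standard GA. This is an illustrative stand-in, not the cited study's code: the sphere misfit and all numeric settings are hypothetical.

```python
import random

random.seed(42)

def tournament_select(pop, fitness, k=2):
    """Tournament selection: the fitter of k randomly drawn individuals wins."""
    return min(random.sample(pop, k), key=fitness)

def evolve(fitness, n_genes, pop_size=40, generations=200, crossover_rate=0.9):
    # Low mutation rate of about 1/(2 * pop_size), as the abstract recommends.
    mutation_rate = 1.0 / (2 * pop_size)
    pop = [[random.uniform(-5, 5) for _ in range(n_genes)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a = tournament_select(pop, fitness)
            b = tournament_select(pop, fitness)
            if random.random() < crossover_rate:   # one-point crossover
                cut = random.randrange(1, n_genes)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            # Per-gene mutation: small Gaussian perturbation, applied rarely.
            child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate else g
                     for g in child]
            nxt.append(child)
        pop = nxt
        best = min([best] + pop, key=fitness)   # track the best model seen so far
    return best

# Toy "inversion": a 3-parameter sphere function stands in for a geophysical misfit.
misfit = lambda m: sum(x * x for x in m)
best = evolve(misfit, n_genes=3)
```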

  5. An algorithm to estimate the object support in truncated images

    SciTech Connect (OSTI)

    Hsieh, Scott S.; Nett, Brian E.; Cao, Guangzhi; Pelc, Norbert J.

    2014-07-15

    Purpose: Truncation artifacts in CT occur if the object to be imaged extends past the scanner field of view (SFOV). These artifacts impede diagnosis and could possibly introduce errors in dose plans for radiation therapy. Several approaches exist for correcting truncation artifacts, but existing correction algorithms do not accurately recover the skin line (or support) of the patient, which is important in some dose planning methods. The purpose of this paper was to develop an iterative algorithm that recovers the support of the object. Methods: The authors assume that the truncated portion of the image is made up of soft tissue of uniform CT number and attempt to find a shape consistent with the measured data. Each known measurement in the sinogram is interpreted as an estimate of missing mass along a line. An initial estimate of the object support is generated by thresholding a reconstruction made using a previous truncation artifact correction algorithm (e.g., water cylinder extrapolation). This object support is iteratively deformed to reduce the inconsistency with the measured data. The missing data are estimated using this object support to complete the dataset. The method was tested on simulated and experimentally truncated CT data. Results: The proposed algorithm produces a better defined skin line than water cylinder extrapolation. On the experimental data, the RMS error of the skin line is reduced by about 60%. For moderately truncated images, some soft tissue contrast is retained near the SFOV. As the extent of truncation increases, the soft tissue contrast outside the SFOV becomes unusable although the skin line remains clearly defined, and in reformatted images it varies smoothly from slice to slice as expected. Conclusions: The support recovery algorithm provides a more accurate estimate of the patient outline than thresholded, basic water cylinder extrapolation, and may be preferred in some radiation therapy applications.

  6. Windmill wake turbulence decay: a preliminary theoretical model

    SciTech Connect (OSTI)

    Bossanyi, E.A.

    1983-02-01

    The results are given of initial theoretical attempts to predict dynamic wake characteristics, particularly turbulence decay, downstream of wind turbine generators in order to assess the potential for acoustic noise generation in clusters or arrays of turbines. These results must be considered preliminary, because the model described is at least partially based on the assumption of isotropy in the turbine wakes; however, anisotropic conditions may actually exist, particularly in the near-wake regions. The results indicate that some excess spectral energy may still exist: the turbine-generated turbulence from one machine can reach the next machine in the cluster and, depending on the turbulent wavelengths critical for acoustic noise production and perhaps structural excitation, this may be a cause for concern. Such a situation is most likely to occur in the evening or morning, during the transition from the daytime to the nocturnal boundary layer and vice versa, particularly at more elevated sites where the winds tend to increase after dark.

  7. Theoretical and computer models of detonation in solid explosives

    SciTech Connect (OSTI)

    Tarver, C.M.; Urtiew, P.A.

    1997-10-01

    Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Döring (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two- and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.

  8. Theoretical rate coefficients for allyl + HO2 and allyloxy decomposition

    SciTech Connect (OSTI)

    Goldsmith, C. F.; Klippenstein, S. J.; Green, W. H.

    2011-01-01

    The kinetics of the allyl + HO{sub 2} bimolecular reaction, the thermal decomposition of C{sub 3}H{sub 5}OOH, and the unimolecular reactions of C{sub 3}H{sub 5}O are studied theoretically. High-level ab initio calculations of the C{sub 3}H{sub 5}OOH and C{sub 3}H{sub 5}O potential energy surfaces are coupled with RRKM master equation methods to compute the temperature- and pressure-dependence of the rate coefficients. Variable reaction coordinate transition state theory is used to characterize the barrierless transition states for the allyl + HO{sub 2} and C{sub 3}H{sub 5}O + OH reactions. The predicted rate coefficients for allyl + HO{sub 2} → C{sub 3}H{sub 5}OOH → products are in good agreement with experimental values. The calculations for allyl + HO{sub 2} → C{sub 3}H{sub 6} + O{sub 2} underpredict the observed rate. The new rate coefficients suggest that the reaction of allyl + HO{sub 2} will promote chain-branching significantly more than previous models suggest.

  9. Theoretical and Experimental Studies of Elementary Particle Physics

    SciTech Connect (OSTI)

    Evans, Harold G; Kostelecky, V Alan; Musser, James A

    2013-07-29

    The elementary particle physics research program at Indiana University spans a broad range of the most interesting topics in this fundamental field, including important contributions to each of the frontiers identified in the recent report of HEPAP's Particle Physics Project Prioritization Panel: the Energy Frontier, the Intensity Frontier, and the Cosmic Frontier. Experimentally, we contribute to knowledge at the Energy Frontier through our work on the D0 and ATLAS collaborations. We work at the Intensity Frontier on the MINOS and NOvA experiments and participate in R&D for LBNE. We are also very active on the theoretical side of each of these areas with internationally recognized efforts in phenomenology both in and beyond the Standard Model and in lattice QCD. Finally, although not part of this grant, members of the Indiana University particle physics group have strong involvement in several astrophysics projects at the Cosmic Frontier. Our research efforts are divided into three task areas. The Task A group works on D0 and ATLAS; Task B is our theory group; and Task C contains our MINOS, NOvA, and LBNE (LArTPC) research. Each task includes contributions from faculty, senior scientists, postdocs, graduate and undergraduate students, engineers, technicians, and administrative personnel. This work was supported by DOE Grant DE-FG02-91ER40661. In the following, we describe progress made in the research of each task during the final period of the grant, from November 1, 2009 to April 30, 2013.

  10. Protonated Forms of Monoclinic Zirconia: A Theoretical Study

    SciTech Connect (OSTI)

    Mantz, Yves A.; Gemmen, Randall S.

    2010-05-06

    In various materials applications of zirconia, protonated forms of monoclinic zirconia may be formed, motivating their study within the framework of density-functional theory. Using the HCTH/120 exchange-correlation functional, the equations of state of yttria and of the three low-pressure zirconia polymorphs are computed, to verify our approach. Next, the favored charge state of a hydrogen atom in monoclinic zirconia is shown to be positive for all Fermi-level energies in the band gap, by the computation of defect formation energies. This result is consistent with a single previous theoretical prediction at midgap as well as muonium spectroscopy experiments. For the formally positively (+1e) charged system of a proton in monoclinic zirconia (with a homogeneous neutralizing background charge density implicitly included), modeled using up to a 3 x 3 x 3 arrangement of unit cells, different stable and metastable structures are identified. They are similar to those structures previously proposed for the neutral system of hydrogen-doped monoclinic zirconia, at a similar level of theory. As predicted using the HCTH/120 functional, the lowest-energy structure of the proton bonded to one of the two available oxygen atom types, O1, is favored by 0.39 eV compared to that of the proton bonded to O2. The rate of proton transfer between O1 ions is slower than that for hydrogen-doped monoclinic zirconia, whose transition-state structures may be lowered in energy by the extra electron.

  11. Effect of cosolvent on protein stability: A theoretical investigation

    SciTech Connect (OSTI)

    Chalikian, Tigran V.

    2014-12-14

    We developed a statistical thermodynamic algorithm for analyzing solvent-induced folding/unfolding transitions of proteins. The energetics of protein transitions is governed by the interplay between the cavity formation contribution and the term reflecting direct solute-cosolvent interactions. The latter is viewed as an exchange reaction in which the binding of a cosolvent to a solute is accompanied by release of waters of hydration to the bulk. Our model clearly differentiates between the stoichiometric and non-stoichiometric interactions of solvent or cosolvent molecules with a solute. We analyzed the urea- and glycine betaine (GB)-induced conformational transitions of model proteins of varying size which are geometrically approximated by a sphere in their native state and a spherocylinder in their unfolded state. The free energy of cavity formation and its changes accompanying protein transitions were computed based on the concepts of scaled particle theory. The free energies of direct solute-cosolvent interactions were analyzed using empirical parameters previously determined for urea and GB interactions with low-molecular-weight model compounds. Our computations correctly capture the mode of action of urea and GB and yield realistic numbers for (∂ΔG/∂a{sub 3}){sub T,P}, which are related to the m-values of protein denaturation. Urea is characterized by negative values of (∂ΔG/∂a{sub 3}){sub T,P} within the entire range of urea concentrations analyzed. At concentrations below ~1 M, GB exhibits positive values of (∂ΔG/∂a{sub 3}){sub T,P}, which turn negative at higher GB concentrations. The balance between the thermodynamic contributions of cavity formation and direct solute-cosolvent interactions that, ultimately, defines the mode of cosolvent action is extremely subtle. A 20% increase or decrease in the equilibrium constant for solute-cosolvent binding may change the sign of (∂ΔG/∂a{sub 3}){sub T,P}, thereby altering the mode of cosolvent action (stabilizing to destabilizing or vice versa).

  12. Theoretical and experimental study on regenerative rotary displacer Stirling engine

    SciTech Connect (OSTI)

    Raggi, L.; Katsuta, Masafumi; Isshiki, Naotsugu; Isshiki, Seita

    1997-12-31

    Recently a quite new type of hot air engine, called the rotary displacer engine, in which the displacer is a rotating disk enclosed in a cylinder, has been conceived and developed. The working gas, contained in a notch excavated in the disk, is heated and cooled alternately by the heat transferred through the enclosing cylinder, which is heated at one side and cooled at the opposite one. The gas temperature oscillations cause pressure fluctuations that extract mechanical power through a power piston. In order to increase the performance of this kind of engine, the authors propose three different regeneration methods. The first one comprises two coaxial disks that, revolving in opposite directions, cause a temperature gradient on the cylinder wall and regenerative axial heat conduction through fins shaped on the cylinder inner wall. The other two methods are based on heat transferred by a closed circuit that in one case has a circulating liquid inside and in the other is formed by several heat pipes, each working at a different temperature. An engine based on the first principle, the Regenerative Tandem Contra-Rotary Displacer Stirling Engine, has been built and tested. In this paper, experimental results with and without regeneration are reported comparatively, with a detailed description of the unit. A basic explanation of the working principle of this engine and a theoretical analysis investigating the main parameters influencing the regenerative effect are given. These new rotary displacer Stirling engines, because of their simplicity, are expected to attain high rotational speeds, especially for applications as demonstration and hobby units.

  13. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to development of tools to enhance performance and reliability, and reduce emissions of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, metrics which the algorithms will output, and inputs required by each algorithm.
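    As a hedged illustration of the kind of metric such performance-monitoring algorithms output, here is a sketch of one common CHP quantity (fuel-utilization effectiveness) with a simple degradation alert. The function names, readings, and threshold are hypothetical, not taken from the PNNL specification:

```python
def chp_efficiency(power_kw, heat_kw, fuel_kw):
    """Fuel-utilization effectiveness: the fraction of fuel energy that is
    delivered as electricity plus recovered useful heat."""
    return (power_kw + heat_kw) / fuel_kw

def degradation_alert(measured, expected, tolerance=0.05):
    """Flag a unit whose measured effectiveness falls more than `tolerance`
    (absolute) below its expected baseline value."""
    return (expected - measured) > tolerance

# Illustrative readings: 800 kW electric, 1400 kW recovered heat, 3000 kW fuel.
eff = chp_efficiency(800.0, 1400.0, 3000.0)       # about 0.733
alert = degradation_alert(eff, expected=0.80)     # flagged: below baseline
```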

  14. Spatio-spectral image analysis using classical and neural algorithms

    SciTech Connect (OSTI)

    Roberts, S.; Gisler, G.R.; Theiler, J.

    1996-12-31

    Remote imaging at high spatial resolution has a number of environmental, industrial, and military applications. Analysis of high-resolution multi-spectral images usually involves either spectral analysis of single pixels in a multi- or hyper-spectral image or spatial analysis of multi-pixels in a panchromatic or monochromatic image. Although insufficient for some pattern recognition applications individually, the combination of spatial and spectral analytical techniques may allow the identification of more complex signatures that might not otherwise be manifested in the individual spatial or spectral domains. We report on some preliminary investigation of unsupervised classification methodologies (using both "classical" and "neural" algorithms) to identify potentially revealing features in these images. We apply dimension-reduction preprocessing to the images, cluster, and compare the clusterings obtained by different algorithms. Our classification results are analyzed both visually and with a suite of objective, quantitative measures.
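    A minimal sketch of the kind of unsupervised ("classical") clustering step described, here plain k-means in pure Python on toy 2-D feature vectors; the data and parameters are illustrative stand-ins for per-pixel spectral features, not from the study:

```python
import random

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        for c, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[c] = tuple(sum(m[d] for m in members) / len(members)
                                     for d in range(len(members[0])))
    return centroids, clusters

# Two well-separated blobs standing in for per-pixel feature vectors.
random.seed(0)
blob = lambda cx, cy: [(cx + random.gauss(0, 0.2), cy + random.gauss(0, 0.2))
                       for _ in range(50)]
points = blob(0.0, 0.0) + blob(5.0, 5.0)
centroids, clusters = kmeans(points, k=2)
```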

  15. Parallel Algorithms for Graph Optimization using Tree Decompositions

    SciTech Connect (OSTI)

    Sullivan, Blair D; Weerapurage, Dinesh P; Groer, Christopher S

    2012-06-01

    Although many NP-hard graph optimization problems can be solved in polynomial time on graphs of bounded tree-width, the adoption of these techniques into mainstream scientific computation has been limited due to the high memory requirements of the necessary dynamic programming tables and excessive runtimes of sequential implementations. This work addresses both challenges by proposing a set of new parallel algorithms for all steps of a tree decomposition-based approach to solve the maximum weighted independent set problem. A hybrid OpenMP/MPI implementation includes a highly scalable parallel dynamic programming algorithm leveraging the MADNESS task-based runtime, and computational results demonstrate scaling. This work enables a significant expansion of the scale of graphs on which exact solutions to maximum weighted independent set can be obtained, and forms a framework for solving additional graph optimization problems with similar techniques.
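    The serial core of this approach, dynamic programming for maximum weighted independent set, can be sketched for the simplest case: a tree itself (tree-width 1), leaving out both the general tree-decomposition machinery and the parallel OpenMP/MPI layer. This toy version is an illustration, not the authors' implementation:

```python
def max_weight_independent_set(adj, weight, root=0):
    """Bottom-up DP on a tree (the tree-width-1 case of decomposition-based DP):
    for each vertex, best total weight including vs. excluding that vertex."""
    parent, order = {root: None}, [root]
    for v in order:                      # BFS to get a root-to-leaves order
        for u in adj[v]:
            if u not in parent:
                parent[u] = v
                order.append(u)
    incl, excl = {}, {}
    for v in reversed(order):            # leaves first, root last
        children = [u for u in adj[v] if parent.get(u) == v]
        incl[v] = weight[v] + sum(excl[c] for c in children)
        excl[v] = sum(max(incl[c], excl[c]) for c in children)
    return max(incl[root], excl[root])

# Path 0-1-2-3 with weights 3, 10, 4, 5: the optimum picks vertices 1 and 3.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
best = max_weight_independent_set(adj, {0: 3, 1: 10, 2: 4, 3: 5})  # 15
```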

  16. APPLICATION OF NEURAL NETWORK ALGORITHMS FOR BPM LINEARIZATION

    SciTech Connect (OSTI)

    Musson, John C.; Seaton, Chad; Spata, Mike F.; Yan, Jianxun

    2012-11-01

    Stripline BPM sensors contain inherent non-linearities, as a result of field distortions from the pickup elements. Many methods have been devised to facilitate corrections, often employing polynomial fitting. The cost of computation makes real-time correction difficult, particularly when integer math is utilized. The application of neural-network technology, particularly the multi-layer perceptron algorithm, is proposed as an efficient alternative for electrode linearization. A process of supervised learning is initially used to determine the weighting coefficients, which are subsequently applied to the incoming electrode data. A non-linear layer, known as an "activation layer," is responsible for the removal of saturation effects. Implementation of a perceptron in an FPGA-based software-defined radio (SDR) is presented, along with performance comparisons. In addition, efficient calculation of the sigmoidal activation function via the CORDIC algorithm is presented.
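    A minimal forward pass of the multi-layer perceptron described (one sigmoid hidden layer feeding a linear output layer) can be sketched as follows. The weights below are hypothetical placeholders standing in for coefficients the supervised-learning step would produce, and the plain `math.exp` sigmoid stands in for the FPGA's CORDIC evaluation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """One sigmoid hidden layer followed by a linear output layer."""
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    return [sum(wi * hi for wi, hi in zip(w, hidden)) + b
            for w, b in zip(w_out, b_out)]

# Hypothetical weights: 4 electrode amplitudes in, 2 beam coordinates out.
w_hidden = [[0.5, -0.5, 0.5, -0.5],
            [0.25, 0.25, -0.25, -0.25],
            [0.1, 0.2, 0.3, 0.4]]
b_hidden = [0.0, 0.1, -0.1]
w_out = [[1.0, -1.0, 0.5],
         [0.5, 1.0, -1.0]]
b_out = [0.0, 0.0]
position = mlp_forward([0.9, 0.8, 0.7, 0.6], w_hidden, b_hidden, w_out, b_out)
```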

  17. A spectral unaveraged algorithm for free electron laser simulations

    SciTech Connect (OSTI)

    Andriyash, I.A.; Lehe, R.; Malka, V.

    2015-02-01

    We propose and discuss a numerical method to model electromagnetic emission from oscillating relativistic charged particles and its coherent amplification. The developed technique is well suited for free electron laser simulations, but it may also be useful for a wider range of physical problems involving resonant field-particle interactions. The algorithm integrates the unaveraged coupled equations for the particles and the electromagnetic fields in a discrete spectral domain. Using this algorithm, it is possible to perform full three-dimensional or axisymmetric simulations of short-wavelength amplification. In this paper we describe the method and its implementation, and we present examples of free electron laser simulations, comparing the results with those provided by commonly known free electron laser codes.

  18. Optimized Algorithm for Collision Probability Calculations in Cubic Geometry

    SciTech Connect (OSTI)

    Garcia, R.D.M.

    2004-06-15

    An optimized algorithm for implementing a recently developed method of computing collision probabilities (CPs) in three dimensions is reported in this work for the case of a homogeneous cube. Use is made of the geometrical regularity of the domain to rewrite, in a very compact way, the approximate formulas for calculating CPs in general three-dimensional geometry that were derived in a previous work by the author. The ensuing gain in computation time is found to be substantial: While the computation time associated with the general formulas increases as K{sup 2}, where K is the number of elements used in the calculation, that of the specific formulas increases only linearly with K. Accurate numerical results are given for several test cases, and an extension of the algorithm for computing the self-collision probability for a hexahedron is reported at the end of the work.

  19. Comparison of Theoretical Efficiencies of Multi-junction Concentrator Solar Cells

    SciTech Connect (OSTI)

    Kurtz, S.; Myers, D.; McMahon, W. E.; Geisz, J.; Steiner, M.

    2008-01-01

    Champion concentrator cell efficiencies have surpassed 40% and now many are asking whether the efficiencies will surpass 50%. Theoretical efficiencies of >60% are described for many approaches, but there is often confusion about the theoretical efficiency for a specific structure. The detailed balance approach to calculating theoretical efficiency gives an upper bound that can be independent of material parameters and device design. Other models predict efficiencies that are closer to those that have been achieved. Changing reference spectra and the choice of concentration further complicate comparison of theoretical efficiencies. This paper provides a side-by-side comparison of theoretical efficiencies of multi-junction solar cells calculated with the detailed balance approach and a common one-dimensional-transport model for different spectral and irradiance conditions. Also, historical experimental champion efficiencies are compared with the theoretical efficiencies.

  20. Invariant patterns in crystal lattices: Implications for protein folding algorithms

    SciTech Connect (OSTI)

    Hart, W.E.; Istrail, S.

    1995-12-11

    Crystal lattices are infinite periodic graphs that occur naturally in a variety of geometries and which are of fundamental importance in polymer science. Discrete models of protein folding use crystal lattices to define the space of protein conformations. Because various crystal lattices provide discretizations of the same physical phenomenon, it is reasonable to expect that there will exist "invariants" across lattices that define fundamental properties of the protein folding process; an invariant defines a property that transcends particular lattice formulations. This paper identifies two classes of invariants, defined in terms of sublattices, that are related to the design of algorithms for the structure prediction problem. The first class of invariants is used to define a master approximation algorithm for which provable performance guarantees exist. This algorithm can be applied to generalizations of the hydrophobic-hydrophilic model that have lattices other than the cubic lattice, including most of the crystal lattices commonly used in protein folding lattice models. The second class of invariants applies to a related lattice model. Using these invariants, we show that for this model the structure prediction problem is intractable across a variety of three-dimensional lattices. It turns out that these two classes of invariants are respectively sublattices of the two- and three-dimensional square lattice. As the square lattices are the standard lattices used in empirical protein folding studies, our results provide a rigorous confirmation of the ability of these lattices to provide insight into biological phenomena. Our results are the first in the literature that identify algorithmic paradigms for the protein structure prediction problem which transcend particular lattice formulations.
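    For readers unfamiliar with the hydrophobic-hydrophilic (HP) lattice model mentioned above, a minimal energy evaluation on the cubic lattice looks like the sketch below; the sequence and conformation are toy examples, not from the paper:

```python
def hp_energy(sequence, coords):
    """Energy of an HP-model conformation on the cubic lattice: -1 for every
    pair of hydrophobic (H) residues that are lattice neighbors but not
    adjacent along the chain."""
    pos = {c: i for i, c in enumerate(coords)}
    assert len(pos) == len(coords), "conformation is not self-avoiding"
    energy = 0
    for i, (x, y, z) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        # Only look in the three positive directions so each pair counts once.
        for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
            j = pos.get((x + dx, y + dy, z + dz))
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                energy -= 1
    return energy

# A 4-residue chain folded into a square: the two H ends become lattice
# neighbors, giving one H-H contact.
e = hp_energy("HPPH", [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)])  # -1
```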

  1. High-Resolution Computational Algorithms for Simulating Offshore Wind Farms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  2. Physics-based signal processing algorithms for micromachined cantilever arrays

    DOE Patents [OSTI]

    Candy, James V; Clague, David S; Lee, Christopher L; Rudd, Robert E; Burnham, Alan K; Tringe, Joseph W

    2013-11-19

    A method of using physics-based signal processing algorithms for micromachined cantilever arrays. The methods utilize deflection of a micromachined cantilever that represents the chemical, biological, or physical element being detected. One embodiment of the method comprises the steps of modeling the deflection of the micromachined cantilever producing a deflection model, sensing the deflection of the micromachined cantilever and producing a signal representing the deflection, and comparing the signal representing the deflection with the deflection model.
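    A hedged sketch of the final comparison step, measured deflection versus model prediction, using a simple chi-square-style residual test; the statistics, threshold, and signal values here are illustrative assumptions, not the patented method:

```python
import math
import random

def detect(signal, model, noise_sigma, threshold=5.0):
    """Compare a measured deflection signal against the model prediction and
    flag a detection when the residual energy is implausibly large for noise
    alone (a simple chi-square-style test at roughly 5 sigma)."""
    n = len(signal)
    residual = [s - m for s, m in zip(signal, model)]
    # Under the no-target hypothesis the residual is pure noise, so the
    # normalized sum of squares has mean n and standard deviation sqrt(2n).
    stat = sum(r * r for r in residual) / noise_sigma ** 2
    score = (stat - n) / math.sqrt(2.0 * n)
    return score > threshold

# Model: cantilever at rest.  One measurement is noise only; the other adds a
# constant 0.5 (arbitrary units) deflection representing a binding event.
random.seed(1)
model = [0.0] * 100
noise = [random.gauss(0.0, 0.1) for _ in range(100)]
quiet = noise
deflected = [0.5 + v for v in noise]
alarm_quiet = detect(quiet, model, noise_sigma=0.1)
alarm_deflected = detect(deflected, model, noise_sigma=0.1)
```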

  3. ANL CT Reconstruction Algorithm for Utilizing Digital X-ray

    Energy Science and Technology Software Center (OSTI)

    2004-05-01

    Reconstructs X-ray computed tomographic images from large data sets known as 16-bit binary sinograms on a massively parallel computer architecture, such as a Beowulf cluster, by parallelizing the X-ray CT reconstruction routine. The algorithm generates an image from carefully obtained multiple 1-D or 2-D X-ray projections. The individual projections are filtered using a digital Fast Fourier Transform. The literature refers to this as filtered back projection.
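
    The filtered back projection procedure can be sketched compactly. Below is a minimal serial version for parallel-beam geometry (our own sketch, not the ANL parallel code): ramp-filter each 1-D projection with an FFT, then smear the filtered values back across the image along the projection direction.

```python
# Minimal filtered back projection for parallel-beam geometry (sketch).
import numpy as np

def fbp(sinogram, angles_deg):
    """sinogram: (n_angles, n_detectors) array; returns a square image."""
    n_ang, n_det = sinogram.shape
    ramp = np.abs(np.fft.fftfreq(n_det))          # ideal ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    c = (n_det - 1) / 2.0
    ys, xs = np.mgrid[0:n_det, 0:n_det] - c
    image = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # detector coordinate of each pixel for this view
        s = xs * np.cos(theta) + ys * np.sin(theta) + c
        image += np.interp(s.ravel(), np.arange(n_det), proj,
                           left=0.0, right=0.0).reshape(n_det, n_det)
    return image * np.pi / (2 * n_ang)

# A point at the detector center back-projects to the image center:
sino = np.zeros((36, 65))
sino[:, 32] = 1.0
img = fbp(sino, np.arange(0, 180, 5))
r, c = np.unravel_index(np.argmax(img), img.shape)
print(int(r), int(c))  # 32 32
```

    The parallelization described in the record distributes work of this kind (per-projection filtering and per-view back projection) across cluster nodes.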

  4. Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01

    Broader source: Energy.gov [DOE]


  5. The power of simplification: Operator interface with the AP1000{sup R} during design-basis and beyond design-basis events

    SciTech Connect (OSTI)

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000{sup R} plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. 
If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible, using components that have been designed to be reliable in these conditions. The primary goal of any such actions is to maintain or refill the passive inventory available to cool the core, containment, and spent fuel pool in the safety-related and seismically qualified Passive Containment Cooling Water Storage Tank (PCCWST). The seismically qualified, ground-mounted Passive Containment Cooling Ancillary Water Storage Tank (PCCAWST) is also available for this function as appropriate. The primary effect of these actions would be to increase the coping time for the AP1000 during design-basis events, as well as events such as those described above, from 72 hours without operator intervention to 7 days with minimal operator actions. The operator actions necessary to protect the health and safety of the public are addressed in the Post-72 Hour procedures, as well as some EOPs, AOPs, ARPs, and the Severe Accident Management Guidelines (SAMGs). Should the event continue to become more severe and plant conditions degrade further with indications of inadequate core cooling, the SAMGs provide guidance for strategies to address these hypothetical severe accident conditions. The AP1000 SAMG diagnoses and actions are prioritized to first utilize the AP1000 features that are expected to retain a damaged core inside the reactor vessel. Only one strategy is undertaken at any time. This strategy is followed and its effectiveness evaluated before other strategies are undertaken.
This is a key feature of both the symptom-oriented AP1000 EOPs and the AP1000 SAMGs, which maximizes the probability of retaining a damaged core inside the reactor vessel and containment while minimizing the chances for confusion and human error during implementation. The AP1000 SAMGs are simple and straightforward and have been developed with considerable input from human factors and plant operations experts. Most importantly, and differently from severe accident management strategies for other plants, the AP1000 SAMGs do not require diagnosis of the location of the core (i.e., whether reactor vessel failure has occurred).

  6. Time-dependent density functional theory quantum transport simulation in non-orthogonal basis

    SciTech Connect (OSTI)

    Kwok, Yan Ho; Xie, Hang; Yam, Chi Yung; Chen, Guan Hua; Zheng, Xiao

    2013-12-14

    Building on earlier work on the hierarchical equations of motion for quantum transport, we present in this paper a first-principles scheme for time-dependent quantum transport that combines time-dependent density functional theory (TDDFT) with Keldysh's non-equilibrium Green's function formalism. This scheme goes beyond the wide band limit approximation and is directly applicable to the case of a non-orthogonal basis without the need for a basis transformation. The overlap between the basis functions in the lead and the device region is treated properly by including it in the self-energy, and this approach can be shown to be equivalent to a lead-device orthogonalization. The scheme has been implemented at both the TDDFT and density functional tight-binding levels. Simulation results are presented to demonstrate our method, and a comparison with the wide band limit approximation is made. Finally, the sparsity of the matrices and the computational complexity of the method are analyzed.

  7. A fast contour descriptor algorithm for supernova image classification

    SciTech Connect (OSTI)

    Aragon, Cecilia R.; Aragon, David Bradburn

    2006-07-16

    We describe a fast contour descriptor algorithm and its application to a distributed supernova detection system (the Nearby Supernova Factory) that processes 600,000 candidate objects in 80 GB of image data per night. Our shape-detection algorithm reduced the number of false positives generated by the supernova search pipeline by 41% while producing no measurable impact on running time. Fourier descriptors are an established method of numerically describing the shapes of object contours, but transform-based techniques are ordinarily avoided in this type of application due to their computational cost. We devised a fast contour descriptor implementation for supernova candidates that meets the tight processing budget of the application. Using the lowest-order descriptors (F{sub 1} and F{sub -1}) and the total variance in the contour, we obtain one feature representing the eccentricity of the object and another denoting its irregularity. Because the number of Fourier terms to be calculated is fixed and small, the algorithm runs in linear time, rather than the O(n log n) time of an FFT. Constraints on object size allow further optimizations so that the total cost of producing the required contour descriptors is about 4n addition/subtraction operations, where n is the length of the contour.
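
    Because only the lowest-order descriptors are needed, they can be computed directly in O(n) without a full FFT, as the abstract notes. The sketch below makes an assumption about how the features combine |F_1|, |F_-1|, and the total variance; the exact feature definitions in the pipeline may differ.

```python
# Lowest-order Fourier descriptors of a closed contour, computed directly
# in O(n). The eccentricity/irregularity feature definitions here are
# illustrative assumptions, not the paper's exact formulas.
import numpy as np

def contour_features(xs, ys):
    z = np.asarray(xs) + 1j * np.asarray(ys)
    z = z - z.mean()                        # translation invariance
    n = z.size
    k = np.arange(n)
    f_pos = np.abs(np.sum(z * np.exp(-2j * np.pi * k / n))) / n   # |F_1|
    f_neg = np.abs(np.sum(z * np.exp(+2j * np.pi * k / n))) / n   # |F_-1|
    total_var = np.mean(np.abs(z) ** 2)
    ecc = f_neg / (f_pos + 1e-12)           # 0 for a circle traced CCW
    irregularity = 1.0 - (f_pos ** 2 + f_neg ** 2) / (total_var + 1e-12)
    return ecc, irregularity

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
ecc, irr = contour_features(np.cos(theta), np.sin(theta))  # unit circle: both ~0
```

    For a 2:1 ellipse the eccentricity feature evaluates to 1/3, since the contour decomposes into 1.5 e^{i\theta} + 0.5 e^{-i\theta}; energy outside the two lowest modes raises the irregularity feature.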

  8. GMG: A Guaranteed, Efficient Global Optimization Algorithm for Remote Sensing.

    SciTech Connect (OSTI)

    D'Helon, CD

    2004-08-18

    The monocular passive ranging (MPR) problem in remote sensing consists of identifying the precise range of an airborne target (missile, plane, etc.) from its observed radiance. This inverse problem may be cast as a global optimization problem (GOP) in which the difference between the observed and model-predicted radiances is minimized over the possible ranges and atmospheric conditions. Using additional information about the error function between the predicted and observed radiances of the target, we developed GMG, a new algorithm to find the Global Minimum with a Guarantee. The new algorithm transforms the original continuous GOP into a discrete search problem, thereby guaranteeing to find the position of the global minimum in a reasonably short time. The algorithm is first applied to the golf course problem, which serves as a litmus test for its performance in the presence of both complete and degraded additional information. GMG is further assessed on a set of standard benchmark functions and then applied to various realizations of the MPR problem.
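
    The continuous-to-discrete idea can be illustrated generically. The guarantee below rests on a textbook Lipschitz argument, which is our assumption for illustration; GMG itself exploits richer information about the error function than a single Lipschitz constant.

```python
# Generic sketch: if the error function f has known Lipschitz constant L,
# evaluating it on a grid of spacing h = 2*tol/L brackets the global minimum
# value within tol, turning the continuous problem into a finite search.
import numpy as np

def grid_global_min(f, lo, hi, lipschitz, tol):
    h = 2 * tol / lipschitz                 # grid spacing achieving tolerance
    xs = np.arange(lo, hi + h, h)
    vals = np.array([f(x) for x in xs])
    i = int(np.argmin(vals))
    return xs[i], vals[i]                   # within tol of the true minimum value

# "Golf course" flavor: a nearly flat landscape with one narrow hole.
f = lambda x: 1.0 - np.exp(-((x - 2.7) ** 2) / 0.01)
x_star, f_star = grid_global_min(f, 0.0, 5.0, lipschitz=15.0, tol=0.01)
print(round(x_star, 2))  # 2.7
```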

  9. Theoretical Condensed Matter Physics | U.S. DOE Office of Science (SC)

    Office of Science (SC) Website

    This research area supports theoretical condensed matter physics emphasizing theory, modeling, and simulation.

  10. Computational and Theoretical Chemistry | U.S. DOE Office of Science (SC)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Research in Computational and Theoretical Chemistry emphasizes integration and development of new and existing ...

  11. Theoretical study of Ag- and Au-filled skutterudites. | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Uses ab initio atomistic DFT modeling as implemented in VASP to determine theoretical values of thermoelectric properties for Ag-filled skutterudites.

  12. ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting Operations

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site


  13. Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material

    Office of Scientific and Technical Information (OSTI)


  14. Numerical method to test a theoretical model of the quantum interference effect in layered manganites

    Office of Scientific and Technical Information (OSTI)


  15. Structural Basis for Microcin C7 Inactivation by the MccE Acetyltransferase

    Office of Scientific and Technical Information (OSTI)

    The antibiotic microcin C7 (McC) acts as a bactericide by inhibiting aspartyl-tRNA synthetase and stalling the protein translation machinery. McC is synthesized as a heptapeptide-nucleotide conjugate, which is ...

  16. Structural Basis of Selective Ubiquitination of TRF1 by SCF[superscript Fbx4]

    Office of Scientific and Technical Information (OSTI)

    TRF1 is a critical regulator of telomere length. As such, TRF1 levels are regulated by ubiquitin-dependent proteolysis via an SCF E3 ligase where Fbx4 contributes to substrate specification. Here, we report ...

  17. The Structural Basis for Tight Control of PP2A Methylation and Function by LCMT-1

    Office of Scientific and Technical Information (OSTI)

    Proper formation of protein phosphatase 2A (PP2A) holoenzymes is essential for the fitness of all eukaryotic cells. Carboxyl methylation of the PP2A catalytic subunit plays a critical role in regulating holoenzyme assembly; methylation is ...

  18. Analytic matrix elements for the two-electron atomic basis with logarithmic terms

    SciTech Connect (OSTI)

    Liverts, Evgeny Z.; Barnea, Nir

    2014-08-01

    The two-electron problem for helium-like atoms in the S state is considered. The basis containing the integer powers of ln r, where r is a radial variable of the Fock expansion, is studied. In this basis, the analytic expressions for the matrix elements of the corresponding Hamiltonian are presented. These expressions include only elementary and special functions, which enables very fast and accurate computation of the matrix elements. The decisive contribution of the correct logarithmic terms to the behavior of the two-electron wave function in the vicinity of the triple-coalescence point is reaffirmed.

  19. Technical Basis Spent Nuclear Fuel (SNF) Project Radiation and Contamination Trending Program

    SciTech Connect (OSTI)

    ELGIN, J.C.

    2000-10-02

    This report documents the technical basis for the Spent Nuclear Fuel (SNF) Program radiation and contamination trending program. The program consists of standardized radiation and contamination surveys of the KE Basin, radiation surveys of the KW Basin, radiation surveys of the Cold Vacuum Drying Facility (CVD), and radiation surveys of the Canister Storage Building (CSB), with the associated tracking. This report also discusses the remainder of radiological areas within the SNFP that do not have standardized trending programs and the basis for not having this program in those areas.

  20. Structural Basis of Wnt Signaling Inhibition by Dickkopf Binding to LRP5/6

    Office of Scientific and Technical Information (OSTI)

    Authors: Ahn, Victoria E.; Chu, Matthew Ling-Hon; Choi, Hee-Jung; Tran, Denise; Abo, Arie; Weis, William I. Published 2011-11-01 in Developmental Cell, Volume 21. OSTI Identifier: 1198118.

  1. Structural basis of GSK-3 inhibition by N-terminal phosphorylation and by the Wnt receptor LRP6

    Office of Scientific and Technical Information (OSTI)

    Authors: Stamos, Jennifer L.; Chu, Matthew Ling-Hon; Enos, Michael D.; Shah, Niket; Weis, William I. (Stanford). Publication Date: 2015-02-19. OSTI Identifier: 1168492.

  2. Basis for Identification of Disposal Options for R and D for Spent Nuclear Fuel and High-Level Waste

    Energy Savers [EERE]

    The Used Fuel Disposition campaign (UFD) is selecting a set of geologic media for further study that spans a suite of behavior characteristics that impose a broad range of potential conditions on the design of the repository, the engineered ...

  3. Enhanced Algorithm for Traceability Measurements in UF6 Flow Pipe

    SciTech Connect (OSTI)

    Copinger, Thomas E; March-Leuba, Jose A; Upadhyaya, Belle R

    2007-01-01

    The Blend Down Monitoring System (BDMS) is used to continually assess the mixing and downblending of highly enriched uranium (HEU) with low-enriched uranium (LEU). This is accomplished by measuring the enrichment and the fissile mass flow rate of the UF{sub 6} gas located in each process pipe of the system by inducing the fission of the {sup 235}U contained in the gas. Measurements are taken along this process route to trace the HEU content all the way to the product stream, ensuring that the HEU was downblended. A problem with the current traceability-measurement algorithm is that it does not account for the time-varying background introduced to the system by the movement of the shutter located at the HEU leg of the process. The current way of dealing with that problem is to discard the data for periods when the HEU shutter is open (50% of the overall data), because those periods correlate with the timeframe in which the direct contribution to background from the HEU shutter is seen. The advanced algorithm presented in this paper allows for continuous (100%) measurement of traceability by accurately accounting for the varying background during the shutter-movement cycle. This algorithm utilizes advanced processing techniques that identify and discriminate the different sources of background radiation, instead of grouping them into one background group for the whole measurement cycle. By using this additional information, the traceability measurement statistics are based on a greater number of measured values, thus improving the overall usefulness of these measurements in the BDMS. The effectiveness of the new algorithm was determined by modeling it in a simulation and ensuring that it retained its integrity through a large number of runs, including various shutter-failure conditions. Each run was performed with varying amounts of background radiation from each individual source and with varying traceability counts.
The simulations documented in this paper prove that the algorithm can stand up to various transients introduced into the system, such as failure of shutter movement.
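
    The core idea of accounting for shutter-state-dependent background, rather than discarding shutter-open data, can be sketched as follows. This is a toy illustration with assumed reference rates, not the BDMS algorithm itself.

```python
# Toy illustration (not the BDMS algorithm): estimate a separate background
# level per shutter state and subtract the appropriate one, so counts taken
# while the HEU shutter is open still contribute to the measurement.
import numpy as np

def net_signal(counts, shutter_open, bkg_closed_ref, bkg_open_ref):
    """counts: measured count rates; shutter_open: boolean array per sample;
    *_ref: assumed background-only reference rates for each shutter state."""
    counts = np.asarray(counts, dtype=float)
    bkg = np.where(shutter_open, bkg_open_ref, bkg_closed_ref)
    return counts - bkg

rates = [12.0, 12.5, 20.0, 19.5]               # last two taken shutter-open
state = np.array([False, False, True, True])
net = net_signal(rates, state, bkg_closed_ref=2.0, bkg_open_ref=9.5)
# net rates: 10.0, 10.5, 10.5, 10.0 - consistent across shutter states
```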

  4. An efficient algorithm for incompressible N-phase flows

    SciTech Connect (OSTI)

    Dong, S.

    2014-11-01

    We present an efficient algorithm within the phase field framework for simulating the motion of a mixture of N (N ≥ 2) immiscible incompressible fluids, with possibly very different physical properties such as densities, viscosities, and pairwise surface tensions. The algorithm employs a physical formulation for the N-phase system that honors the conservations of mass and momentum and the second law of thermodynamics. We present a method for uniquely determining the mixing energy density coefficients involved in the N-phase model based on the pairwise surface tensions among the N fluids. Our numerical algorithm has several attractive properties that make it computationally very efficient: (i) it has completely de-coupled the computations for different flow variables, and has also completely de-coupled the computations for the (N − 1) phase field functions; (ii) the algorithm only requires the solution of linear algebraic systems after discretization, and no nonlinear algebraic solve is needed; (iii) for each flow variable the linear algebraic system involves only constant and time-independent coefficient matrices, which can be pre-computed during pre-processing, despite the variable density and variable viscosity of the N-phase mixture; (iv) within a time step the semi-discretized system involves only individual de-coupled Helmholtz-type (including Poisson) equations, despite the strongly-coupled phase field system of fourth spatial order at the continuum level; (v) the algorithm is suitable for large density contrasts and large viscosity contrasts among the N fluids. Extensive numerical experiments have been presented for several problems involving multiple fluid phases, large density contrasts and large viscosity contrasts. In particular, we compare our simulations with the de Gennes theory, and demonstrate that our method produces physically accurate results for multiple fluid phases.
We also demonstrate the significant and sometimes dramatic effects of the gravity, density ratios, pairwise surface tensions, and drop sizes on the N-phase configurations and dynamics. The numerical results show that the method developed herein is capable of dealing with N-phase systems with large density ratios, large viscosity ratios, and pairwise surface tensions, and that it can be a powerful tool for studying the interactions among multiple types of fluid interfaces.

  5. Design-Basis Flood Estimation for Site Characterization at Nuclear Power Plants in the United States of America

    SciTech Connect (OSTI)

    Prasad, Rajiv; Hibler, Lyle F.; Coleman, Andre M.; Ward, Duane L.

    2011-11-01

    The purpose of this document is to describe approaches and methods for estimation of the design-basis flood at nuclear power plant sites. Chapter 1 defines the design-basis flood and lists the U.S. Nuclear Regulatory Commission's (NRC) regulations that require estimation of the design-basis flood. For comparison, the design-basis flood estimation methods used by other Federal agencies are also described. A brief discussion of the recommendations of the International Atomic Energy Agency for estimation of the design-basis floods in its member States is also included.

  6. CRAD, Safety Basis- Oak Ridge National Laboratory TRU ALPHA LLWT Project

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a November 2003 assessment of the Safety Basis portion of an Operational Readiness Review of the Oak Ridge National Laboratory TRU ALPHA LLWT Project.

  7. CRAD, Safety Basis- Y-12 Enriched Uranium Operations Oxide Conversion Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a January 2005 assessment of the Safety Basis at the Y-12 - Enriched Uranium Operations Oxide Conversion Facility.

  8. CRAD, Safety Basis- Los Alamos National Laboratory Waste Characterization, Reduction, and Repackaging Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for an assessment of the Safety Basis portion of an Operational Readiness Review at the Los Alamos National Laboratory Waste Characterization, Reduction, and Repackaging Facility.

  9. CRAD, Safety Basis- Oak Ridge National Laboratory High Flux Isotope Reactor Contractor ORR

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2007 assessment of the Safety Basis portion of an Operational Readiness Review of the Oak Ridge National Laboratory High Flux Isotope Reactor.

  10. CRAD, Safety Basis- Oak Ridge National Laboratory High Flux Isotope Reactor

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2007 assessment of the Safety Basis in preparation for restart of the Oak Ridge National Laboratory High Flux Isotope Reactor.

  11. Sensitivity of the Properties of Ruthenium Blue Dimer to Method, Basis Set, and Continuum Model

    SciTech Connect (OSTI)

    Ozkanlar, Abdullah; Clark, Aurora E.

    2012-05-23

    The ruthenium blue dimer [(bpy)2RuIIIOH2]2O4+ is best known as the first well-defined molecular catalyst for water oxidation. It has been subject to numerous computational studies primarily employing density functional theory. However, those studies have been limited in the functionals, basis sets, and continuum models employed. The controversy in the calculated electronic structure and the reaction energetics of this catalyst highlights the necessity of benchmark calculations that explore the role of density functionals, basis sets, and continuum models upon the essential features of blue-dimer reactivity. In this paper, we report Kohn-Sham complete basis set (KS-CBS) limit extrapolations of the electronic structure of blue dimer using GGA (BPW91 and BP86), hybrid-GGA (B3LYP), and meta-GGA (M06-L) density functionals. The dependence of solvation free energy corrections on the different cavity types (UFF, UA0, UAHF, UAKS, Bondi, and Pauling) within polarizable and conductor-like polarizable continuum model has also been investigated. The most common basis sets of double-zeta quality are shown to yield results close to the KS-CBS limit; however, large variations are observed in the reaction energetics as a function of density functional and continuum cavity model employed.
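
    For readers unfamiliar with complete-basis-set extrapolation, a common two-point inverse-cube scheme is shown below. This is a generic textbook formula, not necessarily the extrapolation protocol used in the paper.

```python
# Generic two-point CBS extrapolation: assume the basis-set error decays as
# X**-3 in the cardinal number X (e.g., X = 2 for double-zeta, 3 for
# triple-zeta), so E_CBS = (Y**3*E_Y - X**3*E_X) / (Y**3 - X**3) for Y > X.

def cbs_two_point(e_small, x_small, e_large, x_large):
    a, b = x_small ** 3, x_large ** 3
    return (b * e_large - a * e_small) / (b - a)

# Hypothetical double/triple-zeta energies (hartree), for illustration only:
print(round(cbs_two_point(-76.241, 2, -76.332, 3), 4))  # -76.3703
```

    The benchmark observation above, that double-zeta results already sit close to the KS-CBS limit, can be checked by comparing such extrapolated values against the raw small-basis energies.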

  12. CRAD, Safety Basis- Los Alamos National Laboratory TA 55 SST Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for an assessment of the Safety Basis at the Los Alamos National Laboratory TA 55 SST Facility.

  13. Margin of Safety Definition and Examples Used in Safety Basis Documents and the USQ Process

    SciTech Connect (OSTI)

    Beaulieu, R. A.

    2013-10-03

    The Nuclear Safety Management final rule, 10 CFR 830, uses an undefined term, margin of safety (MOS). Safe harbors listed in 10 CFR 830, Table 2, such as DOE-STD-3009, use but do not define the term. This lack of definition has created the need for one. This paper provides a definition of MOS and documents examples of MOS as applied in a U.S. Department of Energy (DOE) approved safety basis for an existing nuclear facility. Understanding what MOS looks like for Technical Safety Requirements (TSR) parameters helps in comparing against other parameters that do not involve a MOS. This paper also documents parameters that are not MOS. These criteria could be used to determine whether an MOS exists in safety basis documents. This paper helps DOE, including the National Nuclear Security Administration (NNSA), and its contractors responsible for the safety basis improve safety basis documents and the unreviewed safety question (USQ) process with respect to MOS.

  14. Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.

    SciTech Connect (OSTI)

    Thompson, Aidan P.; Schultz, Peter A.; Crozier, Paul; Moore, Stan Gerald; Swiler, Laura Painton; Stephens, John Adam; Trott, Christian Robert; Foiles, Stephen M.; Tucker, Garritt J.

    2014-09-01

    This report summarizes the results of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called the Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO2).
We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel computers and advanced processor architectures. Finally, we briefly describe the MSM method for efficient calculation of electrostatic interactions on massively parallel computers.
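
    The weighted least-squares fitting step described above can be sketched with synthetic descriptors standing in for bispectrum components. This is purely illustrative; real SNAP fits use FitSnap.py against LAMMPS/QM data.

```python
# Sketch of weighted linear least squares: fit coefficients so that
# descriptor sums reproduce reference energies. The descriptor matrix here
# is synthetic, not real bispectrum components.
import numpy as np

rng = np.random.default_rng(1)
n_cfg, n_desc = 50, 4
B = rng.standard_normal((n_cfg, n_desc))        # descriptors per configuration
beta_true = np.array([1.5, -0.7, 0.3, 2.0])     # "true" potential coefficients
E = B @ beta_true + 0.01 * rng.standard_normal(n_cfg)   # noisy reference energies
w = np.full(n_cfg, 1.0)
w[:10] = 5.0                                    # up-weight some configurations

# Solve the weighted problem by scaling rows with sqrt(w):
sw = np.sqrt(w)[:, None]
beta, *_ = np.linalg.lstsq(B * sw, E * sw.ravel(), rcond=None)
# beta recovers beta_true to within the noise level
```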

  15. MODEL AND ALGORITHM EVALUATION FOR THE HYBRID UF6 CONTAINER INSPECTION SYSTEM

    SciTech Connect (OSTI)

    McDonald, Benjamin S.; Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; Smith, Leon E.; Wittman, Richard S.

    2011-06-14

    Pacific Northwest National Laboratory (PNNL) is developing an automated UF6 cylinder verification station concept based on the combined collection of traditional enrichment-meter (186 keV photons from U-235) data and non-traditional, neutron-induced, high-energy gamma signatures (3-8.5 MeV) with an array of collimated, medium-resolution scintillators. Previous (2010) work at PNNL demonstrated proof-of-principle that this hybrid method yields accurate, full-volume assay of the cylinder enrichment, reduces systematic errors when compared to several other enrichment assay methods, and provides simplified instrumentation and algorithms suitable for long-term unattended operations. We used Monte Carlo modeling with MCNP5 to support system design (e.g., number and configuration of detector arrays, and design of iron/poly collimators for enhanced (n,γ) conversion) and enrichment algorithm development. We developed a first-generation modeling framework in 2010. These tools have since been expanded, refined and benchmarked against field measurements with a prototype system of a 30B cylinder population (0.2 to 4.95 weight % U-235). The MCNP5 model decomposes the radiation transport problem into a linear superposition of basis spectra representing contributions from the different uranium isotopes and gamma-ray generation mechanisms (e.g. neutron capture). This scheme accommodates fast generation of virtual assay signatures for arbitrary enrichment, material age, and fill variations. Ongoing (FY-2011) refinements to the physics model include accounting for generation of bremsstrahlung photons, arising primarily from the beta decay of Pa-234m, a U-238 daughter. We are using the refined model to optimize collimator design for the hybrid method. The traditional assay method benefits from a high degree of collimation (to isolate each detector's field of view) and relatively small detector area, while the non-traditional method benefits from a wide field of view, i.e.
less collimation and larger detectors. We implement the enrichment-meter method by applying a square-wave digital filter to a raw spectrum and extracting the 186-keV peak area directly from the convolved spectrum. Ongoing enhancements to this approach include mitigating a systematic peak-area measurement deficit arising from curvature in the spectrum continuum shape. An optimized system prototype based on the model results is being used in a new set of 2011 field measurements, and model and measurement enrichment assay uncertainties are compared.
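The square-wave filtering idea can be illustrated with a short numpy sketch (the channel numbers, filter width, and synthetic spectrum below are illustrative, not the deployed PNNL implementation): convolving the spectrum with a zero-area, symmetric kernel cancels a linear continuum while leaving a strong response at the photopeak.

```python
import numpy as np

def square_wave_filter(width):
    # zero-area kernel: negative side lobes exactly cancel the positive center
    half = width // 2
    return np.concatenate([-np.ones(half), np.ones(width), -np.ones(half)])

def filtered_peak_location(spectrum, kernel):
    # convolve and report where the filter response peaks
    conv = np.convolve(spectrum, kernel, mode="same")
    return conv, int(conv.argmax())

# synthetic spectrum: sloped continuum plus a Gaussian photopeak at channel 186
chan = np.arange(512)
continuum = 200.0 - 0.2 * chan
peak = 1000.0 * np.exp(-0.5 * ((chan - 186) / 3.0) ** 2)
conv, loc = filtered_peak_location(continuum + peak, square_wave_filter(8))
```

Because the kernel is symmetric and sums to zero, a purely linear continuum convolves to (numerically) zero away from the spectrum edges, so the filter output isolates the peak contribution.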

  16. Modeling and Algorithmic Approaches to Constitutively-Complex, Microstructured Fluids

    SciTech Connect (OSTI)

    Miller, Gregory H.; Forest, Gregory

    2011-12-22

    We present a new multiscale model for complex fluids based on three scales: microscopic, kinetic, and continuum. We choose the microscopic level as Kramers' bead-rod model for polymers, which we describe as a system of stochastic differential equations with an implicit constraint formulation. The associated Fokker-Planck equation is then derived, and adiabatic elimination removes the fast momentum coordinates. Approached in this way, the kinetic level reduces to a dispersive drift equation. The continuum level is modeled with a finite volume Godunov-projection algorithm. We demonstrate computation of viscoelastic stress divergence using this multiscale approach.

  17. A fast and memory-sparing probabilistic selection algorithm for the GPU

    SciTech Connect (OSTI)

    Monroe, Laura M; Wendelberger, Joanne; Michalak, Sarah

    2010-09-29

    A fast and memory-sparing probabilistic top-N selection algorithm is implemented on the GPU. This probabilistic algorithm gives a deterministic result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces both the memory requirements and the average time required for the algorithm. This algorithm is well-suited to more general parallel processors with multiple layers of memory hierarchy. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be especially useful for processors having a limited amount of fast memory available.
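The randomized-cutoff idea behind such Las Vegas selection can be sketched in a few lines (illustrative only; this is not the GPU implementation from the report): a random sample supplies a conservative cutoff, most of the data is discarded cheaply, and only the surviving candidates receive the expensive sort.

```python
import random

def probabilistic_top_n(data, n, sample_size=None):
    """Randomized (Las Vegas-style) top-n selection: a random sample supplies
    a cutoff that provably keeps every true top-n element, so only a small
    candidate set needs the full sort.  Assumes n <= len(data)."""
    k = sample_size or min(len(data), 4 * n)
    sample = random.sample(data, k)
    cutoff = sorted(sample, reverse=True)[min(n, k) - 1]
    # a subset's n-th largest value never exceeds the full data's n-th largest,
    # so every true top-n element passes the cut: the result is deterministic
    candidates = [x for x in data if x >= cutoff]
    return sorted(candidates, reverse=True)[:n]
```

The randomization affects only how many candidates survive the cut (and hence the running time), never the answer, which is the defining property of a Las Vegas algorithm.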

  18. A Numerical Algorithm for the Solution of a Phase-Field Model of Polycrystalline Materials

    SciTech Connect (OSTI)

    Dorr, M R; Fattebert, J; Wickett, M E; Belak, J F; Turchi, P A

    2008-12-04

    We describe an algorithm for the numerical solution of a phase-field model (PFM) of microstructure evolution in polycrystalline materials. The PFM system of equations includes a local order parameter, a quaternion representation of local orientation and a species composition parameter. The algorithm is based on the implicit integration of a semidiscretization of the PFM system using a backward difference formula (BDF) temporal discretization combined with a Newton-Krylov algorithm to solve the nonlinear system at each time step. The BDF algorithm is combined with a coordinate projection method to maintain quaternion unit length, which is related to an important solution invariant. A key element of the Newton-Krylov algorithm is the selection of a preconditioner to accelerate the convergence of the Generalized Minimum Residual algorithm used to solve the Jacobian linear system in each Newton step. Results are presented for the application of the algorithm to 2D and 3D examples.
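The coordinate-projection step for the quaternion invariant can be illustrated on a toy problem (a backward-Euler/BDF1 step for rigid rotation; this is a deliberate simplification of the paper's BDF/Newton-Krylov scheme, and the matrices below are standard quaternion kinematics rather than the PFM equations):

```python
import numpy as np

def omega_matrix(w):
    # quaternion kinematics matrix: q' = omega_matrix(w) @ q for angular rate w
    wx, wy, wz = w
    return 0.5 * np.array([[0.0, -wx, -wy, -wz],
                           [wx,  0.0,  wz, -wy],
                           [wy, -wz,  0.0,  wx],
                           [wz,  wy, -wx,  0.0]])

def bdf1_step(q, w, h):
    # implicit (backward Euler / BDF1) step: solve (I - h*Omega) q_new = q,
    # then project the update back onto the unit sphere to restore |q| = 1
    q_new = np.linalg.solve(np.eye(4) - h * omega_matrix(w), q)
    return q_new / np.linalg.norm(q_new)

# usage: spin about z; without the projection, |q| would drift away from 1
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    q = bdf1_step(q, (0.0, 0.0, 1.0), 0.01)
```

The implicit step itself does not preserve the unit-length invariant; the projection after each step pulls the solution back onto the constraint manifold, which is the role the coordinate projection method plays in the full algorithm.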

  19. TRACC: Algorithm for Predicting and Tracking Barges on Inland Waterways

    Energy Science and Technology Software Center (OSTI)

    2010-04-23

    The algorithm developed in this work predicts the location and estimates the traveling speed of a barge moving in an inland waterway network. Measurements obtained from GPS or other systems are corrupted with measurement noise and reported at large, irregular time intervals, creating uncertainty about the current location of the barge and reducing the effectiveness of emergency response activities in case of an accident or act of terrorism. Developing a prediction algorithm is a non-trivial problem because speed estimation is made challenging by the complex interactions between the multiple systems involved in the process. This software uses a systems approach to model the motion dynamics of the barge and estimates the location and speed of the barge at the next, user-defined, time interval. First, to estimate the speed, a non-linear, stochastic modeling technique was developed that accounts for local variations and interactions existing in the system. The output speed is then used as an observation in a statistically optimal filtering technique, the Kalman filter, formulated in state-space to minimize the numerous errors observed in the system. The combined system synergistically fuses the available local information with the measurements obtained to accurately predict the location and traveling speed of the barge.
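The position/speed fusion step can be sketched with a one-dimensional constant-velocity Kalman filter that tolerates irregular reporting intervals (the noise parameters and motion model below are illustrative assumptions; the report's actual state-space formulation is more elaborate):

```python
import numpy as np

def kalman_step(x, P, z, dt, q=0.05, r=25.0):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.
    x = [position, speed]; dt may vary between (irregular) position reports."""
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity motion model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],      # process noise grows with dt
                      [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])                     # we observe position only
    # predict forward by dt
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the noisy position report z
    y = z - H @ x                                  # innovation
    S = H @ P @ H.T + r                            # innovation covariance
    K = P @ H.T / S                                # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# usage: reports of a barge moving at 5 units/s arrive at irregular intervals
x, P = np.array([0.0, 0.0]), np.eye(2) * 100.0
t = 0.0
for dt in (1, 3, 2, 5, 1, 4, 2, 3, 6, 1, 2, 4, 3, 1, 5):
    t += dt
    x, P = kalman_step(x, P, 5.0 * t, dt)
```

Even though speed is never measured directly, the cross-covariance between position and speed lets the filter recover it from successive position reports, which is the essence of the fusion described above.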

  20. Incorrect support and missing center tolerances of phasing algorithms

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Huang, Xiaojing; Nelson, Johanna; Steinbrener, Jan; Kirz, Janos; Turner, Joshua J.; Jacobsen, Chris

    2010-01-01

    In x-ray diffraction microscopy, iterative algorithms retrieve reciprocal space phase information, and a real space image, from an object's coherent diffraction intensities through the use of a priori information such as a finite support constraint. In many experiments, the object's shape or support is not well known, and the diffraction pattern is incompletely measured. We describe here computer simulations to look at the effects of both of these possible errors when using several common reconstruction algorithms. Overly tight object supports prevent successful convergence; however, we show that this can often be recognized through pathological behavior of the phase retrieval transfer function. Dynamic range limitations often make it difficult to record the central speckles of the diffraction pattern. We show that this leads to increasing artifacts in the image when the number of missing central speckles exceeds about 10, and that the removal of unconstrained modes from the reconstructed image is helpful only when the number of missing central speckles is less than about 50. In conclusion, this simulation study helps in judging the reconstructability of experimentally recorded coherent diffraction patterns.
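The support-constrained iteration at the heart of such reconstructions can be sketched as a plain error-reduction loop (a simplified relative of the HIO-type algorithms such studies compare; the array sizes, iteration count, and positivity prior are illustrative assumptions):

```python
import numpy as np

def error_reduction(magnitudes, support, n_iter=200, seed=0):
    """Error-reduction phase retrieval: alternate between enforcing the
    measured Fourier magnitudes and the finite-support constraint."""
    rng = np.random.default_rng(seed)
    # start from the measured magnitudes with random phases
    phase = np.exp(2j * np.pi * rng.random(magnitudes.shape))
    g = np.fft.ifft2(magnitudes * phase).real
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = magnitudes * np.exp(1j * np.angle(G))   # Fourier-magnitude constraint
        g = np.fft.ifft2(G).real
        g *= support                                 # finite-support constraint
        g[g < 0] = 0                                 # positivity (optional prior)
    return g
```

A too-tight `support` mask would clip real object density each iteration, which is exactly the failure mode (with its pathological transfer-function signature) that the abstract describes.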

  1. A cooperative control algorithm for camera based observational systems.

    SciTech Connect (OSTI)

    Young, Joseph G.

    2012-01-01

    Over the last several years, there has been considerable growth in camera based observation systems for a variety of safety, scientific, and recreational applications. In order to improve the effectiveness of these systems, we frequently desire the ability to increase the number of observed objects, but solving this problem is not as simple as adding more cameras. Quite often, there are economic or physical restrictions that prevent us from adding additional cameras to the system. As a result, we require methods that coordinate the tracking of objects between multiple cameras in an optimal way. In order to accomplish this goal, we present a new cooperative control algorithm for a camera based observational system. Specifically, we present a receding horizon control where we model the underlying optimal control problem as a mixed integer linear program. The benefit of this design is that we can coordinate the actions between each camera while simultaneously respecting its kinematics. In addition, we further improve the quality of our solution by coupling our algorithm with a Kalman filter. Through this integration, we not only add a predictive component to our control, but we use the uncertainty estimates provided by the filter to encourage the system to periodically observe any outliers in the observed area. This combined approach allows us to intelligently observe the entire region of interest in an effective and thorough manner.
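One ingredient of this design, steering cameras toward the targets with the largest filter uncertainty, can be sketched with a brute-force one-step assignment (a toy stand-in under stated assumptions; the paper formulates a receding-horizon mixed integer linear program with camera kinematics, which this sketch omits entirely):

```python
import itertools

def assign_cameras(cameras, targets, uncertainty):
    """Enumerate every way to point the cameras at distinct targets and keep
    the assignment covering the largest total predicted uncertainty.
    `uncertainty` maps each target to e.g. the trace of its Kalman covariance."""
    best, best_score = None, float("-inf")
    for choice in itertools.permutations(targets, len(cameras)):
        score = sum(uncertainty[t] for t in choice)
        if score > best_score:
            best, best_score = dict(zip(cameras, choice)), score
    return best
```

Feeding covariance-based uncertainties into the objective is what periodically pulls observation resources back to neglected outliers, mirroring the Kalman-filter coupling described above; a real formulation would replace the enumeration with a MILP solver.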

  2. Comparison of CRBR design-basis events with those of foreign LMFBR plants

    SciTech Connect (OSTI)

    Agrawal, A.K.

    1983-04-01

    As part of the Construction Permit (CP) review of the Clinch River Breeder Reactor Plant (CRBR), Brookhaven National Laboratory was asked to compare the Design Basis Accidents (DBAs) considered in the CRBR Preliminary Safety Analysis Report with those of contemporary foreign plants (PHENIX, SUPER-PHENIX, SNR-300, PFR, and MONJU). A brief introductory review of any special or unusual characteristics of these plants is given, followed by discussions of the design basis accidents and their acceptance criteria. In spite of some discrepancies, due either to semantics or to licensing decisions, there appears to be a considerable degree of unanimity in the selection (definition) of DBAs across all of these plants.

  3. Licensing topical report: application of probabilistic risk assessment in the selection of design basis accidents. [HTGR

    SciTech Connect (OSTI)

    Houghton, W.J.

    1980-06-01

    A probabilistic risk assessment (PRA) approach is proposed for scrutinizing the selection of accident sequences. A technique is described in this Licensing Topical Report to identify candidates for Design Basis Accidents (DBAs) utilizing the risk assessment results. As part of this technique, it is proposed that events with frequencies below a specified limit would not be candidates. The use of the methodology described is supplementary to the traditional, deterministic approach and may result, in some cases, in the selection of multiple-failure sequences as DBAs; it may also provide a basis for not considering some traditionally postulated events as DBAs. A process is then described for selecting a list of DBAs based on the candidates from the PRA, supplementary to knowledge and judgments from past licensing practice. These DBAs would be the events considered in Chapter 15 of Safety Analysis Reports for high-temperature gas-cooled reactors (HTGRs).

  4. 105-K Basin material design basis feed description for spent nuclear fuel project facilities

    SciTech Connect (OSTI)

    Praga, A.N.

    1998-01-08

    Revisions 0 and 0A of this document provided estimated chemical and radionuclide inventories of spent nuclear fuel and sludge currently stored within the Hanford Site's 105-K Basins. This revision (Rev. 1) incorporates the following changes into Revision 0A: (1) updates the tables to reflect improved cross section data, a decision to use accountability data as the basis for total Pu, a corrected methodology for selection of the heat generation basis fee, and a revised decay date; (2) adds Section 3.3.3.1 to expand the description of the approach used to calculate the inventory values and explain why that approach yields conservative results; and (3) changes the pre-irradiation braze beryllium value.

  5. Technical Basis for U. S. Department of Energy Nuclear Safety Policy, DOE Policy 420.1

    Broader source: Energy.gov [DOE]

    This document provides the technical basis for the Department of Energy (DOE) Policy (P) 420.1, Nuclear Safety Policy, dated 2-8-2011. It includes an analysis of the revised Policy to determine whether it provides the necessary and sufficient high-level expectations that will lead DOE to establish and implement appropriate requirements to assure protection of the public, workers, and the environment from the hazards of DOE’s operation of nuclear facilities.

  6. Integrated Safety Management System as the Basis for Work Planning and Control for Research and Development

    Broader source: Energy.gov [DOE]

    Slide presentation by Rich Davies, Kami Lowry, and Mike Schlender, Pacific Northwest National Laboratory (PNNL), and Ted Pietrok, Pacific Northwest Site Office (PNSO): Integrated Safety Management System as the Basis for Work Planning and Control for Research and Development. Work Planning and Control (WP&C) is essential to assuring the safety of workers and the public regardless of the scope of work; Research and Development (R&D) activities are no exception.

  7. Technical Basis for Work Place Air Monitoring for the Plutonium Finishing Plan (PFP)

    SciTech Connect (OSTI)

    JONES, R.A.

    1999-10-06

    This document establishes the basis for the Plutonium Finishing Plant's (PFP) work place air monitoring program in accordance with the following requirements: Title 10, Code of Federal Regulations (CFR), Part 835, 'Occupational Radiation Protection'; Hanford Site Radiological Control Manual (HSRCM-1); HNF-PRO-331, Work Place Air Monitoring; WHC-SD-CP-SAR-021, Plutonium Finishing Plant Final Safety Analysis Report; and applicable recognized national standards invoked by DOE Orders and Policies.

  8. Hydro-Kansas (HK) Research Project: Tests of a Physical Basis of

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Hydro-Kansas (HK) Research Project: Tests of a Physical Basis of Statistical Self-Similarity in Peak Flows in the Whitewater Basin, Kansas. Investigators: Gupta, Vijay (University of Colorado); Furey, Peter (Colorado Research Associates); Mantila, Ricardo (University of Colorado); Krajewski, Witold (University of Iowa); Kruger, Anton (The University of Iowa); Clayton, Jordan (US Geological Survey and University of Iowa).

  9. A probabilistic risk assessment of the LLNL Plutonium facility`s evaluation basis fire operational accident

    SciTech Connect (OSTI)

    Brumburgh, G.

    1994-08-31

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous operations involving plutonium, including device fabrication, development of fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed to demonstrate rational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  10. Review and Approval of Nuclear Facility Safety Basis Documents (Documented Safety Analyses and Technical Safety Requirements)

    Energy Savers [EERE]

    DOE-STD-1104-96 (November 2005), Change Notice No. 3 (December 2005): DOE Standard, Review and Approval of Nuclear Facility Safety Basis Documents (Documented Safety Analyses and Technical Safety Requirements). U.S. Department of Energy, Washington, DC 20585. Area SAFT. Distribution Statement A: approved for public release; distribution is unlimited.

  11. Safety basis for the 241-AN-107 mixer pump installation and caustic addition

    SciTech Connect (OSTI)

    Van Vleet, R.J.

    1994-10-05

    This safety basis was prepared to determine whether the proposed activities of installing a 76 HP jet mixer pump and adding approximately 50,000 gallons of 19 M (50:50 wt %) aqueous caustic are within the safety envelope described by Tank Farms (chapter six of WHC-SD-WM-ISB-001, Rev. 0). The safety basis covers the components, structures, and systems for the caustic addition and mixer pump installation. These include: installation of the mixer pump and monitoring equipment; operation of the mixer pump, process monitoring equipment, and caustic addition; and the pump stand, caustic addition skid, electrical skid, video camera system, and two densitometers. Also covered is the removal and decontamination of the mixer pump and process monitoring system. Authority for this safety basis is WHC-IP-0842 (Waste Tank Administration); Section 15.9, Rev. 2 (Unreviewed Safety Questions) of WHC-IP-0842 requires that an evaluation be performed for all physical modifications.

  12. Demonstrating Structural Adequacy of Nuclear Power Plant Containment Structures for Beyond Design-Basis Pressure Loadings

    SciTech Connect (OSTI)

    Braverman, J.I.; Morante, R.

    2010-07-18

    Demonstrating the structural integrity of U.S. nuclear power plant (NPP) containment structures for beyond design-basis internal pressure loadings is necessary to satisfy Nuclear Regulatory Commission (NRC) requirements and performance goals. This paper discusses methods for demonstrating the structural adequacy of the containment for beyond design-basis pressure loadings. Three distinct evaluations are addressed: (1) estimating the ultimate pressure capacity of the containment structure (10 CFR 50 and US NRC Standard Review Plan, Section 3.8); (2) demonstrating the structural adequacy of the containment subjected to pressure loadings associated with combustible gas generation (10 CFR 52 and 10 CFR 50); and (3) demonstrating the containment structural integrity for severe accidents (10 CFR 52 as well as SECY 90-016, SECY 93-087, and related NRC staff requirements memoranda (SRMs)). The paper describes the technical basis for specific aspects of the methods presented. It also presents examples of past issues identified in licensing activities related to these evaluations.

  13. Central Facilities Area Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Lisa Harvego; Brion Bennett

    2011-11-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Central Facilities Area facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  14. Materials and Fuels Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Lisa Harvego; Brion Bennett

    2011-09-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Fuels Complex facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  15. Materials and Security Consolidation Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Not Listed

    2011-09-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Security Consolidation Center facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  16. Research and Education Campus Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    L. Harvego; Brion Bennett

    2011-11-01

    U.S. Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory Research and Education Campus facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool to develop the radioactive waste management basis.

  17. Evaluating cloud retrieval algorithms with the ARM BBHRP framework

    SciTech Connect (OSTI)

    Mlawer, E.; Dunn, M.; Shippert, T.; Troyan, D.; Johnson, K. L.; Miller, M. A.; Delamere, J.; Turner, D. D.; Jensen, M. P.; Flynn, C.; Shupe, M.; Comstock, J.; Long, C. N.; Clough, S. T.; Sivaraman, C.; Khaiyer, M.; Xie, S.; Rutan, D.; Minnis, P.

    2008-03-10

    Climate and weather prediction models require accurate calculations of vertical profiles of radiative heating. Although heating rate calculations cannot be directly validated due to the lack of corresponding observations, surface and top-of-atmosphere measurements can indirectly establish the quality of computed heating rates through validation of the calculated irradiances at the atmospheric boundaries. The ARM Broadband Heating Rate Profile (BBHRP) project, a collaboration of all the working groups in the program, was designed with these heating rate validations as a key objective. Given the large dependence of radiative heating rates on cloud properties, a critical component of BBHRP radiative closure analyses has been the evaluation of cloud microphysical retrieval algorithms. This evaluation is an important step in establishing the necessary confidence in the continuous profiles of computed radiative heating rates produced by BBHRP at the ARM Climate Research Facility (ACRF) sites that are needed for modeling studies. This poster details the continued effort to evaluate cloud property retrieval algorithms within the BBHRP framework, a key focus of the project this year. A requirement for the computation of accurate heating rate profiles is a robust cloud microphysical product that captures the occurrence, height, and phase of clouds above each ACRF site. Various approaches to retrieve the microphysical properties of liquid, ice, and mixed-phase clouds have been processed in BBHRP for the ACRF Southern Great Plains (SGP) and the North Slope of Alaska (NSA) sites. These retrieval methods span a range of assumptions concerning the parameterization of cloud location, particle density, size, shape, and involve different measurement sources. We will present the radiative closure results from several different retrieval approaches for the SGP site, including those from Microbase, the current 'reference' retrieval approach in BBHRP. 
At the NSA, mixed-phase clouds and clouds with low optical depth are prevalent, and the radiative closure studies using Microbase demonstrated significant residuals. As an alternative to Microbase at the NSA, the Shupe-Turner cloud property retrieval algorithm, aimed at improving the partitioning of cloud phase and incorporating more constrained, conditional microphysics retrievals, has also been evaluated using the BBHRP data set.

  18. Tightly Coupled Multiphysics Algorithm for Pebble Bed Reactors

    SciTech Connect (OSTI)

    HyeongKae Park; Dana Knoll; Derek Gaston; Richard Martineau

    2010-10-01

    We have developed a tightly coupled multiphysics simulation tool for the pebble-bed reactor (PBR) concept, a type of Very High-Temperature gas-cooled Reactor (VHTR). The simulation tool, PRONGHORN, takes advantages of the Multiphysics Object-Oriented Simulation Environment library, and is capable of solving multidimensional thermal-fluid and neutronics problems implicitly with a Newton-based approach. Expensive Jacobian matrix formation is alleviated via the Jacobian-free Newton-Krylov method, and physics-based preconditioning is applied to minimize Krylov iterations. Motivation for the work is provided via analysis and numerical experiments on simpler multiphysics reactor models. We then provide detail of the physical models and numerical methods in PRONGHORN. Finally, PRONGHORN's algorithmic capability is demonstrated on a number of PBR test cases.

  19. Invariant patterns in crystal lattices: Implications for protein folding algorithms

    SciTech Connect (OSTI)

    HART,WILLIAM E.; ISTRAIL,SORIN

    2000-06-01

    Crystal lattices are infinite periodic graphs that occur naturally in a variety of geometries and which are of fundamental importance in polymer science. Discrete models of protein folding use crystal lattices to define the space of protein conformations. Because various crystal lattices provide discretizations of the same physical phenomenon, it is reasonable to expect that there will exist invariants across lattices related to fundamental properties of the protein folding process. This paper considers whether performance-guaranteed approximability is such an invariant for HP lattice models. The authors define a master approximation algorithm that has provable performance guarantees provided that a specific sublattice exists within a given lattice. They describe a broad class of crystal lattices that are approximable, which further suggests that approximability is a general property of HP lattice models.

  20. Resistive Network Optimal Power Flow: Uniqueness and Algorithms

    SciTech Connect (OSTI)

    Tan, CW; Cai, DWH; Lou, X

    2015-01-01

    The optimal power flow (OPF) problem minimizes the power loss in an electrical network by optimizing the voltage and power delivered at the network buses, and is a nonconvex problem that is generally hard to solve. By leveraging a recent development on the zero duality gap of OPF, we propose a second-order cone programming convex relaxation of the resistive network OPF, and study the uniqueness of the optimal solution using differential topology, especially the Poincaré-Hopf Index Theorem. We characterize the global uniqueness for different network topologies, e.g., line, radial, and mesh networks. This serves as a starting point to design distributed local algorithms with global behaviors that have low complexity, are computationally fast, and can run under synchronous and asynchronous settings in practical power grids.

  1. Optimized Uncertainty Quantification Algorithm Within a Dynamic Event Tree Framework

    SciTech Connect (OSTI)

    J. W. Nielsen; Akira Tokuhiro; Robert Hiromoto

    2014-06-01

    Methods for developing Phenomenological Identification and Ranking Tables (PIRT) for nuclear power plants have been a useful tool in providing insight into modeling aspects that are important to safety. These methods rely on expert knowledge of reactor plant transients and thermal-hydraulic codes to identify the areas of highest importance. Quantified PIRT (QPIRT) provides a rigorous method for quantifying the phenomena that can have the greatest impact. The transients that are evaluated, and the timing of those events, are typically developed in collaboration with the Probabilistic Risk Assessment (PRA). Though quite effective in evaluating risk, traditional PRA methods lack the capability to evaluate complex dynamic systems where end states may vary as a function of the transition time from physical state to physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. A limitation of DPRA is its potential for state (combinatorial) explosion, which grows with the number of components and with the sampling of state-to-state transition times for the entire system. This paper presents a method for performing QPIRT within a dynamic event tree framework, such that the timing events resulting in the highest probabilities of failure are captured and a QPIRT is performed simultaneously with a discrete dynamic event tree evaluation, yielding a formal QPIRT for each end state. Because the use of dynamic event trees leads to state explosion as the number of possible component states increases, this paper utilizes a branch-and-bound algorithm to optimize the solution of the dynamic event trees, and summarizes the methods used to implement the branch-and-bound algorithm in solving the discrete dynamic event trees.
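The pruning idea can be sketched with a tiny discrete event-tree expansion (the tree structure and branch probabilities below are hypothetical; a genuine implementation would also bound on consequence and timing, not cumulative probability alone):

```python
def expand(tree, path=(), prob=1.0, floor=1e-6, leaves=None):
    """Depth-first dynamic-event-tree expansion with branch-and-bound-style
    pruning: cumulative probability only shrinks down a branch, so once it
    falls below `floor` no descendant end state can matter."""
    if leaves is None:
        leaves = []
    if prob < floor:
        return leaves                      # bound: prune the whole subtree
    branches = tree.get(path)              # tree maps a path to its transitions
    if not branches:
        leaves.append((path, prob))        # end state reached
        return leaves
    for state, p in branches:              # branch on each possible transition
        expand(tree, path + (state,), prob * p, floor, leaves)
    return leaves
```

Because branch probabilities multiply, the cumulative probability is a valid upper bound for every descendant, which is what licenses discarding the subtree without enumerating it and keeps the combinatorial growth in check.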

  2. ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting Operations (doebestpractice_052804.pdf). Related documents and publications: ITP Metal Casting: Energy Use in Selected Metalcasting Facilities - 2003; ITP Metal Casting: Energy and Environmental Profile of the U.S. Metal Casting Industry; ITP Metal Casting: Advanced Melting Technologies: Energy Saving Concepts and ...

  3. Numerical Study of Velocity Shear Stabilization of 3D and Theoretical

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Numerical Study of Velocity Shear Stabilization of 3D and Theoretical Considerations for Centrifugally Confined Plasmas and Other Interchange-Limited Fusion Concepts (SciTech Connect).

  4. Theoretical and Computational Physics | U.S. DOE Office of Science (SC)

    Office of Science (SC) Website

    Theoretical and Computational Physics is a research area of the High Energy Physics (HEP) program within the U.S. DOE Office of Science. Contact: High Energy Physics, U.S. Department of Energy, SC-25/Germantown Building, 1000 Independence Ave., SW

  5. THEORETICAL TRANSIT SPECTRA FOR GJ 1214b AND OTHER 'SUPER-EARTHS' (Journal

    Office of Scientific and Technical Information (OSTI)

    Journal Article: THEORETICAL TRANSIT SPECTRA FOR GJ 1214b AND OTHER 'SUPER-EARTHS' (SciTech Connect). We present new calculations of transit spectra of super-Earths that allow for atmospheres with arbitrary proportions of common molecular species and haze. We test this method with generic spectra, reproducing the expected systematics and absorption features, then apply it to the nearby ...

  6. Theoretical investigations of defects in a Si-based digital ferromagnetic

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material (SciTech Connect). Authors: Fong, C Y; Shauhgnessy, M; Snow, R; Yang, L H. Publication Date: 2010-09-17. OSTI Identifier: 1124958.

  7. Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms (Conference) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Authors: Leung, Kevin. Publication Date: 2012-08-01. OSTI Identifier: 1061142. Report Number(s): SAND2012-6720C. DOE Contract Number: AC04-94AL85000. Resource Type: Conference. Resource Relation:

  8. Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms (Conference) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Abstract not provided. Authors: Leung, Kevin. Publication Date: 2013-04-01. OSTI Identifier: 1078871. Report Number(s): SAND2013-3422C 452174. DOE Contract Number: AC04-94AL85000. Resource Type: Conference

  9. ARM: 10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2010-12-15

    10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  10. ARM: 10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2010-12-15

    10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  11. ARM: ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    SciTech Connect (OSTI)

    Karen Johnson; Michael Jensen

    1996-11-08

    ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  12. ARM: ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    SciTech Connect (OSTI)

    Karen Johnson; Michael Jensen

    1996-11-08

    ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  13. ARM: 10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  14. ARM: 10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  15. ARM: ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Karen Johnson; Michael Jensen

    ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  16. ARM: ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Karen Johnson; Michael Jensen

    ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  17. Update on Development of Mesh Generation Algorithms in MeshKit

    SciTech Connect (OSTI)

    Jain, Rajeev; Vanderzee, Evan; Mahadevan, Vijay

    2015-09-30

    MeshKit uses a graph-based design for coding all its meshing algorithms, which includes the Reactor Geometry (and mesh) Generation (RGG) algorithms. This report highlights the developmental updates of all the algorithms, results and future work. Parallel versions of algorithms, documentation and performance results are reported. RGG GUI design was updated to incorporate new features requested by the users; boundary layer generation and parallel RGG support were added to the GUI. Key contributions to the release, upgrade and maintenance of other SIGMA1 libraries (CGM and MOAB) were made. Several fundamental meshing algorithms for creating a robust parallel meshing pipeline in MeshKit are under development. Results and current status of automated, open-source and high quality nuclear reactor assembly mesh generation algorithms such as trimesher, quadmesher, interval matching and multi-sweeper are reported.

  18. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; Mundt, Miranda

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
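    The paper's implementation is not reproduced here, but the core idea of striding by physical distance rather than by grid index can be sketched as follows. Everything below (the function name, the 1/cos(latitude) widening of the longitude stride, and the synthetic field with one seeded "storm") is illustrative, not the authors' code:

```python
import numpy as np

def stride_search(field, lats, lons, radius_deg, threshold):
    """Toy Stride Search-style spatial pass (hypothetical API).

    Instead of testing every grid point, stride through the grid so each
    candidate block spans roughly the same physical size; the longitude
    stride widens toward the poles by 1/cos(latitude)."""
    hits = []
    nlat, nlon = field.shape
    dlat = abs(lats[1] - lats[0])
    lat_stride = max(1, int(radius_deg / dlat))
    for i in range(0, nlat, lat_stride):
        # widen the longitude stride near the poles to keep ~constant area
        coslat = max(np.cos(np.deg2rad(lats[i])), 0.05)
        lon_stride = max(1, int(radius_deg / (dlat * coslat)))
        for j in range(0, nlon, lon_stride):
            block = field[i:i + lat_stride, j:j + lon_stride]
            if block.max() > threshold:
                bi, bj = np.unravel_index(block.argmax(), block.shape)
                hits.append((lats[i + bi], lons[j + bj]))
    return hits

lats = np.linspace(-89, 89, 90)     # 2-degree grid
lons = np.linspace(0, 356, 90)      # 4-degree grid
field = np.zeros((90, 90))
field[45, 30] = 5.0                 # one synthetic "storm"
print(stride_search(field, lats, lons, radius_deg=8.0, threshold=1.0))
```

    A real detector would follow this spatial pass with the temporal correlation step the abstract describes.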

  19. Architecture-Aware Algorithms for Scalable Performance and Resilience on Heterogeneous Architectures

    SciTech Connect (OSTI)

    Dongarra, Jack

    2013-10-15

    The goal of the Extreme-scale Algorithms & Software Institute (EASI) is to close the "application-architecture performance gap" by exploring algorithms and runtime improvements that will enable key science applications to better exploit the architectural features of DOE extreme-scale systems. For the past year of the project, our efforts at the University of Tennessee have concentrated on, and made significant progress related to, the following high-level EASI goals: develop multi-precision and architecture-aware implementations of Krylov, Poisson, and Helmholtz solvers and dense factorizations for heterogeneous multi-core systems; explore new methods of algorithm resilience, and develop new algorithms with these capabilities; develop runtime support for adaptable algorithms that address resilience and scalability; distribute the new algorithms and runtime support through widely used software packages; and establish a strong outreach program to disseminate results, interact with colleagues, and train students and junior members of our community.

  20. Digital revenue metering algorithm: development, analysis, implementation, testing, and evaluation. Final report

    SciTech Connect (OSTI)

    Schweitzer III, E.O.; To, H.W.; Ando, M.

    1980-11-01

    A digital revenue metering algorithm is described. The algorithm has been tested in a microcomputer system using two 8-bit MC6800 microprocessors and 12-bit analog-to-digital converters. The tests show that the system meets the accuracy requirements of ANSI C12-1975. The algorithm demands modest computing requirements and low data sampling rates. The algorithm uses Walsh-functions and will operate with as few as 4 samples per 60-Hz cycle. For proper response to odd harmonic frequencies, higher sampling rates must be used. Third harmonic power can be handled with an 8-sample per cycle Walsh function. However, even harmonics are effectively suppressed by the algorithm. The developed algorithm is intended for use in digital data acquisition systems for substations where interchange metering is required.
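    The Walsh-function formulation itself is not given in the abstract, but the identity any sampled watt meter relies on, that the average of instantaneous v*i products over whole cycles equals (V*I/2)*cos(phi) for sinusoids, holds even at only 4 samples per 60-Hz cycle, and is easy to check. This is an illustrative sketch of that sampled-power identity, not the patented Walsh-function algorithm:

```python
import math

def metered_power(v_samples, i_samples):
    """Average of instantaneous v*i products: the core quantity a
    sampled digital watt-hour computation accumulates."""
    assert len(v_samples) == len(i_samples)
    return sum(v * i for v, i in zip(v_samples, i_samples)) / len(v_samples)

f, fs = 60.0, 240.0          # 60 Hz line, 4 samples per cycle
V, I, phi = 120.0, 5.0, math.radians(30.0)
n_samples = 240              # exactly 60 full cycles of data
v = [V * math.cos(2 * math.pi * f * n / fs) for n in range(n_samples)]
i = [I * math.cos(2 * math.pi * f * n / fs - phi) for n in range(n_samples)]
p = metered_power(v, i)
expected = 0.5 * V * I * math.cos(phi)   # (V*I/2)*cos(phi) for sinusoids
print(p, expected)
```

    As the abstract notes, a 4-sample rate is exact only for the fundamental; resolving odd harmonics requires a higher sampling rate.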

  1. Efficient algorithms for mixed aleatory-epistemic uncertainty quantification with application to radiation-hardened electronics. Part I, algorithms and benchmark results.

    SciTech Connect (OSTI)

    Swiler, Laura Painton; Eldred, Michael Scott

    2009-09-01

    This report documents the results of an FY09 ASC V&V Methods level 2 milestone demonstrating new algorithmic capabilities for mixed aleatory-epistemic uncertainty quantification. Through the combination of stochastic expansions for computing aleatory statistics and interval optimization for computing epistemic bounds, mixed uncertainty analysis studies are shown to be more accurate and efficient than previously achievable. Part I of the report describes the algorithms and presents benchmark performance results. Part II applies these new algorithms to UQ analysis of radiation effects in electronic devices and circuits for the QASPR program.
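    As a rough illustration of the mixed aleatory-epistemic structure described above (not the report's stochastic-expansion and interval-optimization machinery): an outer sweep over an epistemic interval, with an inner Monte Carlo loop over the aleatory variable, yields interval bounds on an output statistic. The toy model and the interval [0.5, 2.0] below are invented for the example:

```python
import random
random.seed(0)

def response(x, theta):
    # toy model: x is the aleatory input, theta the epistemic parameter
    return theta * x * x

def aleatory_mean(theta, n=20000):
    # inner loop: Monte Carlo over x ~ N(0, 1); E[theta * x^2] = theta
    samples = [response(random.gauss(0.0, 1.0), theta) for _ in range(n)]
    return sum(samples) / n

# outer loop: sweep the epistemic interval [0.5, 2.0] (a crude grid here,
# where the report uses interval optimization) and report output bounds
thetas = [0.5 + 0.25 * k for k in range(7)]
means = [aleatory_mean(t) for t in thetas]
print("interval on the mean:", min(means), max(means))
```

    The report's contribution is doing the inner statistics with stochastic expansions and the outer bounding with optimization, which is far cheaper than this nested sampling.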

  2. Just in Time DSA-The Hanford Nuclear Safety Basis Strategy

    SciTech Connect (OSTI)

    Olinger, S. J.; Buhl, A. R.

    2002-02-26

    The U.S. Department of Energy, Richland Operations Office (RL) is responsible for 30 hazard category 2 and 3 nuclear facilities that are operated by its prime contractors, Fluor Hanford Incorporated (FHI), Bechtel Hanford, Incorporated (BHI) and Pacific Northwest National Laboratory (PNNL). The publication of Title 10, Code of Federal Regulations, Part 830, Subpart B, Safety Basis Requirements (the Rule) in January 2001 imposed the requirement that the Documented Safety Analyses (DSA) for these facilities be reviewed against the requirements of the Rule. Those DSA that do not meet the requirements must either be upgraded to satisfy the Rule, or an exemption must be obtained. RL and its prime contractors have developed a Nuclear Safety Strategy that provides a comprehensive approach for supporting RL's efforts to meet its long term objectives for hazard category 2 and 3 facilities while also meeting the requirements of the Rule. This approach will result in a reduction of the total number of safety basis documents that must be developed and maintained to support the remaining mission and closure of the Hanford Site and ensure that the documentation that must be developed will support: compliance with the Rule; a ''Just-In-Time'' approach to development of Rule-compliant safety bases supported by temporary exemptions; and consolidation of safety basis documents that support multiple facilities with a common mission (e.g. decontamination, decommissioning and demolition [DD&D], waste management, surveillance and maintenance). This strategy provides a clear path to transition the safety bases for the various Hanford facilities from support of operation and stabilization missions through DD&D to accelerate closure. This ''Just-In-Time'' Strategy can also be tailored for other DOE Sites, creating the potential for large cost savings and schedule reductions throughout the DOE complex.

  3. Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems

    DOE Patents [OSTI]

    Van Benthem, Mark H.; Keenan, Michael R.

    2008-11-11

    A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.

  4. Award DE-FG02-04ER52655 Final Technical Report: Interior Point Algorithms for Optimization Problems (Technical Report) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Over the period of this award we developed an algorithmic framework for constraint reduction in linear programming (LP) and convex quadratic programming (QP), proved convergence of our

  5. Technical basis for cases N-629 and N-631 as an alternative for RTNDT reference temperature

    SciTech Connect (OSTI)

    Merkle, John Graham; Server, W. L.

    2007-01-01

    ASME Code Cases N-629/N-631, published in 1999, provided an important new approach that allows material-specific, measured fracture toughness curves for ferritic steels in code applications. This has enabled some nuclear power plants whose reactor pressure vessel materials reached a certain threshold level under overly conservative rules to use an alternative RTNDT to justify continued operation. These code cases have been approved by the US Nuclear Regulatory Commission and have been proposed for codification in Appendix A and Appendix G of the ASME Boiler and Pressure Vessel Code. This paper summarizes the basis of this approach for the record.

  6. Spatial compression algorithm for the analysis of very large multivariate images

    DOE Patents [OSTI]

    Keenan, Michael R. (Albuquerque, NM)

    2008-07-15

    A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
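    A minimal sketch of the wavelet-compression idea: one level of a 2-D Haar transform with coefficient thresholding. The patent does not specify this particular wavelet, and the smooth synthetic image below (which happens to survive thresholding exactly) is invented for illustration:

```python
import numpy as np

def haar2d(x):
    """One level of a 2-D Haar transform: averages and differences
    along rows, then along columns."""
    a = (x[:, 0::2] + x[:, 1::2]) / 2.0
    d = (x[:, 0::2] - x[:, 1::2]) / 2.0
    x = np.hstack([a, d])
    a = (x[0::2, :] + x[1::2, :]) / 2.0
    d = (x[0::2, :] - x[1::2, :]) / 2.0
    return np.vstack([a, d])

def inv_haar2d(c):
    # undo the column step, then the row step
    n = c.shape[0] // 2
    a, d = c[:n, :], c[n:, :]
    x = np.empty_like(c)
    x[0::2, :], x[1::2, :] = a + d, a - d
    m = x.shape[1] // 2
    a, d = x[:, :m], x[:, m:]
    y = np.empty_like(x)
    y[:, 0::2], y[:, 1::2] = a + d, a - d
    return y

img = np.outer(np.arange(8.0), np.ones(8))           # smooth synthetic "image"
coeffs = haar2d(img)
kept = np.where(np.abs(coeffs) > 0.05, coeffs, 0.0)  # drop small coefficients
recon = inv_haar2d(kept)
print(np.abs(recon - img).max())
```

    Analysis then runs on the reduced set of significant coefficients rather than on the full pixel grid, which is where the computational savings come from.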

  7. Workshop on algorithms for macromolecular modeling. Final project report, June 1, 1994--May 31, 1995

    SciTech Connect (OSTI)

    Leimkuhler, B.; Hermans, J.; Skeel, R.D.

    1995-07-01

    A workshop was held on algorithms and parallel implementations for macromolecular dynamics, protein folding, and structural refinement. This document contains abstracts and brief reports from that workshop.

  8. Algorithm for Accounting for the Interactions of Multiple Renewable Energy Technologies in Estimation of Annual Performance

    Energy Science and Technology Software Center (OSTI)

    2007-12-31

    The algorithm accounts for interactions between technologies in determining the annual energy performance of multiple renewable energy technologies at a subject site.

  9. Spectral compression algorithms for the analysis of very large multivariate images

    DOE Patents [OSTI]

    Keenan, Michael R. (Albuquerque, NM)

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
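    The factored-representation idea can be sketched with an SVD standing in for Principal Components Analysis: keep only the top-k factors and work with scores and loadings instead of the full data matrix. The synthetic three-spectrum data cube below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic hyperspectral data: 100 pixels x 50 channels built from 3 spectra
spectra = rng.random((3, 50))
abundances = rng.random((100, 3))
data = abundances @ spectra

# factored representation via SVD (a stand-in for PCA); keep top-k factors
U, s, Vt = np.linalg.svd(data, full_matrices=False)
k = 3
scores, loadings = U[:, :k] * s[:k], Vt[:k, :]
recon = scores @ loadings
print("max reconstruction error:", np.abs(recon - data).max())
```

    Because the synthetic data has exactly rank 3, three factors reconstruct it essentially perfectly; real spectral images need only as many factors as there are significant components.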

  10. Solution Algorithms for Effective-Field Models of Multi-Fluid...

    Office of Scientific and Technical Information (OSTI)

    Laboratory (INL) Sponsoring Org: DOE - NE Country of Publication: United States Language: English Subject: 97 MATHEMATICS AND COMPUTING Multifluid algorithms; Reactor Safety...

  11. DTRA Algorithm Prize (Seventh Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting 2012)

    ScienceCinema (OSTI)

    Whitechurch, Christian [Defense Threat Reduction Agency]

    2013-02-12

    Christian Whitchurch on the "DTRA Algorithm Prize" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  12. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    SciTech Connect (OSTI)

    S. M. Payne; V. W. Gorman; S. A. Jensen; M. E. Nitzel; M. J. Russell; R. P. Smith

    2000-03-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process.

  13. Engineering Basis Document Review Supporting the Double Shell Tank (DST) System Specification Development

    SciTech Connect (OSTI)

    LEONARD, M.W.

    2000-03-14

    The Double-Shell Tank (DST) System is required to transition from its current storage mission to a storage and retrieval mission supporting the River Protection Project Phase 1 privatization, defined in HNF-SD-WM-MAR-008, Tank Waste Remediation System Mission Analysis Report. Requirements for the DST subsystems are being developed using the top-down systems engineering process outlined in HNF-SD-WM-SEMP-002, Tank Waste Remediation System Systems Engineering Management Plan. This top-down process considers existing designs to the extent that these designs impose unavoidable constraints on the Phase 1 mission. Existing engineering-basis documents were screened, and the unavoidable constraints were identified. The constraints identified herein will be added to the DST System specification (HNF-SD-WM-TRD-007, System Specification for the Double-Shell Tank System). While the letter revisions of the DST System specification were constructed with a less rigorous review of the existing engineering-basis documents, the Revision 0 release of the specification must incorporate the results of the review documented herein. The purpose of this document is to describe the screening process and criteria used to determine which constraints are unavoidable and to document the screening results.

  14. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 2, Technical basis

    SciTech Connect (OSTI)

    Not Available

    1992-12-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume, Volume 2, contains the technical basis for the 1992 PA. Specifically, it describes the conceptual basis for consequence modeling and the PA methodology, including the selection of scenarios for analysis, the determination of scenario probabilities, and the estimation of scenario consequences using a Monte Carlo technique and a linked system of computational models. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with the long-term requirements of the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses related to the preliminary comparison with 40 CFR 191B. Volume 5 contains uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance. Finally, guidance derived from the entire 1992 PA is presented in Volume 6.

  15. Current plans to characterize the design basis ground motion at the Yucca Mountain, Nevada Site

    SciTech Connect (OSTI)

    Simecka, W.B.; Grant, T.A.; Voegele, M.D.; Cline, K.M.

    1992-12-31

    A site at Yucca Mountain Nevada is currently being studied to assess its suitability as a potential host site for the nation's first commercial high level waste repository. The DOE has proposed a new methodology for determining design-basis ground motions that uses both deterministic and probabilistic methods. The role of the deterministic approach is primary. It provides the level of detail needed by design engineers in the characterization of ground motions. The probabilistic approach provides a logical structured procedure for integrating the range of possible earthquakes that contribute to the ground motion hazard at the site. In addition, probabilistic methods will be used as needed to provide input for the assessment of long-term repository performance. This paper discusses the local tectonic environment, potential seismic sources and their associated displacements and ground motions. It also discusses the approach to assessing the design basis earthquake for the surface and underground facilities, as well as selected examples of the use of this type of information in design activities.

  16. Effective and efficient optics inspection approach using machine learning algorithms

    SciTech Connect (OSTI)

    Abdulla, G; Kegelmeyer, L; Liao, Z; Carr, W

    2010-11-02

    The Final Optics Damage Inspection (FODI) system automatically acquires and utilizes the Optics Inspection (OI) system to analyze images of the final optics at the National Ignition Facility (NIF). During each inspection cycle up to 1000 images acquired by FODI are examined by OI to identify and track damage sites on the optics. The process of tracking growing damage sites on the surface of an optic can be made more effective by identifying and removing signals associated with debris or reflections. The manual process to filter these false sites is daunting and time consuming. In this paper we discuss the use of machine learning tools and data mining techniques to help with this task. We describe the process to prepare a data set that can be used for training and identifying hardware reflections in the image data. In order to collect training data, the images are first automatically acquired and analyzed with existing software and then relevant features such as spatial, physical and luminosity measures are extracted for each site. A subset of these sites is 'truthed' or manually assigned a class to create training data. A supervised classification algorithm is used to test if the features can predict the class membership of new sites. A suite of self-configuring machine learning tools called 'Avatar Tools' is applied to classify all sites. To verify, we used 10-fold cross-validation and found the accuracy was above 99%. This substantially reduces the number of false alarms that would otherwise be sent for more extensive investigation.
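    The 'Avatar Tools' suite is specific to the workflow described, but the verification step, k-fold cross-validation of a supervised classifier on labeled site features, can be sketched generically. The nearest-centroid classifier and synthetic two-class features below are illustrative stand-ins for the real classifier and the extracted spatial/luminosity features:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    # one centroid per class
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(model, X):
    classes = list(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array([classes[i] for i in d.argmin(axis=0)])

def kfold_accuracy(X, y, k=10):
    # k-fold cross-validation: train on k-1 folds, score on the held-out fold
    idx = np.arange(len(y))
    folds = np.array_split(idx, k)
    accs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        model = nearest_centroid_fit(X[train], y[train])
        accs.append((nearest_centroid_predict(model, X[f]) == y[f]).mean())
    return float(np.mean(accs))

rng = np.random.default_rng(2)
# toy "sites": class 0 = damage, class 1 = reflection, well-separated features
X = np.vstack([rng.normal(0, 0.3, (50, 4)), rng.normal(2, 0.3, (50, 4))])
y = np.array([0] * 50 + [1] * 50)
print("10-fold accuracy:", kfold_accuracy(X, y, k=10))
```

    On well-separated synthetic features the cross-validated accuracy is near 1.0, mirroring the above-99% figure the paper reports for its real features.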

  17. Support Vector Machine algorithm for regression and classification

    Energy Science and Technology Software Center (OSTI)

    2001-08-01

    The software is an implementation of the Support Vector Machine (SVM) algorithm that was invented and developed by Vladimir Vapnik and his co-workers at AT&T Bell Laboratories. The specific implementation reported here is an Active Set method for solving a quadratic optimization problem that forms the major part of any SVM program. The implementation is tuned to specific constraints generated in the SVM learning. Thus, it is more efficient than general-purpose quadratic optimization programs. A decomposition method has been implemented in the software that enables processing large data sets. The size of the learning data is virtually unlimited by the capacity of the computer physical memory. The software is flexible and extensible. Two upper bounds are implemented to regulate the SVM learning for classification, which allow users to adjust the false positive and false negative rates. The software can be used either as a standalone, general-purpose SVM regression or classification program, or be embedded into a larger software system.

  18. Toward Developing Genetic Algorithms to Aid in Critical Infrastructure Modeling

    SciTech Connect (OSTI)

    Not Available

    2007-05-01

    Today's society relies upon an array of complex national and international infrastructure networks such as transportation, telecommunication, financial and energy. Understanding these interdependencies is necessary in order to protect our critical infrastructure. The Critical Infrastructure Modeling System, CIMS, examines the interrelationships between infrastructure networks. CIMS development is sponsored by the National Security Division at the Idaho National Laboratory (INL) in its ongoing mission for providing critical infrastructure protection and preparedness. A genetic algorithm (GA) is an optimization technique based on Darwin's theory of evolution. A GA can be coupled with CIMS to search for optimum ways to protect infrastructure assets. This includes identifying optimum assets to enforce or protect, testing the addition of or change to infrastructure before implementation, or finding the optimum response to an emergency for response planning. This paper describes the addition of a GA to infrastructure modeling for infrastructure planning. It first introduces the CIMS infrastructure modeling software used as the modeling engine to support the GA. Next, the GA techniques and parameters are defined. Then a test scenario illustrates the integration with CIMS and the preliminary results.
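    A minimal GA of the kind described (selection, one-point crossover, point mutation) applied to a toy asset-protection problem, choose at most 3 of 10 assets to maximize summed criticality, might look like this. The problem data and GA parameters are invented, not taken from CIMS:

```python
import random
random.seed(3)

# toy problem: protect at most BUDGET of 10 assets, maximize criticality
criticality = [5, 1, 8, 3, 9, 2, 7, 4, 6, 0]
BUDGET = 3

def fitness(genome):
    # heavy penalty if the protection budget is exceeded
    chosen = sum(genome)
    covered = sum(c for g, c in zip(genome, criticality) if g)
    return covered - 100 * max(0, chosen - BUDGET)

def evolve(pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]         # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 9)         # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(10)             # point mutation
            child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    In the paper's setting the fitness evaluation is replaced by a CIMS simulation of the protected network, which is what makes the GA coupling useful.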

  19. Integrated network design and scheduling problems : optimization algorithms and applications.

    SciTech Connect (OSTI)

    Nurre, Sarah G.; Carlson, Jeffrey J.

    2014-01-01

    We consider the class of integrated network design and scheduling problems. These problems focus on selecting and scheduling operations that will change the characteristics of a network, while being specifically concerned with the performance of the network over time. Motivating applications of INDS problems include infrastructure restoration after extreme events and building humanitarian distribution supply chains. While similar models have been proposed, no one has performed an extensive review of INDS problems from their complexity, network and scheduling characteristics, information, and solution methods. We examine INDS problems under a parallel identical machine scheduling environment where the performance of the network is evaluated by solving classic network optimization problems. We classify all considered INDS problems as NP-Hard and propose a novel heuristic dispatching rule algorithm that selects and schedules sets of arcs based on their interactions in the network. We present computational analysis based on realistic data sets representing the infrastructures of coastal New Hanover County, North Carolina, lower Manhattan, New York, and a realistic artificial community CLARC County. These tests demonstrate the importance of a dispatching rule to arrive at near-optimal solutions during real-time decision making activities. We extend INDS problems to incorporate release dates, which represent the earliest an operation can be performed, and flexible release dates through the introduction of specialized machine(s) that can perform work to move the release date earlier in time. An online optimization setting is explored where the release date of a component is not known.

  20. Linac Alignment Algorithm: Analysis on 1-to-1 Steering

    SciTech Connect (OSTI)

    Sun, Yipeng; Adolphsen, Chris (SLAC)

    2011-08-19

    In a linear accelerator, it is important to achieve a good alignment between all of its components (such as quadrupoles, RF cavities, and beam position monitors), in order to better preserve the beam quality during acceleration. After the survey of the main linac components, there are several beam-based alignment (BBA) techniques to be applied, to further optimize the beam trajectory and calculate the corresponding steering magnets strength. Among these techniques the most simple and straightforward one is the one-to-one (1-to-1) steering technique, which steers the beam from quad center to center, and removes the betatron oscillation from quad focusing. For a future linear collider such as the International Linear Collider (ILC), the initial beam emittance is very small in the vertical plane (flat beam with γε_y = 20-40 nm), which means the alignment requirement is very tight. In this note, we evaluate the emittance growth with one-to-one correction algorithm employed, both analytically and numerically. Then the ILC main linac accelerator is taken as an example to compare the vertical emittance growth after 1-to-1 steering, both from analytical formulae and multi-particle tracking simulation. It is demonstrated that the estimated emittance growth from the derived formulae agrees well with the results from numerical simulation, with and without acceleration, respectively.
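    The essence of 1-to-1 steering, set each corrector so the next BPM reads the quad center, can be sketched in a drift-plus-thin-kick toy lattice. The lattice, the quad offsets, and the linear response (the reading of BPM q+1 changes by L per unit kick at quad q) are simplifications for illustration, not the ILC model used in the note:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10
quad_offsets = rng.normal(0.0, 100e-6, n)   # quad misalignments [m]
L = 1.0                                      # drift between elements [m]

def track(kicks):
    # toy beamline: drift + thin corrector kick at each quad; the BPM at
    # quad q reads the beam position relative to the (misaligned) quad center
    x, xp, readings = 0.0, 0.0, []
    for q in range(n):
        x += L * xp                  # drift to quad q
        readings.append(x - quad_offsets[q])
        xp += kicks[q]               # thin corrector kick [rad]
    return readings

# 1-to-1 steering: one pass, setting each kick so the *next* BPM reads zero
kicks = [0.0] * n
for q in range(n - 1):
    r = track(kicks)
    kicks[q] -= r[q + 1] / L         # linear response of BPM q+1 to kick q
print(max(abs(v) for v in track(kicks)[1:]))
```

    After the pass, every correctable BPM reads its quad center; the residual dispersive and betatron effects that drive the emittance growth analyzed in the note are outside this toy model.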

  1. Draft Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site.

    Office of Environmental Management (EM)

    DOE/SRS-WD-2010-001, Revision 0: Draft Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site, September 30, 2010. Revision summary: Rev. 0, Initial Issue, 09/30/2010.

  2. Basics of Polar-Format algorithm for processing Synthetic Aperture Radar images.

    SciTech Connect (OSTI)

    Doerry, Armin Walter

    2012-05-01

    The purpose of this report is to provide a background to Synthetic Aperture Radar (SAR) image formation using the Polar Format Algorithm (PFA). It is meant as an aid to those tasked with implementing real-time image formation using the PFA.

  3. Experimental Results in the Comparison of Search Algorithms Used with Room Temperature Detectors

    SciTech Connect (OSTI)

    Guss, P.; Yuan, D.; Cutler, M.; Beller, D.

    2010-11-01

    Time-sequence data were analyzed for several higher-resolution scintillation detectors using a variety of search algorithms, and results were obtained predicting the relative performance of these detectors, including a slightly superior performance by CeBr{sub 3}. Analysis of several search algorithms shows that inclusion of the RSPRT methodology can improve sensitivity.

  4. U.S. Department of Energy, Oak Ridge Operations Office Nuclear Facility Safety Basis Fundamentals Self-Study Guide [Fulfills ORO Safety Basis Competency 1, 2 (Part 1), or 7 (Part 1)]

    Broader source: Energy.gov [DOE]

    "This self-study guide provides an overview of safety basis terminology, requirements, and activities that are applicable to DOE and Oak Ridge Operations Office (ORO) nuclear facilities on the Oak...

  5. Breckinridge Project, initial effort. Report XI, Volume V. Critical review of the design basis. [Critical review

    SciTech Connect (OSTI)

    1982-01-01

    Report XI, Technical Audit, is a compendium of research material used during the Initial Effort in making engineering comparisons and decisions. Volumes 4 and 5 of Report XI present those studies which provide a Critical Review of the Design Basis. The Critical Review Report, prepared by Intercontinental Econergy Associates, Inc., summarizes findings from an extensive review of the data base for the H-Coal process design. Volume 4 presents this review and assessment, and includes supporting material; specifically, Design Data Tabulation (Appendix A), Process Flow Sheets (Appendix B), and References (Appendix C). Volume 5 is a continuation of the references of Appendix C. Studies of a proprietary nature are noted and referenced, but are not included in these volumes. They are included in the Limited Access versions of these reports and may be reviewed by properly cleared personnel in the offices of Ashland Synthetic Fuels, Inc.

  6. Composition and Technical Basis for K Basin Settler Sludge Simulant for Inspection, Retrieval, and Pump Testing

    SciTech Connect (OSTI)

    Schmidt, Andrew J.; Zacher, Alan H.

    2007-06-25

    This report provides the formulation and technical basis for a K Basin Settler Tank sludge simulant that will be used by the K Basin Closure Project (KBC) to test and develop equipment and approaches for Settler Tank sludge level measurement and retrieval in a mock-up of the actual Settler Tanks. The sludge simulant may also be used to demonstrate that the TOYO high-pressure positive-displacement pump design (reversing valves and hollow balls) is suitable for transfer of Settler Tank sludge from the K West (KW) Basin to the Cold Vacuum Drying Facility (CVDF) (~500 ft). As requested by the K Basins Sludge Treatment Project (STP), the simulant comprises non-radioactive (and non-uranium) constituents.

  7. Micrometer-scale fabrication of complex three dimensional lattice + basis structures in silicon

    SciTech Connect (OSTI)

    Burckel, D. Bruce; Resnick, Paul J.; Finnegan, Patrick S.; Sinclair, Michael B.; Davids, Paul S.

    2015-01-01

    A complementary metal oxide semiconductor (CMOS) compatible version of membrane projection lithography (MPL) for fabrication of micrometer-scale three-dimensional structures is presented. The approach uses all inorganic materials and standard CMOS processing equipment. In a single layer, MPL is capable of creating all 5 2D-Bravais lattices. Furthermore, standard semiconductor processing steps can be used in a layer-by-layer approach to create fully three dimensional structures with any of the 14 3D-Bravais lattices. The unit cell basis is determined by the projection of the membrane pattern, with many degrees of freedom for defining functional inclusions. Here we demonstrate several unique structural motifs, and characterize 2D arrays of unit cells with split ring resonators in a silicon matrix. The structures exhibit strong polarization dependent resonances and, for properly oriented split ring resonators (SRRs), coupling to the magnetic field of a normally incident transverse electromagnetic wave, a response unique to 3D inclusions.

  8. Structural basis of GDP release and gating in G protein coupled Fe[superscript 2+] transport

    SciTech Connect (OSTI)

    Guilfoyle, Amy; Maher, Megan J.; Rapp, Mikaela; Clarke, Ronald; Harrop, Stephen; Jormakka, Mika

    2009-09-29

    G proteins are key molecular switches in the regulation of membrane protein function and signal transduction. The prokaryotic membrane protein FeoB is involved in G protein coupled Fe{sup 2+} transport, and is unique in that the G protein is directly tethered to the membrane domain. Here, we report the structure of the soluble domain of FeoB, including the G protein domain, and its assembly into an unexpected trimer. Comparisons between nucleotide free and liganded structures reveal the closed and open state of a central cytoplasmic pore, respectively. In addition, these data provide the first observation of a conformational switch in the nucleotide-binding G5 motif, defining the structural basis for GDP release. From these results, structural parallels are drawn to eukaryotic G protein coupled membrane processes.

  9. Criteria for calculating the efficiency of HEPA filters during and after design basis accidents

    SciTech Connect (OSTI)

    Bergman, W.; First, M.W.; Anderson, W.L.; Gilbert, H.; Jacox, J.W.

    1994-12-01

    We have reviewed the literature on the performance of high efficiency particulate air (HEPA) filters under normal and abnormal conditions to establish criteria for calculating the efficiency of HEPA filters in a DOE nonreactor nuclear facility during and after a Design Basis Accident (DBA). The literature review included the performance of new filters and parameters that may cause deterioration in the filter performance such as filter age, radiation, corrosive chemicals, seismic and rough handling, high temperature, moisture, particle clogging, high air flow and pressure pulses. The deterioration of the filter efficiency depends on the exposure parameters; in severe exposure conditions the filter will be structurally damaged and have a residual efficiency of 0%. Despite the many studies on HEPA filter performance under adverse conditions, there are large gaps and limitations in the data that introduce significant error in the estimates of HEPA filter efficiencies under DBA conditions. Because of this limitation, conservative values of filter efficiency were chosen when there was insufficient data.
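One detail relevant to such efficiency estimates, stated here as a generic illustration rather than the report's method: for filter stages in series it is the penetrations (1 - efficiency), not the efficiencies, that multiply. A minimal sketch with invented stage efficiencies:

```python
def overall_efficiency(stage_efficiencies):
    """Overall collection efficiency of HEPA stages in series.

    Penetration (1 - efficiency) multiplies across stages, so two
    99.9%-efficient stages pass only 0.001 * 0.001 of the particles.
    A structurally failed stage contributes a residual efficiency of 0.0.
    """
    penetration = 1.0
    for eff in stage_efficiencies:
        penetration *= (1.0 - eff)
    return 1.0 - penetration

two_good = overall_efficiency([0.999, 0.999])    # ~0.999999 overall
one_failed = overall_efficiency([0.0, 0.999])    # ~0.999: only stage 2 counts
```

This is why a 0%-efficiency (structurally damaged) stage, as described above for severe exposures, leaves the downstream stage as the only barrier.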

  10. Scientific basis for risk assessment and management of uranium mill tailings

    SciTech Connect (OSTI)

    Not Available

    1986-01-01

    A National Research Council study panel, convened by the Board on Radioactive Waste Management, has examined the scientific basis for risk assessment and management of uranium mill tailings and issued this final report containing a number of recommendations. Chapter 1 provides a brief introduction to the problem. Chapter 2 examines the processes of uranium extraction and the mechanisms by which radionuclides and toxic chemicals contained in the ore can enter the environment. Chapter 3 is devoted to a review of the evidence on health risks associated with radon and its decay products. Chapter 4 provides a consideration of conventional and possible new technical alternatives for tailings management. Chapter 5 explores a number of issues of comparative risk, provides a brief history of uranium mill tailings regulation, and concludes with a discussion of choices that must be made in mill tailing risk management. 211 refs., 30 figs., 27 tabs.

  11. Modeling node bandwidth limits and their effects on vector combining algorithms

    SciTech Connect (OSTI)

    Littlefield, R.J.

    1992-01-13

    Each node in a message-passing multicomputer typically has several communication links. However, the maximum aggregate communication speed of a node is often less than the sum of its individual link speeds. Such computers are called node bandwidth limited (NBL). The NBL constraint is important when choosing algorithms because it can change the relative performance of different algorithms that accomplish the same task. This paper introduces a model of communication performance for NBL computers and uses the model to analyze the overall performance of three algorithms for vector combining (global sum) on the Intel Touchstone DELTA computer. Each of the three algorithms is found to be at least 33% faster than the other two for some combinations of machine size and vector length. The NBL constraint is shown to significantly affect the conditions under which each algorithm is fastest.
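The effect of the NBL constraint can be sketched with a toy cost model; the paper's actual performance model is not reproduced in this abstract, and all bandwidth figures below are invented:

```python
def concurrent_send_time(n_msgs, msg_bytes, link_bw, node_bw):
    """Time for one node to push n_msgs equal messages at once.

    Without the NBL constraint each link runs at its full link_bw;
    with it, the node's aggregate rate is capped at node_bw.
    """
    aggregate_rate = min(n_msgs * link_bw, node_bw)
    return n_msgs * msg_bytes / aggregate_rate

# A node with 4 links of 10 MB/s each, but a 25 MB/s aggregate cap:
t_serial = 4 * concurrent_send_time(1, 1_000_000, 10e6, 25e6)  # one link at a time
t_parallel = concurrent_send_time(4, 1_000_000, 10e6, 25e6)    # all four at once
speedup = t_serial / t_parallel    # 2.5x, not the naive 4x
```

An algorithm that assumes all links run at full speed concurrently overestimates its advantage on an NBL machine, which is how the constraint changes the relative ranking of combining algorithms.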

  12. ASYMPTOTICALLY OPTIMAL HIGH-ORDER ACCURATE ALGORITHMS FOR THE SOLUTION OF CERTAIN ELLIPTIC PDEs

    SciTech Connect (OSTI)

    Leonid Kunyansky, PhD

    2008-11-26

    The main goal of the project, "Asymptotically Optimal, High-Order Accurate Algorithms for the Solution of Certain Elliptic PDE's" (DE-FG02-03ER25577), was to develop fast, high-order algorithms for the solution of scattering problems and of spectral problems in photonic crystal theory. The results we obtained lie in three areas: (1) asymptotically fast, high-order algorithms for the solution of eigenvalue problems of photonics; (2) fast, high-order algorithms for the solution of acoustic and electromagnetic scattering problems in inhomogeneous media; and (3) inversion formulas and fast algorithms for the inverse source problem for the acoustic wave equation, with applications to thermo- and opto-acoustic tomography.

  13. Benchmark Theoretical Study of the π-π Binding Energy in the Benzene Dimer

    SciTech Connect (OSTI)

    Miliordos, Evangelos; Apra, Edoardo; Xantheas, Sotiris S.

    2014-09-04

    We establish a new estimate for the interaction energy between two benzene molecules in the parallel displaced (PD) conformation by systematically converging (i) the intra- and intermolecular geometry at the minimum, (ii) the expansion of the orbital basis set, and (iii) the level of electron correlation. The calculations were performed at the second-order Møller-Plesset perturbation (MP2) and the Coupled Cluster including Singles, Doubles, and a perturbative estimate of Triples replacements [CCSD(T)] levels of electronic structure theory. At both levels of theory, including results corrected for Basis Set Superposition Error (BSSE), we estimated the Complete Basis Set (CBS) limit by employing the family of Dunning's correlation-consistent polarized valence basis sets. The largest MP2 calculation was performed with the cc-pV6Z basis set (2,772 basis functions), whereas the largest CCSD(T) calculation used the cc-pV5Z basis set (1,752 basis functions). The cluster geometries were optimized with basis sets up to quadruple-ζ quality; both their intra- and inter-molecular parts were found to be practically converged with the triple-ζ sets. The use of converged geometries plays an important role in obtaining accurate estimates of the CBS limits. Our results demonstrate that the binding energies with the families of the plain (cc-pVnZ) and augmented (aug-cc-pVnZ) sets converge [to within < 0.01 kcal/mol for MP2 and < 0.15 kcal/mol for CCSD(T)] to the same CBS limit. In addition, the average of the uncorrected and BSSE-corrected binding energies was found to converge to that CBS limit much faster than either of its two constituents.
Because the family of augmented basis sets (especially the larger ones) causes serious linear-dependency problems, the plain basis sets (for which no linear dependencies were found) are deemed a more efficient and straightforward path to an accurate CBS limit. We considered extrapolations of the uncorrected (ΔE) and BSSE-corrected binding energies, their average, as well as the average of the latter over the plain and augmented sets, with the cardinal number n of the basis set. Our best estimate of the CCSD(T)/CBS limit for the π-π interaction energy in the PD benzene dimer is De = 2.65 ± 0.02 kcal/mol. The best CCSD(T)/cc-pV5Z calculated value is 2.62 kcal/mol, just 0.03 kcal/mol away from the CBS limit. For comparison, the MP2/CBS limit estimate is 5.00 ± 0.01 kcal/mol, a 90% overbinding with respect to CCSD(T). The Spin-Component-Scaled (SCS) MP2 variant closely reproduces the CCSD(T) results for each basis set, while Scaled-Opposite-Spin (SOS) MP2 yields results that are too low compared with CCSD(T).
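Extrapolation with the cardinal number n is often done with a two-point inverse-cubic formula; the sketch below shows that generic scheme with made-up energies, not necessarily the extrapolation form used in this paper:

```python
def cbs_two_point(e_prev, e_curr, n_curr):
    """Two-point inverse-cubic CBS extrapolation.

    Assumes the basis-set incompleteness error decays as B / n**3 with
    the cardinal number n of the cc-pVnZ set, so two consecutive
    energies give:
        E_CBS = (n^3 E_n - (n-1)^3 E_{n-1}) / (n^3 - (n-1)^3)
    """
    n, m = n_curr, n_curr - 1
    return (n**3 * e_curr - m**3 * e_prev) / (n**3 - m**3)

# Illustrative (made-up) binding energies in kcal/mol for n = 4, 5:
e_qz, e_5z = -2.55, -2.60
e_cbs = cbs_two_point(e_qz, e_5z, 5)   # slightly below e_5z
```

The extrapolated value overshoots the largest finite-basis result by a fraction of the quadruple-to-quintuple increment, mirroring the small (0.03 kcal/mol) gap between cc-pV5Z and CBS quoted above.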

  14. Research in theoretical elementary particle physics at the University of Florida: Task A. Annual progress report

    SciTech Connect (OSTI)

    Field, R.D.; Ramond, P.M.; Sikivie, P.; Thorn, C.B.

    1994-12-01

    This is the Annual Progress Report of the theoretical particle theory group at the University of Florida under DOE Grant DE-FG05-86ER40272. At present our group consists of four Full Professors (Field, Ramond, Thorn, Sikivie), one Associate Professor (Woodard), and two Assistant Professors (Qiu, Kennedy). In addition, we have four postdoctoral research associates and seven graduate students. The research of our group covers a broad range of topics in theoretical high energy physics including both theory and phenomenology. Included in this report is a summary of the last several years, an outline of our current research program.

  15. Theoretical investigation of phase-controlled bias effect in capacitively coupled plasma discharges

    SciTech Connect (OSTI)

    Kwon, Deuk-Chul; Yoon, Jung-Sik [Convergence Plasma Research Center, National Fusion Research Institute, Daejeon 305-333 (Korea, Republic of)

    2011-07-15

    We theoretically investigated the effect of the phase difference between powered electrodes in capacitively coupled plasma (CCP) discharges. A previous experimental result showed that the plasma potential can be controlled by using a phase-shift controller in CCP discharges. In this work, based on previously developed radio-frequency sheath models, we developed a circuit model to self-consistently determine the bias voltage from the plasma parameters. Results show that the present theoretical model explains the experimental results quite well, and that there is an optimum value of the phase difference for which the V{sub dc}/V{sub pp} ratio becomes a minimum.

  16. Joe Grange Nov. 30 2012 FNAL Joint Experimental-Theoretical Seminar

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Joe Grange, Nov. 30, 2012, FNAL Joint Experimental-Theoretical Seminar. Outline: 1. Introduction: ν cross sections in a single-detector oscillation exp't; recent σ interest from MiniBooNE neutrino-mode data. 2. Anti-neutrino analyses: ν μ background (wrong signs); ν μ CCQE σ (new!); ν μ NCE σ (new!). 3. Outlook and summary.

  17. Advanced Test Reactor Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Lisa Harvego; Brion Bennett

    2011-11-01

    U.S. Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Advanced Test Reactor Complex facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. U.S. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool to develop the radioactive waste management basis.

  18. AUDIT REPORT Follow-up on Nuclear Safety: Safety Basis and Quality Assurance at the Los Alamos National

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Nuclear Safety: Safety Basis and Quality Assurance at the Los Alamos National Laboratory DOE/IG-0941 July 2015 U.S. Department of Energy Office of Inspector General Office of Audits and Inspections Department of Energy Washington, DC 20585 July 16, 2015 MEMORANDUM FOR THE SECRETARY FROM: Gregory H. Friedman Inspector General SUBJECT: INFORMATION: Audit Report: "Follow-up on Nuclear Safety: Safety Basis and Quality Assurance at the Los Alamos National Laboratory" BACKGROUND A primary

  19. Integrated theoretical and experimental study of the thermophysical properties of fluid mixtures: Annual report

    SciTech Connect (OSTI)

    Haynes, W.M.

    1986-09-01

    The objectives of this research are to assist the fuel and chemical process industries to improve efficiency, and thereby reduce the use of energy and feedstocks, and to aid them in the utilization of unconventional feedstocks and energy sources. These objectives are pursued through the following research efforts: (1) development of predictive procedures for the thermophysical properties of fluids and fluid mixtures; (2) basic understanding of fluid behavior with advances in theory; and (3) acquisition of experimental data to support the theoretical and modeling efforts. The most significant results of our recent research are as follows. Accurate, wide-range, and self-consistent PVT and phase-equilibria data are essential to the development of theoretically based predictive models for the behavior and properties of fluid mixtures; we have developed experimental techniques to screen and characterize such systems, as well as techniques for analyzing and reporting data for them. We have made significant progress, both experimental and theoretical, in determining the sensitivity of theoretical models to the effects of hydrogen bonding. We have learned that non-Newtonian behavior is universal and that common assumptions about fluid behavior must be treated with caution. Most importantly, we have learned that the effect of shear on fluid behavior and properties cannot be ignored. 11 refs., 7 figs., 1 tab.

  20. Final Report May 1, 2012 to May 31, 2015: "Theoretical Studies in Elementary Particle Physics"

    SciTech Connect (OSTI)

    Collins, John C.; Roiban, Radu

    2015-08-19

    This final report summarizes work at Penn State University from May 1, 2012 to May 31, 2015. The work was in theoretical elementary particle physics. Many new results in perturbative QCD, in string theory, and in related areas were obtained, with a substantial impact on the experimental program.

  1. An Experimental and Theoretical Multi-Mbar Study of Ti-6Al-4V

    SciTech Connect (OSTI)

    Tegner, B E; Macleod, S G; CYNN, H; Proctor, J; Evans, W J; McMahon, M I; Ackland, G J

    2011-04-13

    We report results from an experimental and theoretical study of the room temperature (RT) compression of the ternary alloy Ti-6Al-4V. In this work, we have extended knowledge of the equation of state (EOS) from 40 GPa to 221 GPa, and observed a different sequence of phase transitions to that reported previously for pure Ti.

  2. Final technical report, Symposium on New Theoretical Concepts and Directions in Catalysis

    SciTech Connect (OSTI)

    Metiu, Horia

    2014-08-22

    We organized, in August 2013, a Symposium on New Theoretical Concepts and Directions in Catalysis, with the participation of 20 invited distinguished quantum chemists and other researchers who use computations to study catalysis. Symposium website: http://catalysis.cnsi.ucsb.edu/

  3. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect (OSTI)

    Brouns, T.M.; Rohay, A.C. [Pacific Northwest National Laboratory, Richland, WA (United States); Youngs, R.R. [Geomatrix Consultants, Inc., Oakland, CA (United States); Costantino, C.J. [C.J. Costantino and Associates, Valley, NY (United States); Miller, L.F. [U.S. Department of Energy, Office of River Protection, Richland, WA (United States)

    2008-07-01

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. 
The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria. The characterization and analysis effort included 1) downhole measurements of the velocity properties (including uncertainties) of the basalt/interbed sequences, 2) confirmation of the geometry of the contact between the various basalt and interbedded sediments through examination of retrieved core from the core-hole and data collected through geophysical logging of each borehole, and 3) prediction of ground motion response to an earthquake using newly acquired and historic data. The data and analyses reflect a significant reduction in the uncertainty in shear wave velocities below the WTP and result in a significantly lower spectral acceleration (i.e., ground motion). The updated ground motion response analyses and corresponding design response spectra reflect a 25% lower peak horizontal acceleration than reflected in the 2005 design criteria. These results provide confidence that the WTP seismic design criteria are conservative. (authors)

  4. Beyond-Design-Basis-Accidents Passive Containment-Cooling Spray System

    SciTech Connect (OSTI)

    Karameldin, Aly; Temraz, Hassan M. Elsawy; Ibrahim, Nady Attia [Atomic Energy Authority (Egypt)

    2001-10-15

    The proposed safety feature considered in this study aims to increase the safety margins of nuclear power plants by proposed water tanks located inside or outside the upper zone of the containment to be utilized for (a) residual heat removal of the reactor in case of station blackout or in case of normal reactor shutdown and (b) beyond-design-basis accidents, in which core melt and debris-concrete interaction take place, associated with accumulative containment pressure increase and partial loss of the active systems. The proposed passive containment system can be implemented by a special mechanism, which can allow the pressurization of the water in the tanks and therefore can enable an additional spray system to start in case of increasing the containment pressure over a certain value just below the design pressure. A conservative case study is that of a Westinghouse 3411-MW(thermal) power station, where the proposed passive containment cooling spray system (PCCSS) will start at a pressure of 6 bars and terminate at a pressure of 3 bars. A one-dimensional lumped model is postulated to describe the thermal and hydraulic process behavior inside the containment after a beyond-design-basis accident. The considered parameters are the spray mass flow rate, the initial droplet diameters, fuel-cooling time, and the ultimate containment pressure. The overall heat and mass balance inside the containment are carried out, during both the containment depressurization (by the spraying system) and pressurization (by the residual energies). The results show that the design of the PCCSS is viable and has a capability to maintain the containment below the design pressure passively for the required grace period of 72 h. Design curves of the proposed PCCSS indicate the effect of the spray flow rate and cooling time on the total sprayed volume during the grace period of 72 h. 
From these curves it can be concluded that for the grace period of 72 h, the required tank volumes are 3800 and 4700 m{sup 3}, corresponding to fuel-cooling times (time after shutdown) of two weeks and one week, respectively. This large quantity of water serves as an ultimate heat sink available for the residual heat removal in the case of station blackout. The optimal spraying droplet diameter, travel, and mass flow rate are 3 mm, 30 m, and 100 to 125 kg/s, respectively.

  5. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions. Volume 1

    SciTech Connect (OSTI)

    Not Available

    1993-03-18

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much-needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, with sparse earthquake activity and rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exist from the region to empirically quantify their effects. Therefore, empirically based approaches that are used for other regions, such as Western North America, are not appropriate for Eastern North America. Moreover, recent advances in science and technology have made it possible to combine theoretical and empirical methods to develop new procedures and models for estimating ground motion. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. Specifically considered are magnitudes M from 5 to 8, distances from 0 to 500 km, and frequencies from 1 to 35 Hz.

  6. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    SciTech Connect (OSTI)

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants, functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations, and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. We describe how this methodology will enrich the findings from this phase of the project in subsequent phases and help identify metrics and focused studies for the determination of human-performance criteria that can be used to support the design process.

  7. Development of Variational Guiding Center Algorithms for Parallel Calculations in Experimental Magnetic Equilibria

    SciTech Connect (OSTI)

    Ellison, C. Leland; Finn, J. M.; Qin, H.; Tang, William M.

    2014-10-01

    Structure-preserving algorithms obtained via discrete variational principles exhibit strong promise for the calculation of guiding center test particle trajectories. The non-canonical Hamiltonian structure of the guiding center equations forms a novel and challenging context for geometric integration. To demonstrate the practical relevance of these methods, a prototypical variational midpoint algorithm is applied to an experimental magnetic equilibrium. The stability characteristics, conservation properties, and implementation requirements associated with the variational algorithms are addressed. Furthermore, computational run time is reduced for large numbers of particles by parallelizing the calculation on GPU hardware.
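As a hedged illustration of a midpoint-type discretization, the toy integrator below applies the implicit midpoint rule to a simple canonical system; the variational guiding-center algorithms described above treat noncanonical equations and are considerably more involved:

```python
import numpy as np

def implicit_midpoint_step(z, dt, f, iters=50):
    """One step of the implicit midpoint rule for z' = f(z):

        z_next = z + dt * f((z + z_next) / 2),

    solved here by simple fixed-point iteration (adequate for small dt).
    """
    z_next = z + dt * f(z)               # explicit Euler predictor
    for _ in range(iters):
        z_next = z + dt * f(0.5 * (z + z_next))
    return z_next

# Harmonic oscillator: H = (q^2 + p^2)/2, so f(q, p) = (p, -q).
f = lambda z: np.array([z[1], -z[0]])
z = np.array([1.0, 0.0])
for _ in range(1000):
    z = implicit_midpoint_step(z, 0.1, f)
energy = 0.5 * (z[0]**2 + z[1]**2)   # stays at ~0.5: no secular drift
```

The bounded long-time energy behavior, rather than raw per-step accuracy, is the kind of conservation property that motivates structure-preserving methods for test-particle trajectories.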

  8. DWPF Algorithm for Calculation of Source Terms and Consequences for EXCEL

    Energy Science and Technology Software Center (OSTI)

    1997-02-11

    The DWPFAST software application algorithm is an Excel spreadsheet, with optional macros, designed to calculate the radiological source terms and consequences of postulated accident progressions in non-reactor nuclear facilities (currently it is being used for DWPF). Upon input of a multi-character accident-progression identification code and basic facility data, the algorithm calculates individual accident-segment releases, overall facility releases, and radiological consequences for various receptors, for up to 13 individual radionuclides. The algorithm was designed to support probabilistic safety assessments (PSAs).
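Source-term calculations for non-reactor nuclear facilities commonly follow the five-factor formula of DOE-HDBK-3010; whether DWPFAST uses exactly this form is an assumption here, and the numbers below are purely illustrative:

```python
def source_term(mar, dr, arf, rf, lpf):
    """Respirable release via the standard five-factor formula
    (DOE-HDBK-3010): ST = MAR * DR * ARF * RF * LPF.

    mar : material at risk (e.g., Ci of a given radionuclide)
    dr  : damage ratio, fraction of the MAR affected by the accident
    arf : airborne release fraction
    rf  : respirable fraction of the airborne material
    lpf : leak path factor, fraction escaping the facility
    """
    return mar * dr * arf * rf * lpf

# Illustrative numbers only, not DWPF values:
st = source_term(mar=1.0e3, dr=0.1, arf=1e-3, rf=0.5, lpf=0.1)  # ~5e-3 Ci
```

A per-segment release of this form would be evaluated for each radionuclide and each accident segment, then summed into the overall facility release before applying receptor dose factors.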

  9. Supervisory Power Management Control Algorithms for Hybrid Electric Vehicles. A Survey

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Malikopoulos, Andreas

    2014-03-31

    The growing necessity for environmentally benign hybrid propulsion systems has led to the development of advanced power management control algorithms to maximize fuel economy and minimize pollutant emissions. This paper surveys the control algorithms for hybrid electric vehicles (HEVs) and plug-in HEVs (PHEVs) that have been reported in the literature to date. The exposition ranges from parallel, series, and power split HEVs and PHEVs and includes a classification of the algorithms in terms of their implementation and the chronological order of their appearance. Remaining challenges and potential future research directions are also discussed.

  10. A Linac Simulation Code for Macro-Particles Tracking and Steering Algorithm Implementation

    SciTech Connect (OSTI)

    Sun, Yipeng

    2012-05-03

    In this paper, a linac simulation code written in Fortran90 is presented and several simulation examples are given. The code is optimized to implement linac alignment and steering algorithms and to evaluate accelerator errors such as RF phase and acceleration-gradient errors and quadrupole and BPM misalignments. It can track a single particle or a bunch of particles through standard linear accelerator elements such as quadrupoles, RF cavities, dipole correctors, and drift spaces. A one-to-one steering algorithm and a global alignment (steering) algorithm are implemented in the code.
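The one-to-one steering idea, choosing corrector kicks so that every BPM reads zero, can be sketched with a small orbit-response matrix. The matrix values and units below are hypothetical, and this sketch is not the Fortran90 code described above:

```python
import numpy as np

def one_to_one_steering(response, readings):
    """One-to-one steering sketch: with one corrector per BPM and a square
    orbit-response matrix R (mm of BPM reading per mrad of corrector kick),
    find the kicks theta that cancel the measured orbit: R @ theta = -readings."""
    return np.linalg.solve(response, -readings)

# Hypothetical 3-corrector / 3-BPM response matrix. It is lower triangular
# because a corrector only affects BPMs downstream of it.
R = np.array([[2.0, 0.0, 0.0],
              [1.5, 2.0, 0.0],
              [1.0, 1.5, 2.0]])
x0 = np.array([0.4, -0.2, 0.3])       # initial BPM readings (mm)

theta = one_to_one_steering(R, x0)    # corrector kicks (mrad)
corrected = x0 + R @ theta            # orbit after applying the kicks
```

In a real lattice the response matrix comes from the tracking model; a global steering algorithm would instead minimize the orbit over all BPMs at once, typically by least squares rather than an exact solve.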

  11. Natural micropolymorphism in human leukocyte antigens provides a basis for genetic control of antigen recognition

    SciTech Connect (OSTI)

    Archbold, Julia K.; Macdonald, Whitney A.; Gras, Stephanie; Ely, Lauren K.; Miles, John J.; Bell, Melissa J.; Brennan, Rebekah M.; Beddoe, Travis; Wilce, Matthew C.J.; Clements, Craig S.; Purcell, Anthony W.; McCluskey, James; Burrows, Scott R.; Rossjohn, Jamie

    2009-07-10

    Human leukocyte antigen (HLA) gene polymorphism plays a critical role in protective immunity, disease susceptibility, autoimmunity, and drug hypersensitivity, yet the basis of how HLA polymorphism influences T cell receptor (TCR) recognition is unclear. We examined how a natural micropolymorphism in HLA-B44, an important and large HLA allelic family, affected antigen recognition. T cell-mediated immunity to an Epstein-Barr virus determinant (EENLLDFVRF) is enhanced when HLA-B*4405 is the presenting allotype compared with HLA-B*4402 or HLA-B*4403, each of which differs by just one amino acid. The micropolymorphism in these HLA-B44 allotypes altered the mode of binding and dynamics of the bound viral epitope. The structure of the TCR-HLA-B*4405-EENLLDFVRF complex revealed that peptide flexibility was a critical parameter in enabling preferential engagement with HLA-B*4405 in comparison to HLA-B*4402/03. Accordingly, major histocompatibility complex (MHC) polymorphism can alter the dynamics of the peptide-MHC landscape, resulting in fine-tuning of T cell responses between closely related allotypes.

  12. Using compliance audits as the basis for developing an effective mechanical integrity program

    SciTech Connect (OSTI)

    Kiihne, E.J.; Mannan, M. [RMT/Jones and Neuse, Inc., Austin, TX (United States)]

    1996-08-01

    The OSHA Process Safety Management (PSM) rule requires all covered facilities to conduct a compliance audit every three years. In addition, all the audit findings must be resolved within a reasonable time period. The process industry as a whole is lagging behind in compliance with mechanical integrity program requirements, as demonstrated by the high number of OSHA citations issued to date on mechanical integrity-related issues. This paper analyzes the findings of several PSM compliance audits and develops recommendations for effective mechanical integrity programs. The six explicit requirements of mechanical integrity, i.e., covered equipment, written procedures, training, inspection and testing, equipment deficiencies, and quality assurance, are analyzed in the following manner: the number of OSHA citations in mechanical integrity and the distribution of these citations among the six specific requirements; comparison of the OSHA citations with the audit findings from PSM compliance audits conducted by the authors; and use of the conclusions from the OSHA citations and PSM compliance audits as a basis for developing effective mechanical integrity programs.

  13. Micrometer-scale fabrication of complex three dimensional lattice + basis structures in silicon

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Burckel, D. Bruce; Resnick, Paul J.; Finnegan, Patrick S.; Sinclair, Michael B.; Davids, Paul S.

    2015-01-01

    A complementary metal oxide semiconductor (CMOS) compatible version of membrane projection lithography (MPL) for fabrication of micrometer-scale three-dimensional structures is presented. The approach uses all inorganic materials and standard CMOS processing equipment. In a single layer, MPL is capable of creating all five 2D Bravais lattices. Furthermore, standard semiconductor processing steps can be used in a layer-by-layer approach to create fully three-dimensional structures with any of the 14 3D Bravais lattices. The unit cell basis is determined by the projection of the membrane pattern, with many degrees of freedom for defining functional inclusions. Here we demonstrate several unique structural motifs and characterize 2D arrays of unit cells with split ring resonators (SRRs) in a silicon matrix. The structures exhibit strong polarization-dependent resonances and, for properly oriented SRRs, coupling to the magnetic field of a normally incident transverse electromagnetic wave, a response unique to 3D inclusions.

  14. Modular High Temperature Gas-Cooled Reactor Safety Basis and Approach

    SciTech Connect (OSTI)

    David Petti; Jim Kinsey; Dave Alberstein

    2014-01-01

    Various international efforts are underway to assess the safety of advanced nuclear reactor designs. For example, the International Atomic Energy Agency has recently held its first Consultancy Meeting on a new cooperative research program on high temperature gas-cooled reactor (HTGR) safety. Furthermore, the Generation IV International Forum Reactor Safety Working Group has recently developed a methodology, called the Integrated Safety Assessment Methodology, for use in Generation IV advanced reactor technology development, design, and design review. A risk and safety assessment white paper is under development with respect to the Very High Temperature Reactor to pilot the Integrated Safety Assessment Methodology and to demonstrate its validity and feasibility. To support such efforts, this information paper on the modular HTGR safety basis and approach has been prepared. The paper provides a summary level introduction to HTGR history, public safety objectives, inherent and passive safety features, radionuclide release barriers, functional safety approach, and risk-informed safety approach. The information in this paper is intended to further the understanding of the modular HTGR safety approach. The paper gives those involved in the assessment of advanced reactor designs an opportunity to assess an advanced design that has already received extensive review by regulatory authorities and to judge the utility of recently proposed new methods for advanced reactor safety assessment such as the Integrated Safety Assessment Methodology.

  15. Human-system Interfaces to Automatic Systems: Review Guidance and Technical Basis

    SciTech Connect (OSTI)

    O'Hara, J.M.; Higgins, J.C.

    2010-01-31

    Automation has become ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Beyond the control of plant functions and systems, automation is applied to a wide range of additional functions including monitoring and detection, situation assessment, response planning, response implementation, and interface management. Automation has become a 'team player' supporting plant personnel in nearly all aspects of plant operation. In light of the increasing use and importance of automation in new and future plants, guidance is needed to enable the NRC staff to conduct safety reviews of the human factors engineering (HFE) aspects of modern automation. The objective of the research described in this report was to develop guidance for reviewing the operator's interface with automation. We first developed a characterization of the important HFE aspects of automation based on how it is implemented in current systems. The characterization included five dimensions: level of automation, function of automation, modes of automation, flexibility of allocation, and reliability of automation. Next, we reviewed literature pertaining to the effects of these aspects of automation on human performance and the design of human-system interfaces (HSIs) for automation. Then, we used the technical basis established by the literature to develop design review guidance. The guidance is divided into the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, we identified insights into the automation design process, operator training, and operations.

  16. Application of Radial Basis Functional Link Networks to Exploration for Proterozoic Mineral Deposits in Central Iran

    SciTech Connect (OSTI)

    Behnia, Pouran [Geological Survey of Iran, Geomatics Department (Iran, Islamic Republic of)], E-mail: pouranb@yahoo.com

    2007-06-15

    The metallogeny of Central Iran is characterized mainly by the presence of several iron, apatite, and uranium deposits of Proterozoic age. Radial Basis Functional Link Networks (RBFLN) were used as a data-driven method for GIS-based predictive mapping of Proterozoic mineralization in this area. To generate the input data for RBFLN, evidential maps comprising stratigraphic, structural, geophysical, and geochemical data were used. Fifty-eight deposits and 58 'nondeposits' were used to train the network. The neural network operations employed in this study involve both multiclass and binary representations of the evidential maps. Running RBFLN on different input data showed that an increase in the number of evidential maps and classes leads to a larger classification sum of squared errors (SSE). As a whole, an increase in the number of iterations resulted in improvement of the training SSE. The results of applying RBFLN showed that successful classification depends on the existence of spatially well-distributed deposits and nondeposits throughout the study area.
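A plain radial basis function layer with least-squares output weights conveys the flavor of the RBFLN approach. This sketch uses synthetic stand-ins for the deposit/nondeposit training vectors and a crude center choice; it is not the study's actual network or evidential data:

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial-basis activations for each (sample, center) pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Synthetic 2-class problem standing in for deposit/nondeposit
# evidence vectors; labels come from a hypothetical linear rule.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

centers = X[:10]                              # crude center choice
Phi = rbf_design(X, centers, width=1.0)       # hidden-layer activations
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear output weights
pred = (Phi @ w > 0.5).astype(float)          # threshold at 0.5
accuracy = (pred == y).mean()
```

The actual RBFLN adds direct input-to-output ("functional link") connections alongside the radial basis layer and is trained iteratively, which is why the abstract tracks SSE against iteration count.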

  17. Application of the MELCOR code to design basis PWR large dry containment analysis.

    SciTech Connect (OSTI)

    Phillips, Jesse; Notafrancesco, Allen (USNRC, Office of Nuclear Regulatory Research, Rockville, MD); Tills, Jack Lee (Jack Tills & Associates, Inc., Sandia Park, NM)

    2009-05-01

    The MELCOR computer code has been developed by Sandia National Laboratories under USNRC sponsorship to provide capability for independently auditing analyses submitted by reactor manufacturers and utilities. MELCOR is a fully integrated code (encompassing the reactor coolant system and the containment building) that models the progression of postulated accidents in light water reactor power plants. To assess the adequacy of containment thermal-hydraulic modeling incorporated in the MELCOR code for application to PWR large dry containments, several selected demonstration designs were analyzed. This report documents MELCOR code demonstration calculations performed for postulated design basis accident (DBA) analysis (LOCA and MSLB) inside containment, which are compared to other code results. The key processes when analyzing the containment loads inside PWR large dry containments are (1) expansion and transport of high mass/energy releases, (2) heat and mass transfer to structural passive heat sinks, and (3) containment pressure reduction due to engineered safety features. A code-to-code benchmarking for DBA events showed that MELCOR predictions of maximum containment loads were equivalent to similar predictions using a qualified containment code known as CONTAIN. This equivalency was found to apply for both single- and multi-cell containment models.

  18. Technical Basis for Radiological Emergency Plan Annex for WTD Emergency Response Plan: West Point Treatment Plant

    SciTech Connect (OSTI)

    Hickey, Eva E.; Strom, Daniel J.

    2005-08-01

    Staff of the King County Wastewater Treatment Division (WTD) have concern about the aftermath of a radiological dispersion event (RDE) leading to the introduction of significant quantities of radioactive material into the combined sanitary and storm sewer system in King County, Washington. Radioactive material could come from the use of a radiological dispersion device (RDD). RDDs include "dirty bombs" that are not nuclear detonations but are explosives designed to spread radioactive material (National Council on Radiation Protection and Measurements (NCRP) 2001). Radioactive material also could come from deliberate introduction or dispersion of radioactive material into the environment, including waterways and water supply systems. This document, Volume 3 of PNNL-15163, is the technical basis for the Annex to the West Point Treatment Plant (WPTP) Emergency Response Plan related to responding to a radiological emergency at the WPTP. The plan primarily considers response to radioactive material that has been introduced into the combined sanitary and storm sewer system from a radiological dispersion device, but is applicable to any accidental or deliberate introduction of materials into the system.

  19. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect (OSTI)

    Brouns, Thomas M.; Rohay, Alan C.; Youngs, Robert R.; Costantino, Carl J.; Miller, Lewis F.

    2008-02-28

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were re-evaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria this past summer, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities.

  20. Accurate ab initio-based adiabatic global potential energy surface for the 2{sup 2}A″ state of NH{sub 2} by extrapolation to the complete basis set limit

    SciTech Connect (OSTI)

    Li, Y. Q.; Ma, F. C.; Sun, M. T.

    2013-10-21

    A full three-dimensional global potential energy surface is reported for the first time for the title system, which is important for photodissociation processes. It is obtained using double many-body expansion theory and an extensive set of accurate ab initio energies extrapolated to the complete basis set limit. The surface can be recommended for dynamics studies of the N({sup 2}D) + H{sub 2} reaction, for reliable theoretical treatments of the photodissociation dynamics, and as a building block for constructing the double many-body expansion potential energy surface of larger nitrogen/hydrogen-containing systems. In turn, a preliminary theoretical study of the reaction N({sup 2}D)+H{sub 2}(X{sup 1}Σ{sub g}{sup +})(ν=0,j=0)→NH(a{sup 1}Δ)+H({sup 2}S) has been carried out with the quasi-classical trajectory method on the new potential energy surface. Integral cross sections and thermal rate constants have been calculated, providing perhaps the most reliable estimates of both quantities known thus far for this reaction.
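Complete-basis-set extrapolation is commonly done with a two-point inverse-cubic form. The sketch below shows one common convention (Helgaker-style, E(X) = E_CBS + A/X^3) with hypothetical correlation energies; it is not necessarily the extrapolation scheme used for this NH{sub 2} surface:

```python
def cbs_two_point(e_x, x, e_y, y):
    """Two-point inverse-cubic extrapolation of the correlation energy.
    Assumes E(X) = E_CBS + A / X**3 for cardinal numbers X; solving the
    two resulting equations for E_CBS eliminates the constant A:
        E_CBS = (X^3 E(X) - Y^3 E(Y)) / (X^3 - Y^3)."""
    x3, y3 = x ** 3, y ** 3
    return (x3 * e_x - y3 * e_y) / (x3 - y3)

# Hypothetical correlation energies (hartree) at cardinal numbers
# X = 3 (e.g. aug-cc-pVTZ) and X = 4 (e.g. aug-cc-pVQZ):
e_cbs = cbs_two_point(-0.250, 3, -0.260, 4)
# The extrapolated value lies below both finite-basis energies, as the
# correlation energy converges monotonically from above.
```

The Hartree-Fock component is usually extrapolated separately (often with an exponential form), and the total CBS energy is the sum of the two extrapolated pieces.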