National Library of Energy BETA

Sample records for algorithm theoretical basis

  1. Theoretical Basis for the Design of a DWPF Evacuated Canister

    SciTech Connect (OSTI)

    Routt, K.R.

    2001-09-17

    This report provides the theoretical bases for use of an evacuated canister for draining a glass melter. Design recommendations are also presented to ensure satisfactory performance in future tests of the concept.

  2. A Decision Theoretic Approach to Evaluate Radiation Detection Algorithms

    SciTech Connect (OSTI)

    Nobles, Mallory A.; Sego, Landon H.; Cooley, Scott K.; Gosink, Luke J.; Anderson, Richard M.; Hays, Spencer E.; Tardiff, Mark F.

    2013-07-01

There are a variety of sensor systems deployed at U.S. border crossings and ports of entry that scan for illicit nuclear material. In this work, we develop a framework for comparing the performance of detection algorithms that interpret the output of these scans and determine when secondary screening is needed. We optimize each algorithm to minimize its risk, or expected loss. We measure an algorithm's risk by considering its performance over a sample, the probability distribution of threat sources, and the consequence of detection errors. While it is common to optimize algorithms by fixing one error rate and minimizing another, our framework allows one to simultaneously consider multiple types of detection errors. Our framework is flexible and easily adapted to many different assumptions regarding the probability of a vehicle containing illicit material, and the relative consequences of false positive and false negative errors. Our methods can therefore inform decision makers of the algorithm family and parameter values that best reduce the threat from illicit nuclear material, given their understanding of the environment at any point in time. To illustrate the applicability of our methods, in this paper, we compare the risk from two families of detection algorithms and discuss the policy implications of our results.
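The risk criterion described in this abstract — expected loss combining the threat prior with the costs of each error type — can be sketched numerically. Everything below (score distributions, prior, costs) is an invented illustration, not data or values from the paper:

```python
import numpy as np

def expected_loss(threshold, scores_benign, scores_threat,
                  p_threat, cost_fp, cost_fn):
    """Risk = P(no threat)*C_FP*FPR + P(threat)*C_FN*FNR."""
    fpr = np.mean(scores_benign >= threshold)   # false-positive rate
    fnr = np.mean(scores_threat < threshold)    # false-negative rate
    return (1 - p_threat) * cost_fp * fpr + p_threat * cost_fn * fnr

rng = np.random.default_rng(0)
benign = rng.normal(0.0, 1.0, 10_000)    # detector scores, no threat present
threat = rng.normal(3.0, 1.0, 10_000)    # detector scores, threat present

# Sweep the decision threshold and pick the one minimizing expected loss.
thresholds = np.linspace(-2, 5, 200)
risks = [expected_loss(t, benign, threat,
                       p_threat=1e-3, cost_fp=1.0, cost_fn=500.0)
         for t in thresholds]
best = thresholds[int(np.argmin(risks))]
print(f"risk-minimizing threshold = {best:.2f}")
```

Because both error rates enter one objective, changing the assumed prior or the cost ratio shifts the optimal threshold directly, rather than requiring one error rate to be fixed in advance.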

  3. TU-F-18A-04: Use of An Image-Based Material-Decomposition Algorithm for Multi-Energy CT to Determine Basis Material Densities

    SciTech Connect (OSTI)

    Li, Z; Leng, S; Yu, L; McCollough, C

    2014-06-15

Purpose: Published methods for image-based material decomposition with multi-energy CT images have required the assumption of volume conservation or accurate knowledge of the x-ray spectra and detector response. The purpose of this work was to develop an image-based material-decomposition algorithm that can overcome these limitations. Methods: An image-based material decomposition algorithm was developed that requires only mass conservation (rather than volume conservation). With this method, using multi-energy CT measurements made with n=4 energy bins, the mass density of each basis material and of the mixture can be determined without knowledge of the tube spectra and detector response. A digital phantom containing 12 samples of mixtures of water, calcium, iron, and iodine was used in the simulation (Siemens DRASIM). The calibration was performed by using pure materials at each energy bin. The accuracy of the technique was evaluated in noise-free and noisy data under the assumption of an ideal photon-counting detector. Results: Basis material densities can be estimated accurately by either theoretic calculation or calibration with known pure materials. The calibration approach requires no prior information about the spectra and detector response. Regression analysis of theoretical values versus estimated values results in excellent agreement for both noise-free and noisy data. For the calibration approach, the R-square values are 0.9960 ± 0.0025 and 0.9476 ± 0.0363 for noise-free and noisy data, respectively. Conclusion: From multi-energy CT images with n=4 energy bins, the developed image-based material decomposition method accurately estimated the densities of 4 basis materials (3 without a k-edge and 1 with a k-edge in the range of the simulated energy bins) even without any prior information about spectra and detector response. This method is applicable to mixtures of solutions and dissolvable materials, where volume conservation assumptions do not apply.
CHM receives research support from NIH and Siemens Healthcare.
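The calibration-then-decomposition idea in this abstract can be illustrated with a minimal linear model: measurements at 4 energy bins are treated as a linear function of the 4 basis-material mass densities, the mixing matrix is recovered column-by-column from pure-material "scans," and a mixture is decomposed by least squares. This is a hypothetical sketch, not the authors' algorithm; all matrices and densities are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unknown-to-us mixing matrix: CT number per unit mass density for
# (water, calcium, iron, iodine) at each of 4 energy bins.
A_true = rng.uniform(0.5, 3.0, size=(4, 4))

# Calibration: "scan" each pure material at a known density of 1 g/cm^3.
# Each pure-material scan reveals one column of the mixing matrix.
A_cal = np.column_stack([A_true @ e for e in np.eye(4)])

# "Measure" a mixture and recover its basis-material densities.
rho_true = np.array([0.9, 0.1, 0.05, 0.02])     # g/cm^3, illustrative
measurement = A_true @ rho_true
rho_est, *_ = np.linalg.lstsq(A_cal, measurement, rcond=None)
print(np.round(rho_est, 4))
```

The point of the calibration step is that no explicit model of the tube spectrum or detector response is needed: the pure-material measurements themselves supply the forward model.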

  4. Theoretical Biology and Biophysics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Theoretical Biology and Biophysics: modeling biological systems, and analysis and informatics of molecular and cellular biological data. Mathematical Biology/Immunology: fundamental science and modeling enabling host-pathogen science. Computational Structural Biology: understanding biomolecule function by determining 3-D macromolecular structure. Bioinformatics and Epidemiology: sequence analysis using advanced algorithms, software, and computational hardware. Contacts: Group Leader Nick Hengartner.

  5. Theoretical Division

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Theoretical Division. Theoretical research encompasses all disciplines of science: Physics and Chemistry of Materials; Nuclear and Particle Physics, Astrophysics and Cosmology; Fluid Dynamics and Solid Mechanics; Physics of Condensed Matter and Complex Systems; Applied Mathematics and Plasma Physics; Theoretical Biology and Biophysics. Contacts: Division Leader Jack Shlachter; Deputy Division Leader (Acting) Anna Hayes-Sterbenz; Point of Contact Jenny Esch, (505) 667-4401.

  6. Theoretical Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

HEP Theoretical Physics: understanding discoveries at the Energy, Intensity, and Cosmic Frontiers. Contacts: Rajan Gupta, (505) 667-7664; Bruce Carlsten, (505) 667-5657. HEP Theory at Los Alamos: The Theoretical High Energy Physics group at Los Alamos National Laboratory is active in a number of diverse areas of research. Its primary areas of interest are physics beyond the Standard Model, cosmology, dark matter, lattice quantum chromodynamics, neutrinos, and the fundamentals of

  7. Improved multiprocessor garbage collection algorithms

    SciTech Connect (OSTI)

    Newman, I.A.; Stallard, R.P.; Woodward, M.C.

    1983-01-01

    Outlines the results of an investigation of existing multiprocessor garbage collection algorithms and introduces two new algorithms which significantly improve some aspects of the performance of their predecessors. The two algorithms arise from different starting assumptions. One considers the case where the algorithm will terminate successfully whatever list structure is being processed and assumes that the extra data space should be minimised. The other seeks a very fast garbage collection time for list structures that do not contain loops. Results of both theoretical and experimental investigations are given to demonstrate the efficacy of the algorithms. 7 references.

  8. An algorithm for nonrelativistic quantum-mechanical finite-nuclear-mass variational calculations of nitrogen atom in L = 0, M = 0 states using all-electrons explicitly correlated Gaussian basis functions

    SciTech Connect (OSTI)

    Sharkey, Keeper L.; Adamowicz, Ludwik; Department of Physics, University of Arizona, Tucson, Arizona 85721

    2014-05-07

An algorithm for quantum-mechanical nonrelativistic variational calculations of L = 0 and M = 0 states of atoms with an arbitrary number of s electrons and with three p electrons has been implemented and tested in calculations of the ground {sup 4}S state of the nitrogen atom. The spatial part of the wave function is expanded in terms of all-electron explicitly correlated Gaussian functions with the appropriate pre-exponential Cartesian angular factors for states with the L = 0 and M = 0 symmetry. The algorithm includes formulas for calculating the Hamiltonian and overlap matrix elements, as well as formulas for calculating the analytic energy gradient determined with respect to the Gaussian exponential parameters. The gradient is used in the variational optimization of these parameters. The Hamiltonian used in the approach is obtained by rigorously separating the center-of-mass motion from the laboratory-frame all-particle Hamiltonian, and thus it explicitly depends on the finite mass of the nucleus. With that, the mass effect on the total ground-state energy is determined.

  9. Safety Basis Report

    SciTech Connect (OSTI)

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  10. Technical Planning Basis

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2007-07-11

    The Guide assists DOE/NNSA field elements and operating contractors in identifying and analyzing hazards at facilities and sites to provide the technical planning basis for emergency management programs. Supersedes DOE G 151.1-1, Volume 2.

  11. Algorithms for builder guidelines

    SciTech Connect (OSTI)

    Balcomb, J.D.; Lekov, A.B.

    1989-06-01

The Builder Guidelines are designed to make simple, appropriate guidelines available to builders for their specific localities. Builders may select from passive solar and conservation strategies with different performance potentials. They can then compare the calculated results for their particular house design with a typical house in the same location. Algorithms used to develop the Builder Guidelines are described. The main algorithms used are the monthly solar load ratio (SLR) method for winter heating, the diurnal heat capacity (DHC) method for temperature swing, and a new simplified calculation method (McCool) for summer cooling. This paper applies the algorithms to estimate the performance potential of passive solar strategies, and the annual heating and cooling loads of various combinations of conservation and passive solar strategies. The basis of the McCool method is described. All three methods are implemented in a microcomputer program used to generate the guideline numbers. Guidelines for Denver, Colorado, are used to illustrate the results. The structure of the guidelines and worksheet booklets is also presented. 5 refs., 3 tabs.

  12. Theoretical manual for DYNA3D

    SciTech Connect (OSTI)

    Hallquist, J.O.

    1983-03-01

    This report provides a theoretical manual for DYNA3D, a vectorized explicit three-dimensional finite element code for analyzing the large deformation dynamic response of inelastic solids. A contact-impact algorithm that permits gaps and sliding along material interfaces is described. By a specialization of this algorithm, such interfaces can be rigidly tied to admit variable zoning without the need of transition regions. Spatial discretization is achieved by the use of 8-node solid elements, and the equations-of-motion are integrated by the central difference method. DYNA3D is operational on the CRAY-1 and CDC7600 computers.

  13. Library of Continuation Algorithms

    Energy Science and Technology Software Center (OSTI)

    2005-03-01

LOCA (Library of Continuation Algorithms) is scientific software written in C++ that provides advanced analysis tools for nonlinear systems. In particular, it provides parameter continuation algorithms, bifurcation tracking algorithms, and drivers for linear stability analysis. The algorithms are aimed at large-scale applications that use Newton's method for their nonlinear solve.
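To illustrate what parameter continuation means in the simplest case (this is a generic textbook sketch, not LOCA's implementation): step a parameter, and at each step re-solve the nonlinear system with Newton's method, warm-started from the previous solution. The scalar system f(x, lam) = x^3 - x + lam is an assumption chosen for illustration:

```python
import numpy as np

def newton(f, dfdx, x0, tol=1e-12, max_iter=50):
    """Basic scalar Newton iteration."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton did not converge")

f = lambda x, lam: x**3 - x + lam
dfdx = lambda x, lam: 3 * x**2 - 1

x = 1.0                                   # known solution of f(x, 0) = 0
branch = []
for lam in np.linspace(0.0, 0.3, 31):     # natural continuation in lam
    x = newton(lambda y: f(y, lam), lambda y: dfdx(y, lam), x)
    branch.append((lam, x))

print(f"x at lam = 0.3: {branch[-1][1]:.4f}")
```

Warm-starting from the previous solution keeps each Newton solve in its basin of convergence; more sophisticated schemes (pseudo-arclength continuation, as used for bifurcation tracking) additionally let the branch be followed around turning points, where this naive parameterization fails.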

  14. Exploratory Development of Theoretical Methods | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Exploratory Development of Theoretical Methods. Research highlights: Calculating Plutonium and Praseodymium Structural Transformations; Genetic Algorithm for Grain Boundary and Crystal Structure Predictions; Universal Dynamical Decoupling of a Single Solid-state Spin from a Spin Bath. Modeling: The purpose of this FWP is to generate new theories, models, and algorithms that will be beneficial to the research programs at the Ames

  15. Basis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

operator bispectral analysis. D. A. Baver,1 P. W. Terry,1 and C. Holland2. 1Department of Physics, University of Wisconsin, Madison, Wisconsin 53706, USA; 2Center for Energy Research, University of California, San Diego, California 92093, USA. (Received 11 September 2008; accepted 12 February 2009; published online 30 March 2009.) A new procedure for calculating model coefficients from fluctuation data for fully developed turbulence is derived. This procedure differs from previous related

  16. Basis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

that this equation is a difference-equation representation in the temporal domain of a first-order-in-time nonlinear partial differential equation. The coefficient L_k...

  17. ALGORITHM FOR ACCNT

    Energy Science and Technology Software Center (OSTI)

    002651IBMPC00 Algorithm for Accounting for the Interactions of Multiple Renewable Energy Technologies in Estimation of Annual Performance

  18. CRAD, NNSA- Safety Basis (SB)

    Broader source: Energy.gov [DOE]

    CRAD for Safety Basis (SB). Criteria Review and Approach Documents (CRADs) that can be used to conduct a well-organized and thorough assessment of elements of safety and health programs.

  19. Basis functions for electronic structure calculations on spheres

    SciTech Connect (OSTI)

Gill, Peter M. W.; Loos, Pierre-François; Agboola, Davids

    2014-12-28

    We introduce a new basis function (the spherical Gaussian) for electronic structure calculations on spheres of any dimension D. We find general expressions for the one- and two-electron integrals and propose an efficient computational algorithm incorporating the Cauchy-Schwarz bound. Using numerical calculations for the D = 2 case, we show that spherical Gaussians are more efficient than spherical harmonics when the electrons are strongly localized.

  20. The Basis Code Development System

    Energy Science and Technology Software Center (OSTI)

    1994-03-15

BASIS9.4 is a system for developing interactive computer programs in Fortran, with some support for C and C++ as well. Using BASIS9.4 you can create a program that has a sophisticated programming language as its user interface so that the user can set, calculate with, and plot all the major variables in the program. The program author writes only the scientific part of the program; BASIS9.4 supplies an environment in which to exercise that scientific programming, which includes an interactive language, an interpreter, graphics, terminal logs, error recovery, macros, saving and retrieving variables, formatted I/O, and online documentation.

  1. Safety Basis Information System | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Safety Basis Information System. Safety Basis Report (Public Access): click on the above link to see the current Safety Basis report. This report provides a list of all DOE nuclear facilities with the safety basis status, hazard categorization, and safety basis type. Safety Basis Login: click on the above link to log in to the Safety Basis web interface ("RESTRICTED; access only to DOE and DOE contractors"). Safety Basis Account Request: click on the above link to

  2. Authorization basis requirements comparison report

    SciTech Connect (OSTI)

    Brantley, W.M.

    1997-08-18

The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  3. Hanford Generic Interim Safety Basis

    SciTech Connect (OSTI)

    Lavender, J.C.

    1994-09-09

    The purpose of this document is to identify WHC programs and requirements that are an integral part of the authorization basis for nuclear facilities that are generic to all WHC-managed facilities. The purpose of these programs is to implement the DOE Orders, as WHC becomes contractually obligated to implement them. The Hanford Generic ISB focuses on the institutional controls and safety requirements identified in DOE Order 5480.23, Nuclear Safety Analysis Reports.

  4. OSR encapsulation basis -- 100-KW

    SciTech Connect (OSTI)

    Meichle, R.H.

    1995-01-27

    The purpose of this report is to provide the basis for a change in the Operations Safety Requirement (OSR) encapsulated fuel storage requirements in the 105 KW fuel storage basin which will permit the handling and storing of encapsulated fuel in canisters which no longer have a water-free space in the top of the canister. The scope of this report is limited to providing the change from the perspective of the safety envelope (bases) of the Safety Analysis Report (SAR) and Operations Safety Requirements (OSR). It does not change the encapsulation process itself.

  5. OpenEIS Algorithms

    Energy Science and Technology Software Center (OSTI)

    2013-07-29

    The OpenEIS Algorithm package seeks to provide a low-risk path for building owners, service providers and managers to explore analytical methods for improving building control and operational efficiency. Users of this software can analyze building data, and learn how commercial implementations would provide long-term value. The code also serves as a reference implementation for developers who wish to adapt the algorithms for use in commercial tools or service offerings.

  6. Scalable Methods for Electronic Excitations and Optical Responses in Nanostructures: Mathematics to Algorithms to Observables

    SciTech Connect (OSTI)

    James R. Chelikowsky

    2009-03-31

    The work reported here took place at the University of Minnesota from September 15, 2003 to November 14, 2005. This funding resulted in 10 invited articles or book chapters, 37 articles in refereed journals and 13 invited talks. The funding helped train 5 PhD students. The research supported by this grant focused on developing theoretical methods for predicting and understanding the properties of matter at the nanoscale. Within this regime, new phenomena occur that are characteristic of neither the atomic limit, nor the crystalline limit. Moreover, this regime is crucial for understanding the emergence of macroscopic properties such as ferromagnetism. For example, elemental Fe clusters possess magnetic moments that reside between the atomic and crystalline limits, but the transition from the atomic to the crystalline limit is not a simple interpolation between the two size regimes. To capitalize properly on predicting such phenomena in this transition regime, a deeper understanding of the electronic, magnetic and structural properties of matter is required, e.g., electron correlation effects are enhanced within this size regime and the surface of a confined system must be explicitly included. A key element of our research involved the construction of new algorithms to address problems peculiar to the nanoscale. Typically, one would like to consider systems with thousands of atoms or more, e.g., a silicon nanocrystal that is 7 nm in diameter would contain over 10,000 atoms. Previous ab initio methods could address systems with hundreds of atoms whereas empirical methods can routinely handle hundreds of thousands of atoms (or more). However, these empirical methods often rely on ad hoc assumptions and lack incorporation of structural and electronic degrees of freedom. The key theoretical ingredients in our work involved the use of ab initio pseudopotentials and density functional approaches. 
The key numerical ingredients involved the implementation of algorithms for solving the Kohn-Sham equation without the use of an explicit basis, i.e., a real space grid. We invented algorithms for a solution of the Kohn-Sham equation based on Chebyshev 'subspace filtering'. Our filtering algorithms dramatically enhanced our ability to explore systems with thousands of atoms, i.e., we examined silicon quantum dots with approximately 11,000 atoms (or 40,000 electrons). We applied this algorithm to a number of nanoscale systems to examine the role of quantum confinement on electronic and magnetic properties: (1) Doping of nanocrystals and nanowires, including both magnetic and non-magnetic dopants and the role of self-purification; (2) Optical excitations and electronic properties of nanocrystals; (3) Intrinsic defects in nanostructures; and (4) The emergence of ferromagnetism from atoms to crystals.
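The Chebyshev subspace-filtering idea mentioned above can be illustrated on a toy problem: apply a Chebyshev polynomial of the Hamiltonian to a block of vectors so that eigencomponents in the unwanted part of the spectrum are damped while the wanted (lowest) ones are amplified, then orthonormalize and perform a Rayleigh-Ritz step. A small random symmetric matrix stands in for the real-space Kohn-Sham Hamiltonian; all sizes and the filter degree are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, degree = 200, 6, 12
H = rng.standard_normal((n, n))
H = (H + H.T) / 2                       # symmetric "Hamiltonian"

evals = np.linalg.eigvalsh(H)
a, b = evals[k], evals[-1]              # interval holding unwanted eigenvalues
e, c = (b - a) / 2, (b + a) / 2         # affine map of [a, b] onto [-1, 1]

def chebyshev_filter(H, V, degree, e, c):
    """Apply T_degree((H - c*I)/e) to V: damps spectrum in [a, b],
    amplifies eigencomponents below a (mapped outside [-1, 1])."""
    Y = (H @ V - c * V) / e             # T_1
    Y_prev = V                          # T_0
    for _ in range(2, degree + 1):
        Y_new = 2 * (H @ Y - c * Y) / e - Y_prev
        Y_prev, Y = Y, Y_new
    return Y

V = rng.standard_normal((n, k))
for _ in range(15):                     # filter / orthonormalize / Ritz loop
    V = chebyshev_filter(H, V, degree, e, c)
    V, _ = np.linalg.qr(V)              # orthonormalize the block
    theta, S = np.linalg.eigh(V.T @ H @ V)   # Rayleigh-Ritz projection
    V = V @ S

print(np.round(theta, 4))               # approximate lowest k eigenvalues
```

The appeal for large grids is that the filter needs only matrix-vector products with H, never a factorization or an explicit basis, which is what makes it viable for real-space Hamiltonians with tens of thousands of atoms.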

  7. Beyond Design Basis Events | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Beyond Design Basis Events. Following the March 2011 Fukushima Daiichi nuclear plant accident in Japan, DOE embarked upon several initiatives to investigate the safety posture of its nuclear facilities relative to beyond design basis events (BDBEs). These initiatives included issuing Safety Bulletin 2011-01, Events Beyond Design Safety Basis Analysis, and conducting two DOE nuclear safety workshops. DOE also issued two reports documenting the

  8. Internal dosimetry technical basis manual

    SciTech Connect (OSTI)

    Not Available

    1990-12-20

    The internal dosimetry program at the Savannah River Site (SRS) consists of radiation protection programs and activities used to detect and evaluate intakes of radioactive material by radiation workers. Examples of such programs are: air monitoring; surface contamination monitoring; personal contamination surveys; radiobioassay; and dose assessment. The objectives of the internal dosimetry program are to demonstrate that the workplace is under control and that workers are not being exposed to radioactive material, and to detect and assess inadvertent intakes in the workplace. The Savannah River Site Internal Dosimetry Technical Basis Manual (TBM) is intended to provide a technical and philosophical discussion of the radiobioassay and dose assessment aspects of the internal dosimetry program. Detailed information on air, surface, and personal contamination surveillance programs is not given in this manual except for how these programs interface with routine and special bioassay programs.

  9. Catalyst by Design - Theoretical, Nanostructural, and Experimental...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Catalyst by Design - Theoretical, Nanostructural, and Experimental Oxidation Catalyst for Diesel Engine Emission Treatment ...

  10. BASIS Set Exchange (BSE): Chemistry Basis Sets from the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) Basis Set Library

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Feller, D; Schuchardt, Karen L.; Didier, Brett T.; Elsethagen, Todd; Sun, Lisong; Gurumoorthi, Vidhya; Chase, Jared; Li, Jun

The Basis Set Exchange (BSE) provides a web-based user interface for downloading and uploading Gaussian-type (GTO) basis sets, including effective core potentials (ECPs), from the EMSL Basis Set Library. It provides an improved user interface and capabilities over its predecessor, the EMSL Basis Set Order Form, for exploring the contents of the EMSL Basis Set Library. The popular Basis Set Order Form and underlying Basis Set Library were originally developed by Dr. David Feller and have been available from the EMSL webpages since 1994. BSE not only allows downloading of the more than 200 basis sets in various formats; it allows users to annotate existing sets and to upload new sets. (Specialized Interface)

  11. ON THE VERIFICATION AND VALIDATION OF GEOSPATIAL IMAGE ANALYSIS ALGORITHMS

    SciTech Connect (OSTI)

    Roberts, Randy S.; Trucano, Timothy G.; Pope, Paul A.; Aragon, Cecilia R.; Jiang , Ming; Wei, Thomas; Chilton, Lawrence; Bakel, A. J.

    2010-07-25

Verification and validation (V&V) of geospatial image analysis algorithms is a difficult task and is becoming increasingly important. While there are many types of image analysis algorithms, we focus on developing V&V methodologies for algorithms designed to provide textual descriptions of geospatial imagery. In this paper, we present a novel methodological basis for V&V that employs a domain-specific ontology, which provides a naming convention for a domain-bounded set of objects and a set of named relationships between these objects. We describe a validation process that proceeds through objectively comparing benchmark imagery, produced using the ontology, with algorithm results. As an example, we describe how the proposed V&V methodology would be applied to algorithms designed to provide textual descriptions of facilities.

  12. Tank characterization technical sampling basis

    SciTech Connect (OSTI)

    Brown, T.M.

    1998-04-28

    Tank Characterization Technical Sampling Basis (this document) is the first step of an in place working process to plan characterization activities in an optimal manner. This document will be used to develop the revision of the Waste Information Requirements Document (WIRD) (Winkelman et al. 1997) and ultimately, to create sampling schedules. The revised WIRD will define all Characterization Project activities over the course of subsequent fiscal years 1999 through 2002. This document establishes priorities for sampling and characterization activities conducted under the Tank Waste Remediation System (TWRS) Tank Waste Characterization Project. The Tank Waste Characterization Project is designed to provide all TWRS programs with information describing the physical, chemical, and radiological properties of the contents of waste storage tanks at the Hanford Site. These tanks contain radioactive waste generated from the production of nuclear weapons materials at the Hanford Site. The waste composition varies from tank to tank because of the large number of chemical processes that were used when producing nuclear weapons materials over the years and because the wastes were mixed during efforts to better use tank storage space. The Tank Waste Characterization Project mission is to provide information and waste sample material necessary for TWRS to define and maintain safe interim storage and to process waste fractions into stable forms for ultimate disposal. This document integrates the information needed to address safety issues, regulatory requirements, and retrieval, treatment, and immobilization requirements. Characterization sampling to support tank farm operational needs is also discussed.

  13. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2009-12-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  14. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider

    2008-03-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules—23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.

  15. Advanced Fuel Cycle Cost Basis

    SciTech Connect (OSTI)

    D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert

    2007-04-01

    This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules—24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.

  16. New Effective Multithreaded Matching Algorithms

    SciTech Connect (OSTI)

    Manne, Fredrik; Halappanavar, Mahantesh

    2014-05-19

    Matching is an important combinatorial problem with a number of applications in areas such as community detection, sparse linear algebra, and network alignment. Since computing optimal matchings can be very time consuming, several fast approximation algorithms, both sequential and parallel, have been suggested. Common to the algorithms giving the best solutions is that they tend to be sequential by nature, while algorithms more suitable for parallel computation give solutions of lower quality. We present a new simple 1/2-approximation algorithm for the weighted matching problem. This algorithm is both faster than any other suggested sequential 1/2-approximation algorithm on almost all inputs and also scales better than previous multithreaded algorithms. We further extend this to a general scalable multithreaded algorithm that computes matchings of weight comparable with the best sequential algorithms. The performance of the suggested algorithms is documented through extensive experiments on different multithreaded architectures.
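    The baseline sequential 1/2-approximation referred to here is, in its simplest form, the greedy algorithm: repeatedly take the heaviest edge whose endpoints are both still free. A minimal sketch of that baseline (illustrative only; this is not the paper's new algorithm, which is based on locally dominant edges and is parallelizable):

```python
def greedy_matching(edges):
    """Greedy 1/2-approximation for maximum weight matching.

    edges: iterable of (u, v, weight) tuples.
    Returns (matching, total_weight).
    """
    matched = set()
    matching = []
    total = 0.0
    # Consider edges from heaviest to lightest.
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
            total += w
    return matching, total

# On the path a-b-c-d, greedily taking (b, c) blocks the optimal matching
# {(a, b), (c, d)} of weight 4, but the result still has at least half its weight.
m, w = greedy_matching([("a", "b", 2), ("b", "c", 3), ("c", "d", 2)])
```

    Sorting makes this O(m log m); the weight guarantee follows because every discarded edge shares an endpoint with a kept edge of no smaller weight.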

  17. Theoretical Division Current Job Openings

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical Division Job Openings Explore the multiple dimensions of a career at Los Alamos Lab: work with the best minds on the planet in an inclusive environment that is rich in intellectual vitality and opportunities for growth. Click on the Job Number to be directed to the description/application page. Postdoc Positions: IRC49276 Theoretical and Computational Fluid Dynamics; IRC49630 ACME Global Climate Model; IRC49351 Mathematical/Computational Modeling

  18. Robotic Follow Algorithm

    Energy Science and Technology Software Center (OSTI)

    2005-03-30

    The Robotic Follow Algorithm enables any robotic vehicle to follow a moving target while reactively choosing a route around nearby obstacles. The robotic follow behavior can be used with different camera systems and with thermal or visual tracking, as well as with other tracking methods such as radio frequency tags.

  19. On constructing optimistic simulation algorithms for the discrete event system specification

    SciTech Connect (OSTI)

    Nutaro, James J

    2008-01-01

    This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
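    A DEVS atomic model is specified by a state set, a time-advance function, internal and external transition functions, and an output function. As a toy illustration of those four functions being driven by an event loop, here is a sequential (conservative, not Time Warp) sketch with all names hypothetical:

```python
class Processor:
    """A DEVS-style atomic model: jobs queue up, are served in order, then emitted."""
    SERVICE_TIME = 5.0

    def __init__(self):
        self.queue = []  # state: waiting jobs; the head is in service

    def ta(self):
        """Time advance: delay until the next internal event."""
        return self.SERVICE_TIME if self.queue else float("inf")

    def delta_ext(self, job):
        """External transition: a job arrives on the input port."""
        self.queue.append(job)

    def output(self):
        """Output function, invoked just before an internal transition."""
        return self.queue[0]

    def delta_int(self):
        """Internal transition: the finished job leaves."""
        self.queue.pop(0)


def simulate(model, arrivals):
    """Drive one atomic model; arrivals is a time-sorted list of (time, job) pairs."""
    outputs, i = [], 0
    t_next = float("inf")  # time of the next scheduled internal event
    while True:
        t_arr = arrivals[i][0] if i < len(arrivals) else float("inf")
        if t_next == float("inf") and t_arr == float("inf"):
            break  # no more events
        if t_arr <= t_next:
            was_idle = model.ta() == float("inf")
            model.delta_ext(arrivals[i][1])
            if was_idle:
                t_next = t_arr + model.ta()  # schedule service of the new head job
            i += 1
        else:
            outputs.append((t_next, model.output()))
            model.delta_int()
            t_next = t_next + model.ta()  # inf again once the queue empties
    return outputs

out = simulate(Processor(), [(0.0, "J1"), (2.0, "J2")])
```

    A Time Warp simulator replaces this single conservative loop with logical processes that execute optimistically and roll back on causality violations; the transformation of the transition and output functions into one event-processing procedure is exactly what the article formalizes.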

  20. Large scale tracking algorithms.

    SciTech Connect (OSTI)

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  1. Lightning Talks 2015: Theoretical Division

    SciTech Connect (OSTI)

    Shlachter, Jack S.

    2015-11-25

    This document is a compilation of slides from a number of student presentations given to LANL Theoretical Division members. The subjects cover the range of activities of the Division, including plasma physics, environmental issues, materials research, bacterial resistance to antibiotics, and computational methods.

  2. Beyond Design Basis Event Pilot Evaluations

    Broader source: Energy.gov [DOE]

    This document provides Results and Recommendations for Improvements to Enhance Nuclear Safety at Department of Energy Nuclear Facilities based upon Beyond Design Basis Event Pilot Evaluations

  3. Property:ExplorationBasis | Open Energy Information

    Open Energy Info (EERE)

    Text Description Exploration Basis Why was exploration work conducted in this area (e.g., USGS report of a geothermal resource, hot springs with geothermometry indicating...

  4. design basis threat | National Nuclear Security Administration

    National Nuclear Security Administration (NNSA)

  5. Structural basis for Tetrahymena telomerase processivity factor...

    Office of Scientific and Technical Information (OSTI)

    Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 binding to single-stranded telomeric-repeat DNA. Authors: ...

  6. CCM Continuity Constraint Method: A finite-element computational fluid dynamics algorithm for incompressible Navier-Stokes fluid flows

    SciTech Connect (OSTI)

    Williams, P.T.

    1993-09-01

    As the field of computational fluid dynamics (CFD) continues to mature, algorithms are required to exploit the most recent advances in approximation theory, numerical mathematics, computing architectures, and hardware. Meeting this requirement is particularly challenging in incompressible fluid mechanics, where primitive-variable CFD formulations that are robust, while also accurate and efficient in three dimensions, remain an elusive goal. This dissertation asserts that one key to accomplishing this goal is recognition of the dual role assumed by the pressure, i.e., a mechanism for instantaneously enforcing conservation of mass and a force in the mechanical balance law for conservation of momentum. Proving this assertion has motivated the development of a new, primitive-variable, incompressible, CFD algorithm called the Continuity Constraint Method (CCM). The theoretical basis for the CCM consists of a finite-element spatial semi-discretization of a Galerkin weak statement, equal-order interpolation for all state-variables, a θ-implicit time-integration scheme, and a quasi-Newton iterative procedure extended by a Taylor Weak Statement (TWS) formulation for dispersion error control. Original contributions to algorithmic theory include: (a) formulation of the unsteady evolution of the divergence error, (b) investigation of the role of non-smoothness in the discretized continuity-constraint function, (c) development of a uniformly H^1 Galerkin weak statement for the Reynolds-averaged Navier-Stokes pressure Poisson equation, (d) derivation of physically and numerically well-posed boundary conditions, and (e) investigation of sparse data structures and iterative methods for solving the matrix algebra statements generated by the algorithm.

  7. A new augmentation based algorithm for extracting maximal chordal subgraphs

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bhowmick, Sanjukta; Chen, Tzu-Yi; Halappanavar, Mahantesh

    2015-02-01

    A graph is chordal if every cycle of length greater than three contains an edge between non-adjacent vertices. Chordal graphs are of interest both theoretically, since they admit polynomial time solutions to a range of NP-hard graph problems, and practically, since they arise in many applications including sparse linear algebra, computer vision, and computational biology. A maximal chordal subgraph is a chordal subgraph that is not a proper subgraph of any other chordal subgraph. Existing algorithms for computing maximal chordal subgraphs depend on dynamically ordering the vertices, which is an inherently sequential process and therefore limits the algorithms’ parallelizability. In this paper we explore techniques to develop a scalable parallel algorithm for extracting a maximal chordal subgraph. We demonstrate that an earlier attempt at developing a parallel algorithm may induce a non-optimal vertex ordering and is therefore not guaranteed to terminate with a maximal chordal subgraph. We then give a new algorithm that first computes and then repeatedly augments a spanning chordal subgraph. After proving that the algorithm terminates with a maximal chordal subgraph, we then demonstrate that this algorithm is more amenable to parallelization and that the parallel version also terminates with a maximal chordal subgraph. That said, the complexity of the new algorithm is higher than that of the previous parallel algorithm, although the earlier algorithm computes a chordal subgraph which is not guaranteed to be maximal. We experimented with our augmentation-based algorithm on both synthetic and real-world graphs. We provide scalability results and also explore the effect of different choices for the initial spanning chordal subgraph on both the running time and on the number of edges in the maximal chordal subgraph.
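    The augmentation idea can be illustrated with a naive sequential sketch: add edges one at a time and keep an edge only if the subgraph stays chordal (a tree prefix is always accepted, since trees are chordal). Chordality is tested here with maximum cardinality search plus a perfect-elimination-ordering check. This is an illustration of the concept, not the paper's parallel algorithm:

```python
def mcs_order(adj):
    """Maximum cardinality search: visit the vertex with most visited neighbours."""
    weight = {v: 0 for v in adj}
    order, unvisited = [], set(adj)
    while unvisited:
        v = max(sorted(unvisited), key=lambda u: weight[u])  # deterministic tie-break
        order.append(v)
        unvisited.remove(v)
        for u in adj[v]:
            if u in unvisited:
                weight[u] += 1
    return order

def is_chordal(adj):
    """The reverse of an MCS order is a perfect elimination ordering iff chordal."""
    order = mcs_order(adj)
    pos = {v: i for i, v in enumerate(order)}
    for v in order:
        earlier = [u for u in adj[v] if pos[u] < pos[v]]
        if earlier:
            # All earlier neighbours of v must be adjacent to the latest of them.
            u = max(earlier, key=lambda x: pos[x])
            if any(w != u and w not in adj[u] for w in earlier):
                return False
    return True

def maximal_chordal_subgraph(vertices, edges):
    """Grow a chordal subgraph edge by edge, keeping chordality-preserving edges."""
    adj = {v: set() for v in vertices}
    kept = []
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
        if is_chordal(adj):
            kept.append((u, v))
        else:
            adj[u].discard(v); adj[v].discard(u)
    return kept

# On the 4-cycle a-b-c-d-a, the closing edge (d, a) would create a chordless
# 4-cycle, so the maximal chordal subgraph found is the path a-b-c-d.
kept = maximal_chordal_subgraph("abcd", [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
```

    Each acceptance test here is a full O(n + m) pass, which is exactly the kind of per-edge sequential dependence the paper's augmentation strategy is designed to batch and parallelize.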

  8. Dynamical properties of non-ideal plasma on the basis of effective potentials

    SciTech Connect (OSTI)

    Ramazanov, T. S.; Kodanova, S. K.; Moldabekov, Zh. A.; Issanova, M. K.

    2013-11-15

    In this work, stopping power has been calculated on the basis of the Coulomb logarithm using the effective potentials. Calculations of the Coulomb logarithm and stopping power for different interaction potentials and degrees of ionization are compared. The comparison with the data of other theoretical and experimental works was carried out.

  9. Theoretical studies of combustion dynamics

    SciTech Connect (OSTI)

    Bowman, J.M.

    1993-12-01

    The basic objectives of this research program are to develop and apply theoretical techniques to fundamental dynamical processes of importance in gas-phase combustion. There are two major areas currently supported by this grant. One is reactive scattering of diatom-diatom systems, and the other is the dynamics of complex formation and decay based on L^2 methods. In all of these studies, the authors focus on systems that are of interest experimentally, and for which potential energy surfaces based, at least in part, on ab initio calculations are available.

  10. Safety Basis Information System | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Request: Click on the link above to access the form to request access to the Safety Basis web interface. If you need assistance logging in, please contact AU User Support. Contact Nimi Rao...

  11. SRS FTF Section 3116 Basis for Determination

    Broader source: Energy.gov [DOE]

    Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site. In accordance with NDAA Section 3116, certain waste from reprocessing of spent nuclear fuel is not high...

  12. Basis for UCNI | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    UCNI Basis for UCNI What documents contain the legal and policy foundations for the UCNI program? Section 148 of the Atomic Energy Act of 1954, as amended (42 U.S.C. 2011 et seq.), is the statutory basis for the UCNI program. 10 CFR Part 1017, Identification and Protection of Unclassified Controlled Nuclear Information specifies many detailed policies and requirements concerning the UCNI program. DOE O 471.1B, Identification and Protection of Unclassified Controlled Nuclear Information,

  13. Theoretical perspectives on strange physics

    SciTech Connect (OSTI)

    Ellis, J.

    1983-04-01

    Kaons are heavy enough to have an interesting range of decay modes available to them, and light enough to be produced in sufficient numbers to explore rare modes with satisfying statistics. Kaons and their decays have provided at least two major breakthroughs in our knowledge of fundamental physics. They have revealed to us CP violation, and their lack of flavor-changing neutral interactions warned us to expect charm. In addition, K/sup 0/-anti K/sup 0/ mixing has provided us with one of our most elegant and sensitive laboratories for testing quantum mechanics. There is every reason to expect that future generations of kaon experiments with intense sources would add further to our knowledge of fundamental physics. This talk attempts to set future kaon experiments in a general theoretical context, and indicate how they may bear upon fundamental theoretical issues. A survey of different experiments which could be done with an Intense Medium Energy Source of Strangeness, including rare K decays, probes of the nature of CP violation, ..mu.. decays, hyperon decays and neutrino physics is given. (WHK)

  14. Critical review of theoretical models for anomalous effects in deuterated metals

    SciTech Connect (OSTI)

    Chechin, V.A.; Tsarev, V.A.; Rabinowitz, M.; Kim, Y.E.

    1994-03-01

    The authors briefly summarize the reported anomalous effects in deuterated metals at ambient temperature commonly known as "cold fusion" (CF) with an emphasis on the latest experiments, as well as the theoretical basis for the opposition to interpreting them as cold fusion. Then they critically examine more than 25 theoretical models for CF, including unusual nuclear and exotic chemical hypotheses. They conclude that they do not explain the data.

  15. Nanoplasmonics simulations at the basis set limit through completeness-optimized, local numerical basis sets

    SciTech Connect (OSTI)

    Rossi, Tuomas P.; Sakko, Arto; Puska, Martti J.; Lehtola, Susi; Nieminen, Risto M.

    2015-03-07

    We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.

  16. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    SciTech Connect (OSTI)

    Gu, Renliang; Dogandžić, Aleksandar

    2015-03-31

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov’s proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
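    The alternation described above pairs a proximal-gradient step with an active-set step. As a toy illustration of the proximal-gradient half only (plain projected gradient rather than the Nesterov-accelerated step used in the paper, and a nonnegativity constraint standing in for the full penalized likelihood), all details here are simplified assumptions:

```python
def prox_nonneg(x):
    """Prox of the nonnegative-orthant indicator: projection (clamp at zero)."""
    return [max(0.0, xi) for xi in x]

def grad(A, b, x):
    """Gradient of 0.5 * ||A x - b||^2, i.e. A^T (A x - b)."""
    r = [sum(A[i][j] * x[j] for j in range(len(x))) - b[i] for i in range(len(b))]
    return [sum(A[i][j] * r[i] for i in range(len(b))) for j in range(len(x))]

def projected_gradient(A, b, steps=500, lr=0.1):
    """Minimize 0.5 * ||A x - b||^2 subject to x >= 0 by prox-gradient iteration."""
    x = [0.0] * len(A[0])
    for _ in range(steps):
        g = grad(A, b, x)
        x = prox_nonneg([xi - lr * gi for xi, gi in zip(x, g)])
    return x

# With A = I and b = (1, -1), the unconstrained minimizer is b itself;
# the nonnegativity constraint clamps the second coordinate to zero.
x = projected_gradient([[1.0, 0.0], [0.0, 1.0]], [1.0, -1.0])
```

    The paper's actual objective adds a wavelet-domain sparsity penalty and spline-coefficient constraints on top of this pattern, with the active-set step handling the incident-spectrum parameters.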

  17. Cubit Adaptive Meshing Algorithm Library

    Energy Science and Technology Software Center (OSTI)

    2004-09-01

    CAMAL (Cubit adaptive meshing algorithm library) is a software component library for mesh generation. CAMAL 2.0 includes components for triangle, quad and tetrahedral meshing. A simple Application Programmers Interface (API) takes a discrete boundary definition and CAMAL computes a quality interior unstructured grid. The triangle and quad algorithms may also import a geometric definition of a surface on which to define the grid. CAMAL’s triangle meshing uses a 3D space advancing front method, the quad meshing algorithm is based upon Sandia’s patented paving algorithm and the tetrahedral meshing algorithm employs the GHS3D-Tetmesh component developed by INRIA, France.

  18. Corrigendum to "Theoretical investigation of microstructure evolution...

    Office of Scientific and Technical Information (OSTI)

    Accepted Manuscript: Corrigendum to "Theoretical investigation of microstructure evolution and deformation of zirconium under neutron irradiation" J. Nucl. ...

  19. Arctic Mixed-Phase Cloud Properties from AERI Lidar Observations: Algorithm and Results from SHEBA

    SciTech Connect (OSTI)

    Turner, David D.

    2005-04-01

    A new approach to retrieve microphysical properties from mixed-phase Arctic clouds is presented. This mixed-phase cloud property retrieval algorithm (MIXCRA) retrieves cloud optical depth, ice fraction, and the effective radius of the water and ice particles from ground-based, high-resolution infrared radiance and lidar cloud boundary observations. The theoretical basis for this technique is that the absorption coefficient of ice is greater than that of liquid water from 10 to 13 µm, whereas liquid water is more absorbing than ice from 16 to 25 µm. MIXCRA retrievals are only valid for optically thin (τ_visible < 6) single-layer clouds when the precipitable water vapor is less than 1 cm. MIXCRA was applied to the Atmospheric Emitted Radiance Interferometer (AERI) data that were collected during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment from November 1997 to May 1998, where 63% of all of the cloudy scenes above the SHEBA site met this specification. The retrieval determined that approximately 48% of these clouds were mixed phase and that a significant number of clouds (during all 7 months) contained liquid water, even for cloud temperatures as low as 240 K. The retrieved distributions of effective radii for water and ice particles in single-phase clouds are shown to be different than the effective radii in mixed-phase clouds.

  20. Theoretical Nuclear Physics - Research - Cyclotron Institute

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical Nuclear Physics By addressing this elastic scattering indirect technique, we hope that more accurate measurements of elastic scattering data will provide very important astrophysical information. Progress toward understanding the structure and behavior of strongly interacting many-body systems requires detailed theoretical study. The theoretical physics program concentrates on the development of fundamental and phenomenological models of nuclear behavior. In some systems, the

  1. Structural Basis for Activation of Cholera Toxin

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Structural Basis for Activation of Cholera Toxin (Wednesday, 30 November 2005). Cholera is a serious disease that claims thousands of victims each year in third-world, war-torn, and disaster-stricken nations. The culprit is the bacterium Vibrio cholerae, which can be ingested through contaminated food or water and colonizes the mucous membrane of the human small intestine. There, it secretes cholera toxin (CT), a protein whose A1 subunit

  2. Research in Theoretical Particle Physics

    SciTech Connect (OSTI)

    Feldman, Hume A; Marfatia, Danny

    2014-09-24

    This document is the final report on activity supported under DOE Grant Number DE-FG02-13ER42024. The report covers the period July 15, 2013 – March 31, 2014. Faculty supported by the grant during the period were Danny Marfatia (1.0 FTE) and Hume Feldman (1% FTE). The grant partly supported University of Hawaii students, David Yaylali and Keita Fukushima, who are supervised by Jason Kumar. Both students are expected to graduate with Ph.D. degrees in 2014. Yaylali will be joining the University of Arizona theory group in Fall 2014 with a 3-year postdoctoral appointment under Keith Dienes. The group’s research covered topics subsumed under the Energy Frontier, the Intensity Frontier, and the Cosmic Frontier. Many theoretical results related to the Standard Model and models of new physics were published during the reporting period. The report contains brief project descriptions in Section 1. Sections 2 and 3 list published and submitted work, respectively. Sections 4 and 5 summarize group activity including conferences, workshops and professional presentations.

  3. TWRS authorization basis configuration control summary

    SciTech Connect (OSTI)

    Mendoza, D.P.

    1997-12-26

    This document was developed to define the Authorization Basis management functional requirements for configuration control, to evaluate the management control systems currently in place, and identify any additional controls that may be required until the TWRS [Tank Waste Remediation System] Configuration Management system is fully in place.

  4. CRAD, Facility Safety- Nuclear Facility Safety Basis

    Office of Energy Efficiency and Renewable Energy (EERE)

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) that can be used for assessment of a contractor's Nuclear Facility Safety Basis.

  5. Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2014-12-19

    This Standard describes a framework and the criteria to be used for approval of (1) safety basis documents, as required by 10 Code of Federal Regulation (C.F.R.) 830, Nuclear Safety Management, and (2) safety design basis documents, as required by Department of Energy (DOE) Standard (STD)-1189-2008, Integration of Safety into the Design Process.

  6. Optimized Algorithms Boost Combustion Research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Optimized Algorithms Boost Combustion Research Methane Flame Simulations Run 6x Faster on NERSC's Hopper Supercomputer November 25, 2014 Contact: Kathy Kincade, +1 510 495 2124, kkincade@lbl.gov Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from the Computational Research Division (CRD) at Lawrence Berkeley National

  7. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect (OSTI)

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods such as first order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to further reduction in computational time, since the high order grids necessary for accurately estimating the near zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15-20), alleviating a well known limitation of the traditional approach. The prospect of larger scale applicability and the simplicity of implementation makes such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
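    Non-intrusive spectral projection in its simplest one-dimensional form: expand f(X), with X standard normal, in probabilists' Hermite polynomials and compute the coefficients c_k = E[f(X) He_k(X)] / k! by Gauss-Hermite quadrature. A minimal sketch with a fixed 3-point rule and no adaptivity (so it illustrates NISP generally, not the FANISP algorithm itself):

```python
import math

# 3-point Gauss-Hermite rule for the standard normal weight (probabilists'
# convention): nodes 0, +/-sqrt(3); exact for polynomials up to degree 5.
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def hermite(k, x):
    """Probabilists' Hermite He_k via the recurrence He_{n+1} = x He_n - n He_{n-1}."""
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def pce_coefficients(f, degree):
    """Non-intrusive spectral projection: c_k = E[f(X) He_k(X)] / k! by quadrature."""
    return [
        sum(w * f(x) * hermite(k, x) for x, w in zip(NODES, WEIGHTS)) / math.factorial(k)
        for k in range(degree + 1)
    ]

def pce_eval(coeffs, x):
    """Evaluate the PCE surrogate at a point."""
    return sum(c * hermite(k, x) for k, c in enumerate(coeffs))

# f(x) = x^2 has the exact expansion He_0 + He_2, since x^2 = 1 + (x^2 - 1).
coeffs = pce_coefficients(lambda x: x * x, 2)
```

    The adaptive methods in the paper attack the multi-dimensional version of exactly this projection, where tensor quadrature grids and the full PC basis grow combinatorially with the number of input parameters.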

  8. Radioactive Waste Management Basis, April 2006

    SciTech Connect (OSTI)

    Perkins, B K

    2011-08-31

    This Radioactive Waste Management Basis (RWMB) documents radioactive waste management practices adopted at Lawrence Livermore National Laboratory (LLNL) pursuant to Department of Energy (DOE) Order 435.1, Radioactive Waste Management. The purpose of this Radioactive Waste Management Basis is to describe the systematic approach for planning, executing, and evaluating the management of radioactive waste at LLNL. The implementation of this document will ensure that waste management activities at LLNL are conducted in compliance with the requirements of DOE Order 435.1, Radioactive Waste Management, and the Implementation Guide for DOE Manual 435.1-1, Radioactive Waste Management Manual. Technical justification is provided where methods for meeting the requirements of DOE Order 435.1 deviate from the DOE Manual 435.1-1 and Implementation Guide.

  9. TECHNICAL BASIS DOCUMENT FOR NATURAL EVENT HAZARDS

    SciTech Connect (OSTI)

    KRIPPS, L.J.

    2006-07-31

    This technical basis document was developed to support the documented safety analysis (DSA) and describes the risk binning process and the technical basis for assigning risk bins for natural event hazard (NEH)-initiated accidents. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous conditions based on an evaluation of the frequency and consequence. Note that the risk binning process is not applied to facility workers, because all facility worker hazardous conditions are considered for safety-significant SSCs and/or TSR-level controls.

  10. Structural Basis for Activation of Cholera Toxin

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Structural Basis for Activation of Cholera Toxin Cholera is a serious disease that claims thousands of victims each year in third-world, war-torn, and disaster-stricken nations. The culprit is the bacterium Vibrio cholerae, which can be ingested through contaminated food or water and colonizes the mucous membrane of the human small intestine. There, it secretes cholera toxin (CT), a protein whose A1 subunit (CTA1) triggers a series of events culminating in the massive efflux of

  13. Design Basis Threat | National Nuclear Security Administration

    National Nuclear Security Administration (NNSA)

    Design Basis Threat NNSA has taken aggressive action to improve the security of its nuclear weapons material (often referred to as special nuclear material, or SNM) and nuclear weapons in its custody.

  15. 2005 American Conference on Theoretical Chemistry

    SciTech Connect (OSTI)

    Carter, Emily A

    2006-11-19

    The materials uploaded are meant to serve as final report on the funds provided by DOE-BES to help sponsor the 2005 American Conference on Theoretical Chemistry.

  16. COLLOQUIUM: Theoretical and Experimental Aspects of Controlled...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    5:30pm MBG Auditorium COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum Dynamics Professor Herschel Rabitz Princeton University Abstract: PDF icon...

  17. Theoretical and experimental studies of electrified interfaces...

    Office of Scientific and Technical Information (OSTI)

    This report presents theoretical results from models that explicitly include the molecular nature of the electrical double layer and predict critical electrochemical quantities ...

  18. GPU Accelerated Event Detection Algorithm

    Energy Science and Technology Software Center (OSTI)

    2011-05-25

    Smart grid applications require new algorithmic approaches as well as parallel formulations. One of the critical components is the prediction of changes and detection of anomalies within the power grid. The state-of-the-art algorithms are not suited to handle the demands of streaming data analysis: (i) the need for event detection algorithms that can scale with the size of the data; (ii) the need for algorithms that can not only handle the multi-dimensional nature of the data, but also model both spatial and temporal dependencies in the data, which, for the most part, are highly nonlinear; (iii) the need for algorithms that can operate in an online fashion with streaming data. The GAEDA code is a new online anomaly detection technique that takes into account the spatial, temporal, and multi-dimensional aspects of the data set. The basic idea behind the proposed approach is (a) to convert a multi-dimensional sequence into a univariate time series that captures the changes between successive windows extracted from the original sequence using singular value decomposition (SVD), and then (b) to apply known anomaly detection techniques for univariate time series. A key challenge for the proposed approach is to make the algorithm scalable to huge datasets by adopting techniques from perturbation theory and incremental SVD analysis. We use recent advances in tensor decomposition techniques, which reduce the computational complexity of monitoring the change between successive windows and detecting anomalies in the manner described above. We therefore propose to develop parallel solutions on many-core systems such as GPUs, because these algorithms involve many numerical operations and are highly data-parallelizable.
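
The windowed-SVD reduction described above can be sketched in a few lines. This is a hypothetical illustration, not the GAEDA implementation: the window width, the use of the leading singular value alone, and the z-score detector are all assumptions, and numpy is assumed to be available.

```python
import numpy as np

def windows_to_series(data, width):
    """Slide a window over (time x sensors) data; emit the leading singular value per window."""
    series = []
    for start in range(data.shape[0] - width + 1):
        window = data[start:start + width]
        # Reduce the multi-dimensional window to a single scalar via SVD.
        series.append(np.linalg.svd(window, compute_uv=False)[0])
    return np.array(series)

def flag_anomalies(series, threshold=3.0):
    """Mark points more than `threshold` standard deviations from the mean."""
    mu, sigma = series.mean(), series.std()
    return np.abs(series - mu) > threshold * sigma
```

Any univariate detector could replace the z-score test in the second step; the point is only that the SVD step turns a multi-sensor stream into one scalar series.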

  19. Review and Approval of Nuclear Facility Safety Basis and Safety...

    Office of Environmental Management (EM)

    DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS ... Neither a reviewer nor the preparer has veto power over ultimate resolution or ...

  20. ORISE: The Medical Basis for Radiation-Accident Preparedness...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The Medical Basis for Radiation-Accident Preparedness: Medical Management Proceedings of the Fifth International REACTS Symposium on the Medical Basis for Radiation-Accident ...

  1. Structural Basis for the Interaction between Pyk2-FAT Domain...

    Office of Scientific and Technical Information (OSTI)

    Structural Basis for the Interaction between Pyk2-FAT Domain and Leupaxin LD Repeats Citation Details In-Document Search Title: Structural Basis for the Interaction between ...

  2. Nuclear Safety Basis Program Review Overview and Management Oversight...

    Office of Environmental Management (EM)

    Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review Plan Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review ...

  3. Structural basis for substrate specificity in the Escherichia...

    Office of Scientific and Technical Information (OSTI)

    Structural basis for substrate specificity in the Escherichia coli maltose transport system Citation Details In-Document Search Title: Structural basis for substrate specificity in ...

  4. Los Alamos National Laboratory fission basis (Conference) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Los Alamos National Laboratory fission basis Citation Details In-Document Search Title: Los Alamos National Laboratory fission basis You are accessing a document from the ...

  5. Technical Cost Modeling - Life Cycle Analysis Basis for Program...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    More Documents & Publications Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Polymer ...

  6. Technical Cost Modeling - Life Cycle Analysis Basis for Program...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    More Documents & Publications Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Life Cycle ...

  7. Technical Cost Modeling - Life Cycle Analysis Basis for Program...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    More Documents & Publications Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus ...

  8. Heavy quarkonium in a holographic basis (Journal Article) | DOE...

    Office of Scientific and Technical Information (OSTI)

    Heavy quarkonium in a holographic basis Title: Heavy quarkonium in a holographic basis Authors: Li, Yang ...

  9. A molecular basis for advanced materials in water treatment....

    Office of Scientific and Technical Information (OSTI)

    A molecular basis for advanced materials in water treatment. Citation Details In-Document Search Title: A molecular basis for advanced materials in water treatment. Authors: Rempe, ...

  10. Theoretical Fusion Research | Princeton Plasma Physics Lab

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theory & Computational Department Weekly Highlights Weekly Seminars Basic Plasma Science Plasma Astrophysics Other Physics and Engineering Research PPPL Technical Reports NSTX-U Education Organization Contact Us Overview Experimental Fusion Research Theoretical Fusion Research About

  11. Real-time algorithm for robust coincidence search

    SciTech Connect (OSTI)

    Petrovic, T.; Vencelj, M.; Lipoglavsek, M.; Gajevic, J.; Pelicon, P.

    2012-10-20

    In in-beam {gamma}-ray spectroscopy experiments, we often look for coincident detection events. Among every N events detected, a naive coincidence search has complexity O(N{sup 2}). When we limit the width of the coincidence search window, the complexity can be reduced to O(N), permitting implementation of the algorithm in real-time measurements carried out indefinitely. We have built an algorithm to find simultaneous events between two detection channels. The algorithm was tested in an experiment in which coincidences between X and {gamma} rays detected in two HPGe detectors were observed in the decay of {sup 61}Cu. Functioning of the algorithm was validated by comparing the calculated experimental branching ratio for EC decay with theoretical calculations for three selected {gamma}-ray energies in {sup 61}Cu decay. Our research opened a question about the validity of the adopted value of the total angular momentum of the 656 keV state (J{sup {pi}} = 1/2{sup -}) in {sup 61}Ni.
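
A windowed O(N) coincidence search of the kind described above amounts to a merge-style scan over two sorted timestamp streams. The sketch below is a minimal illustration under that assumption, not the authors' real-time implementation; the function and parameter names are invented.

```python
def find_coincidences(t1, t2, window):
    """Return index pairs (i, j) with |t1[i] - t2[j]| <= window.

    Both inputs must be sorted timestamps. The scan below never rewinds
    past previously discarded channel-2 events, so the cost stays O(N)
    when the window is narrow compared to the typical event spacing.
    """
    pairs = []
    j_start = 0
    for i, t in enumerate(t1):
        # Advance past channel-2 events too early to ever match again.
        while j_start < len(t2) and t2[j_start] < t - window:
            j_start += 1
        # Collect every channel-2 event inside the window around t.
        j = j_start
        while j < len(t2) and t2[j] <= t + window:
            pairs.append((i, j))
            j += 1
    return pairs
```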

  12. Technical Basis for PNNL Beryllium Inventory

    SciTech Connect (OSTI)

    Johnson, Michelle Lynn

    2014-07-09

    The Department of Energy (DOE) issued Title 10 of the Code of Federal Regulations Part 850, “Chronic Beryllium Disease Prevention Program” (the Beryllium Rule) in 1999 and required full compliance by no later than January 7, 2002. The Beryllium Rule requires the development of a baseline beryllium inventory of the locations of beryllium operations and other locations of potential beryllium contamination at DOE facilities. The baseline beryllium inventory is also required to identify workers exposed or potentially exposed to beryllium at those locations. Prior to DOE issuing 10 CFR 850, Pacific Northwest National Laboratory (PNNL) had documented the beryllium characterization and worker exposure potential for multiple facilities in compliance with DOE’s 1997 Notice 440.1, “Interim Chronic Beryllium Disease.” After DOE’s issuance of 10 CFR 850, PNNL developed an implementation plan to be compliant by 2002. In 2014, an internal self-assessment (ITS #E-00748) of PNNL’s Chronic Beryllium Disease Prevention Program (CBDPP) identified several deficiencies. One deficiency is that the technical basis for establishing the baseline beryllium inventory when the Beryllium Rule was implemented was either not documented or not retrievable. In addition, the beryllium inventory itself had not been adequately documented and maintained since PNNL established its own CBDPP, separate from the Hanford Site’s program. This document reconstructs PNNL’s baseline beryllium inventory as it would have existed when it achieved compliance with the Beryllium Rule in 2001 and provides the technical basis for the baseline beryllium inventory.

  13. Jet measurements at D0 using a KT algorithm

    SciTech Connect (OSTI)

    V.Daniel Elvira

    2002-10-03

    D0 has implemented and calibrated a k{perpendicular} jet algorithm for the first time in a p{bar p} collider. We present two results based on 1992-1996 data which were recently published: the subjet multiplicity in quark and gluon jets and the central inclusive jet cross section. The measured ratio between subjet multiplicities in gluon and quark jets is consistent with theoretical predictions and previous experimental values. NLO pQCD predictions of the k{perpendicular} inclusive jet cross section agree with the D0 measurement, although only marginally in the low p{sub T} range. We also present a preliminary measurement of thrust cross sections, which indicates the need to include terms higher than {alpha}{sub s}{sup 3} and resummation in the theoretical calculations.

  14. Adaptive protection algorithm and system

    DOE Patents [OSTI]

    Hedrick, Paul (Pittsburgh, PA); Toms, Helen L. (Irwin, PA); Miller, Roger M. (Mars, PA)

    2009-04-28

    An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
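
As a toy illustration of the idea (not the patented algorithm), one can model a radial feeder as a tree of breakers, rank each breaker by its distance from the source, and scale its trip set point from the load it feeds downstream. The tree representation, the 1.25 margin, and all names below are assumptions.

```python
def downstream_load(tree, loads, node):
    """Sum the load at `node` plus everything fed below it in the tree."""
    total = loads.get(node, 0.0)
    for child in tree.get(node, []):
        total += downstream_load(tree, loads, child)
    return total

def trip_set_points(tree, loads, source, margin=1.25):
    """Assign each breaker a (rank, trip) pair by walking out from the source."""
    settings = {}
    stack = [(source, 0)]
    while stack:
        node, rank = stack.pop()
        # Rank = depth from the source; trip point = margin over served load.
        settings[node] = (rank, margin * downstream_load(tree, loads, node))
        for child in tree.get(node, []):
            stack.append((child, rank + 1))
    return settings
```

The point of the ranking is that a breaker deep in the feeder trips on a much smaller overcurrent than the main breaker, so faults are cleared close to where they occur.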

  15. Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents

    Energy Savers [EERE]

    SENSITIVE DOE-STD-1104-2009 May 2009 Superseding DOE-STD-1104-96 DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS DOCUMENTS U.S. Department of Energy AREA SAFT Washington, DC 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. DOE-STD-1104-2009 ii Available on the Department of Energy Technical Standards web page at http://www.hss.energy.gov/nuclearsafety/ns/techstds/ DOE-STD-1104-2009 iii CONTENTS FOREWORD

  16. algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and its Use in Coupling Codes for Multiphysics Simulations Rod Schmidt, Noel Belcourt, Russell Hooper, and Roger Pawlowski Sandia National Laboratories P.O. Box 5800...

  17. algorithms

    Office of Scientific and Technical Information (OSTI)

    1 are estimated using the conventional MCMC (C-MCMC) with 60,000 model executions (red-solid lines), the linear, quadratic, and cubic surrogate systems with 9226, 4375, 3765...

  18. algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    of the vehicle. Here the two domains are the fluid exterior to the vehicle (compressible, turbulent fluid flow) and the interior of the vehicle (structural dynamics)...

  19. Theoretical Plasma Physicist | Princeton Plasma Physics Lab

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical Plasma Physicist Department: Theory Supervisor(s): Amitava Bhattacharjee Staff: RM 3 Requisition Number: 16000351 PPPL/Theory Department has an opening at the rank of Research Physicist in theoretical plasma physics. Research areas of interest include macroscopic equilibrium and stability, energetic particles, turbulence and transport, and waves in fusion plasmas. The Department is looking to recruit an exceptionally strong theorist with leadership potential. Minimum qualifications

  20. A radial basis function Galerkin method for inhomogeneous nonlocal diffusion

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Lehoucq, Richard B.; Rowe, Stephen T.

    2016-02-01

    We introduce a discretization for a nonlocal diffusion problem using a localized basis of radial basis functions. The stiffness matrix entries are assembled by a special quadrature routine unique to the localized basis. Combining the quadrature method with the localized basis produces a well-conditioned, sparse, symmetric positive definite stiffness matrix. We demonstrate that both the continuum and discrete problems are well-posed and present numerical results for the convergence behavior of the radial basis function method. As a result, we explore approximating the solution to anisotropic differential equations by solving anisotropic nonlocal integral equations using the radial basis function method.
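
For readers unfamiliar with radial basis functions, the sketch below shows the simplest possible RBF approximation in one dimension: plain interpolation with a globally supported Gaussian basis. It is far simpler than the paper's localized Galerkin discretization (no special quadrature routine, no nonlocal operator); numpy and the shape parameter value are assumptions.

```python
import numpy as np

def rbf_fit(x, y, eps=3.0):
    """Solve for weights w so that sum_j w_j * exp(-(eps*|x_i - x_j|)^2) = y_i."""
    phi = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)
    return np.linalg.solve(phi, y)

def rbf_eval(x_train, w, x_new, eps=3.0):
    """Evaluate the fitted RBF expansion at new points."""
    phi = np.exp(-(eps * (x_new[:, None] - x_train[None, :])) ** 2)
    return phi @ w
```

With a globally supported basis the collocation matrix is dense and can be badly conditioned; localizing the basis, as the paper does, is what yields a sparse, well-conditioned stiffness matrix.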

  1. Interim Basis for PCB Sampling and Analyses

    SciTech Connect (OSTI)

    BANNING, D.L.

    2001-03-20

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).

  2. Interim Basis for PCB Sampling and Analyses

    SciTech Connect (OSTI)

    BANNING, D.L.

    2001-01-18

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).

  3. Time Variant Floating Mean Counting Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-06-03

    This software was written to test a time variant floating mean counting algorithm. The algorithm was developed by Westinghouse Savannah River Company, and a provisional patent has been filed on the algorithm. The test software was developed to work with the Val Tech model IVB prototype version II count rate meter hardware. The test software was used to verify that the algorithm developed by WSRC could be correctly implemented with the vendor's hardware.

  4. Office of Nuclear Safety Basis and Facility Design

    Broader source: Energy.gov [DOE]

    The Office of Nuclear Safety Basis & Facility Design establishes safety basis and facility design requirements and expectations related to analysis and design of nuclear facilities to ensure protection of workers and the public from the hazards associated with nuclear operations.

  5. Enterprise Assessments Targeted Review of the Safety Basis at...

    Office of Environmental Management (EM)

    Basis at the Savannah River Site F-Area Central Laboratory Facility - January 2016 Enterprise Assessments Targeted Review of the Safety Basis at the Savannah River Site F-Area ...

  6. Authorization basis status report (miscellaneous TWRS facilities, tanks and components)

    SciTech Connect (OSTI)

    Stickney, R.G.

    1998-04-29

    This report presents the results of a systematic evaluation conducted to identify miscellaneous TWRS facilities, tanks and components with potential needed authorization basis upgrades. It provides the Authorization Basis upgrade plan for those miscellaneous TWRS facilities, tanks and components identified.

  7. Review and Approval of Nuclear Facility Safety Basis and Safety...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    104-2014, Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents by Website Administrator This Standard describes a framework and the criteria to be...

  8. CRAD, Integrated Safety Basis and Engineering Design Review ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integrated Safety Basis and Engineering Design Review - August 20, 2014 (EA CRAD 31-4, Rev. 0) CRAD, Integrated Safety Basis and Engineering Design Review - August 20, 2014 (EA...

  9. PARFUME Theory and Model basis Report

    SciTech Connect (OSTI)

    Darrell L. Knudson; Gregory K. Miller; D.A. Petti; J.T. Maki

    2009-09-01

    The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.

  10. Kinetically balanced Gaussian basis-set approach to relativistic Compton profiles of atoms

    SciTech Connect (OSTI)

    Jaiswal, Prerit; Shukla, Alok

    2007-02-15

    Atomic Compton profiles (CPs) are an important property that provides information about the momentum distribution of atomic electrons. For the CPs of heavy atoms, relativistic effects are therefore expected to be important, warranting a relativistic treatment of the problem. In this paper, we present an efficient approach aimed at ab initio calculations of atomic CPs within a Dirac-Hartree-Fock (DHF) formalism, employing kinetically balanced Gaussian basis functions. The approach is used to compute the CPs of the noble gases ranging from He to Rn, and the results have been compared to experimental and other theoretical data, wherever possible. The influence of the quality of the basis set on the calculated CPs has also been systematically investigated.

  11. Hamiltonian Light-front Field Theory Within an AdS/QCD Basis

    SciTech Connect (OSTI)

    Vary, J.P.; Honkanen, H.; Li, Jun; Maris, P.; Brodsky, S.J.; Harindranath, A.; de Teramond, G.F.; Sternberg, P.; Ng, E.G.; Yang, C.; /LBL, Berkeley

    2009-12-16

    Non-perturbative Hamiltonian light-front quantum field theory presents opportunities and challenges that bridge particle physics and nuclear physics. Fundamental theories, such as Quantum Chromodynamics (QCD) and Quantum Electrodynamics (QED), offer the promise of great predictive power spanning phenomena on all scales from the microscopic to the cosmic, but new tools that do not rely exclusively on perturbation theory are required to make the connection from one scale to the next. We outline recent theoretical and computational progress to build these bridges and provide illustrative results for nuclear structure and quantum field theory. As our framework we choose light-front gauge and a basis function representation with a two-dimensional harmonic oscillator basis for the transverse modes, corresponding to the eigensolutions of the soft-wall AdS/QCD model obtained from light-front holography.

  12. CRAD, Engineering Design and Safety Basis- December 22, 2009

    Broader source: Energy.gov [DOE]

    Engineering Design and Safety Basis Inspection Criteria, Inspection Activities, and Lines of Inquiry (HSS CRAD 64-19, Rev. 0)

  13. Theoretical aspects of light meson spectroscopy

    SciTech Connect (OSTI)

    Barnes, T.

    1995-12-31

    In this pedagogical review the authors discuss the theoretical understanding of light hadron spectroscopy in terms of QCD and the quark model. They begin with a summary of the known and surmised properties of QCD and confinement. Following this they review the nonrelativistic quark potential model for q{anti q} mesons and discuss the quarkonium spectrum and methods for identifying q{anti q} states. Finally, they review theoretical expectations for non-q{anti q} states (glueballs, hybrids and multiquark systems) and the status of experimental candidates for these states.

  14. Theoretical studies of chemical reaction dynamics

    SciTech Connect (OSTI)

    Schatz, G.C.

    1993-12-01

    This collaborative program with the Theoretical Chemistry Group at Argonne involves theoretical studies of gas phase chemical reactions and related energy transfer and photodissociation processes. Many of the reactions studied are of direct relevance to combustion; others are selected because they provide important examples of special dynamical processes, or are of relevance to experimental measurements. Both classical trajectory and quantum reactive scattering methods are used for these studies, and the types of information determined range from thermal rate constants to state-to-state differential cross sections.

  15. Volume-preserving algorithm for secular relativistic dynamics of charged particles

    SciTech Connect (OSTI)

    Zhang, Ruili; Liu, Jian; Wang, Yulei; He, Yang; Qin, Hong; Sun, Yajuan

    2015-04-15

    Secular dynamics of relativistic charged particles has theoretical significance and a wide range of applications. However, conventional algorithms are not applicable to this problem due to the coherent accumulation of numerical errors. To overcome this difficulty, we develop a volume-preserving algorithm (VPA) with long-term accuracy and conservativeness via a systematic splitting method. Applied to the simulation of runaway electrons over a time span of more than 10 orders of magnitude, the VPA generates accurate results and enables the discovery of new physics for secular runaway dynamics.

  16. Final Technical Report "Multiscale Simulation Algorithms for Biochemical Systems"

    SciTech Connect (OSTI)

    Petzold, Linda R.

    2012-10-25

    Biochemical systems are inherently multiscale and stochastic. In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. An analysis tool that respects these dynamical characteristics is the stochastic simulation algorithm (SSA, Gillespie, 1976), a numerical simulation procedure that is essentially exact for chemical systems that are spatially homogeneous or well stirred. Despite recent improvements, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the multiscale nature of the underlying problem: (1) stiffness, i.e. the presence of multiple timescales, the fastest of which are stable; and (2) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). This project has focused on the development of fast and adaptive algorithms, and the fundamental theory upon which they must be based, for the multiscale simulation of biochemical systems. Areas addressed by this project include: (1) Theoretical and practical foundations for accelerated discrete stochastic simulation (tau-leaping); (2) Dealing with stiffness (fast reactions) in an efficient and well-justified manner in discrete stochastic simulation; (3) Development of adaptive multiscale algorithms for spatially homogeneous discrete stochastic simulation; (4) Development of high-performance SSA algorithms.
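
The event-by-event simulation the report builds on can be illustrated with a minimal direct-method SSA for a single first-order decay channel. This is a textbook sketch of Gillespie's direct method, not the project's multiscale algorithms; the rate constant, molecule count, and function names are made-up example values.

```python
import random

def ssa_decay(n0, k, t_end, seed=1):
    """Gillespie direct method for the single channel A -> 0 with rate k.

    Returns the (time, count) trajectory, one entry per reaction event.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    path = [(t, n)]
    while n > 0:
        a = k * n                  # total propensity of the one channel
        t += rng.expovariate(a)    # exponential waiting time to next event
        if t > t_end:
            break
        n -= 1                     # fire the decay reaction
        path.append((t, n))
    return path
```

Every reaction event costs one loop iteration, which is exactly why the SSA becomes expensive when fast reactions fire millions of times; tau-leaping amortizes many events per step.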

  17. Experimental and Theoretical Investigation of Lubricant and Additive...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Theoretical Investigation of Lubricant and Additive Effects on Engine Friction Experimental and Theoretical Investigation of Lubricant and Additive Effects on Engine Friction ...

  18. Efficient Theoretical Screening of Solid Sorbents for CO2 Capture...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Efficient Theoretical Screening of Solid Sorbents for CO2 Capture Applications* Citation Details In-Document Search Title: Efficient Theoretical Screening of Solid ...

  19. ITP Steel: Theoretical Minimum Energies to Produce Steel for...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 PDF ...

  20. Theoretical Synthesis of Mixed Materials for CO2 Capture Applications...

    Office of Scientific and Technical Information (OSTI)

    Conference: Theoretical Synthesis of Mixed Materials for CO2 Capture Applications Citation Details In-Document Search Title: Theoretical Synthesis of Mixed Materials for CO2 ...

  1. Toward Catalyst Design from Theoretical Calculations (464th Brookhaven...

    Office of Scientific and Technical Information (OSTI)

    Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) Citation Details In-Document Search Title: Toward Catalyst Design from Theoretical Calculations...

  2. Research in theoretical nuclear and neutrino physics. Final report...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Research in theoretical nuclear and neutrino physics. Final report Citation Details In-Document Search Title: Research in theoretical nuclear and neutrino ...

  3. Research in theoretical nuclear and neutrino physics. Final report...

    Office of Scientific and Technical Information (OSTI)

    Research in theoretical nuclear and neutrino physics. Final report Citation Details In-Document Search Title: Research in theoretical nuclear and neutrino physics. Final report The ...

  4. Catalysis by Design - Theoretical and Experimental Studies of...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx Treatment Catalysis by Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx ...

  5. Operando Raman and Theoretical Vibration Spectroscopy of Non...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Operando Raman and Theoretical Vibration Spectroscopy of Non-PGM Catalysts Operando Raman and Theoretical Vibration Spectroscopy of Non-PGM Catalysts Presentation about...

  6. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2010-01-01

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of the dosimeter design under field conditions, and it makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL’s Hanford External Dosimetry Program (HEDP) which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL’s Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.
Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2. Revision Log: Rev. 0 (2/25/2005) Major revision and expansion. Rev. 0.1 (3/12/2007) Updated Chapters 5, 6 and 9 to reflect change in default ring calibration factor used in HEDP dose calculation software. Factor changed from 1.5 to 2.0 beginning January 1, 2007. Pages on which changes were made are as follows: 5.23, 5.69, 5.78, 5.80, 5.82, 6.3, 6.5, 6.29, and 9.2. Rev 0.2 (8/28/2009) Updated Chapters 3, 5, 6, 8 and 9. Chapters 6 and 8 were significantly expanded. References in the Preface and Chapters 1, 2, 4, and 7 were updated to reflect updates to DOE documents. Approved by HPDAC on 6/2/2009. Rev 1.0 (1/1/2010) Major revision. Updated all chapters to reflect the Hanford site wide implementation on January 1, 2010 of new DOE requirements for occupational radiation protection. The new requirements are given in the June 8, 2007 amendment to 10 CFR 835 Occupational Radiation Protection (Federal Register, June 8, 2007. Title 10 Part 835. U.S., Code of Federal Regulations, Vol. 72, No. 110, 31904-31941). Revision 1.0 to the manual replaces ICRP 26 dosimetry concepts and terminology with ICRP 60 dosimetry concepts and terminology and replaces external dose conversion factors from ICRP 51 with those from ICRP 74 for use in measurement of operational quantities with dosimeters. Descriptions of dose algorithms and dosimeter response characteristics, and field performance were updated to reflect changes in the neutron quality factors used in the measurement of operational quantities.

  7. A new paradigm for the molecular basis of rubber elasticity

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Hanson, David E.; Barber, John L.

    2015-02-19

The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumptions that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains that completely span the network along the strain axis emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain.
Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.
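The entropic-force ansatz the abstract critiques is the standard Gaussian-chain result; for reference, in its textbook form (not taken from this record), a chain of N statistical segments of length b has

```latex
P(r) \propto \exp\!\left(-\frac{3r^{2}}{2Nb^{2}}\right), \qquad
S(r) = S_{0} - \frac{3k\,r^{2}}{2Nb^{2}}, \qquad
f = -T\,\frac{\partial S}{\partial r} = \frac{3kT}{Nb^{2}}\,r .
```

The retractive force is linear in the end-to-end distance r; it is exactly this entropy-only, Gaussian-statistics picture that the new paradigm replaces with molecular-level entropic and enthalpic extension forces.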

  8. Nuclear Physics Division Theoretical Study Division

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

CEBIT 67-18 Nuclear Physics Division Theoretical Study Division 11 July 1967 ORGANISATION EUROPEENNE POUR LA RECHERCHE NUCLEAIRE C E R N EUROPEAN ORGANIZATION FOR NUCLEAR RESEARCH THE K°K° SYSTEM G. Charpak, CERN, Geneva, Switzerland, and M. Gourdin, Faculté des Sciences, Orsay, France. Lectures delivered at the Matscience Institute, Madras, India, December 1966 and January 1967 G E N E V A 1967 (C) Copyright CERN, Genève, 1967 Propriété littéraire et scientifique réservée pour tous les

  9. Student's algorithm solves real-world problem

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Student's algorithm solves real-world problem Supercomputing Challenge: students learn how to use powerful computers to analyze, model, and solve real-world problems. April 3, 2012. Jordon Medlock of Albuquerque's Manzano High School won the 2012 Lab-sponsored Supercomputing Challenge by creating a computer algorithm that automates the process of

  10. Enterprise Assessments Review of the Delegation of Safety Basis Approval

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Authority for Hazard Category 1, 2, and 3 Nuclear Facilities - April 2016 | Department of Energy Delegation of Safety Basis Approval Authority for Hazard Category 1, 2, and 3 Nuclear Facilities - April 2016 Enterprise Assessments Review of the Delegation of Safety Basis Approval Authority for Hazard Category 1, 2, and 3 Nuclear Facilities - April 2016 April 2016 Enterprise Assessments Review of the Delegation of Safety Basis Approval Authority for Hazard Category 1, 2, and 3 Nuclear

  11. Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting

    Energy Savers [EERE]

    Implementation of Operating Experience Report 2013-01 | Department of Energy Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 April, 2013 Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 To support the

  12. ORISE: The Medical Basis for Radiation-Accident Preparedness: Medical

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Management (Published by REAC/TS) The Medical Basis for Radiation-Accident Preparedness: Medical Management Proceedings of the Fifth International REAC/TS Symposium on the Medical Basis for Radiation-Accident Preparedness and the Biodosimetry Workshop As part of its mission to provide continuing education for personnel responsible for treating radiation injuries, REAC/TS hosted the Fifth International REAC/TS Symposium on the Medical Basis for Radiation-Accident Preparedness symposium and

  13. Solar Position Algorithm (SPA) - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Thermal Solar Thermal Energy Analysis Energy Analysis Find More Like This Return to Search Solar Position Algorithm (SPA) National Renewable Energy Laboratory Contact NREL About ...

  14. Java implementation of Class Association Rule algorithms

    Energy Science and Technology Software Center (OSTI)

    2007-08-30

Java implementation of three Class Association Rule mining algorithms: NETCAR, CARapriori, and clustering-based rule mining. The NETCAR algorithm is a novel algorithm developed by Makio Tamura. The algorithm is discussed in a paper, UCRL-JRNL-232466-DRAFT, and would be published in a peer-reviewed scientific journal. The software is used to extract combinations of genes relevant to a phenotype from a phylogenetic profile and a phenotype profile. The phylogenetic profile is represented by a binary matrix and the phenotype profile by a binary vector. The present application of this software will be in genome analysis; however, it could be applied more generally.
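The mining task described (binary phylogenetic profiles versus a binary phenotype vector) can be sketched with a generic support/confidence rule miner. This is not NETCAR; the function, thresholds, and toy data below are illustrative assumptions only.

```python
from itertools import combinations

# Generic class-association-rule sketch: find gene combinations whose
# joint presence across organisms predicts a binary phenotype.
# profiles: gene -> presence/absence vector over organisms (binary matrix rows)
# phenotype: binary vector over the same organisms
def mine_rules(profiles, phenotype, max_genes=2, min_conf=0.75):
    genes = list(profiles)
    n = len(phenotype)
    rules = []
    for k in range(1, max_genes + 1):
        for combo in combinations(genes, k):
            # organisms where every gene in the combination is present
            present = [all(profiles[g][i] for g in combo) for i in range(n)]
            support = sum(present)
            if not support:
                continue
            hits = sum(p and ph for p, ph in zip(present, phenotype))
            conf = hits / support          # confidence of combo -> phenotype
            if conf >= min_conf:
                rules.append((combo, support, conf))
    return rules

# toy data: phenotype occurs exactly where genes A and B co-occur
profiles = {"A": [1, 1, 0, 1, 0], "B": [1, 0, 1, 1, 0], "C": [0, 1, 1, 0, 1]}
phenotype = [1, 0, 0, 1, 0]
rules = mine_rules(profiles, phenotype)    # -> [(("A", "B"), 2, 1.0)]
```

Here only the pair (A, B) clears the confidence threshold; each gene alone co-occurs with the phenotype too weakly.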

  15. Technical Planning Basis - DOE Directives, Delegations, and Requiremen...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

2, Technical Planning Basis by David Freshwater Functional areas: Defense Nuclear Facility Safety and Health Requirement, Safety and Security. The Guide assists DOE/NNSA field...

  16. Protocol for Enhanced Evaluations of Beyond Design Basis Events...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 Protocol for Enhanced Evaluations of Beyond Design...

  17. CRAD, Review of Safety Basis Development- January 31, 2013

    Broader source: Energy.gov [DOE]

    Review of Safety Basis Development for the Savannah River Site Salt Waste Processing Facility - Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-57, Rev. 0)

  18. Assessing Beyond Design Basis Seismic Events and Implications...

    Office of Environmental Management (EM)

    Defense Nuclear Facilities Safety Board Topics Covered: Department of Energy Approach to Natural Phenomena Hazards Analysis and Design (Seismic) Design Basis and Beyond Design...

  19. Structural and Functional Basis for Broad-spectrum Neutralization...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Structural and Functional Basis for Broad-spectrum Neutralization of Avian and Human ... unsuccessful. figure 2 Fig 2A. Broad spectrum neutralizing antibody F10 in complex ...

  20. Theoretical, Methodological, and Empirical Approaches to Cost Savings: A Compendium

    SciTech Connect (OSTI)

    M Weimar

    1998-12-10

This publication summarizes and contains the original documentation for understanding why the U.S. Department of Energy's (DOE's) privatization approach provides cost savings and the different approaches that could be used in calculating cost savings for the Tank Waste Remediation System (TWRS) Phase I contract. The initial section summarizes the approaches in the different papers. The appendices are the individual source papers which have been reviewed by individuals outside of the Pacific Northwest National Laboratory and the TWRS Program. Appendix A provides a theoretical basis for and estimate of the level of savings that can be obtained from a fixed-price contract with performance risk maintained by the contractor. Appendix B provides the methodology for determining cost savings when comparing a fixed-price contractor with a Management and Operations (M&O) contractor (cost-plus contractor). Appendix C summarizes the economic model used to calculate cost savings and provides hypothetical output from preliminary calculations. Appendix D provides the summary of the approach for the DOE-Richland Operations Office (RL) estimate of the M&O contractor to perform the same work as BNFL Inc. Appendix E contains information on cost growth and per-metric-ton-of-glass costs for high-level waste at two other DOE sites, West Valley and Savannah River. Appendix F addresses a risk allocation analysis of the BNFL proposal that indicates that the current approach is still better than the alternative.

  1. Final Report: Sublinear Algorithms for In-situ and In-transit Data Analysis at Exascale.

    SciTech Connect (OSTI)

    Bennett, Janine Camille; Pinar, Ali; Seshadhri, C.; Thompson, David; Salloum, Maher; Bhagatwala, Ankit; Chen, Jacqueline H.

    2015-09-01

Post-Moore's law scaling is creating a disruptive shift in simulation workflows, as saving the entirety of raw data to persistent storage becomes expensive. We are moving away from a post-process centric data analysis paradigm towards a concurrent analysis framework, in which raw simulation data is processed as it is computed. Algorithms must adapt to machines with extreme concurrency, low communication bandwidth, and high memory latency, while operating within the time constraints prescribed by the simulation. Furthermore, input parameters are often data dependent and cannot always be prescribed. The study of sublinear algorithms is a recent development in theoretical computer science and discrete mathematics that has significant potential to provide solutions for these challenges. The approaches of sublinear algorithms address the fundamental mathematical problem of understanding global features of a data set using limited resources. These theoretical ideas align with practical challenges of in-situ and in-transit computation where vast amounts of data must be processed under severe communication and memory constraints. This report details key advancements made in applying sublinear algorithms in-situ to identify features of interest and to enable adaptive workflows over the course of a three year LDRD. Prior to this LDRD, there was no precedent in applying sublinear techniques to large-scale, physics based simulations. This project has definitively demonstrated their efficacy at mitigating high performance computing challenges and highlighted the rich potential for follow-on research opportunities in this space.
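The core sublinear idea, understanding a global feature of a data set from a small random sample, can be illustrated concretely. The Hoeffding-style sample-size bound and all names below are textbook material and illustrative assumptions, not the LDRD's actual algorithms.

```python
import math
import random

def sample_size(eps, delta):
    # Hoeffding bound: n >= ln(2/delta) / (2*eps^2) samples suffice to
    # estimate a [0,1]-valued fraction within eps, with prob. >= 1 - delta.
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def estimate_fraction(field, threshold, eps=0.02, delta=0.01, rng=None):
    # Estimate the fraction of cells above `threshold` by inspecting only
    # a sublinear random sample instead of streaming the whole array.
    rng = rng or random.Random(0)
    n = min(sample_size(eps, delta), len(field))
    hits = sum(field[rng.randrange(len(field))] > threshold for _ in range(n))
    return hits / n

# toy "simulation field": one million cells, exactly 10% above threshold
field = [1.0] * 100_000 + [0.0] * 900_000
est = estimate_fraction(field, 0.5)   # ~0.1, from only a few thousand probes
```

The sample size depends only on the requested accuracy, not on the field size, which is what makes the work per estimate sublinear in the data.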

  2. Petascale algorithms for reactor hydrodynamics.

    SciTech Connect (OSTI)

    Fischer, P.; Lottes, J.; Pointer, W. D.; Siegel, A.

    2008-01-01

We describe recent algorithmic developments that have enabled large eddy simulations of reactor flows on up to P = 65,000 processors on the IBM BG/P at the Argonne Leadership Computing Facility. Petascale computing is expected to play a pivotal role in the design and analysis of next-generation nuclear reactors. Argonne's SHARP project is focused on advanced reactor simulation, with a current emphasis on modeling coupled neutronics and thermal-hydraulics (TH). The TH modeling comprises a hierarchy of computational fluid dynamics approaches ranging from detailed turbulence computations, using DNS (direct numerical simulation) and LES (large eddy simulation), to full core analysis based on RANS (Reynolds-averaged Navier-Stokes) and subchannel models. Our initial study is focused on LES of sodium-cooled fast reactor cores. The aim is to leverage petascale platforms at DOE's Leadership Computing Facilities (LCFs) to provide detailed information about heat transfer within the core and to provide baseline data for less expensive RANS and subchannel models.

  3. Initial borehole acoustic televiewer data processing algorithms

    SciTech Connect (OSTI)

    Moore, T.K.

    1988-06-01

    With the development of a new digital televiewer, several algorithms have been developed in support of off-line data processing. This report describes the initial set of utilities developed to support data handling as well as data display. Functional descriptions, implementation details, and instructions for use of the seven algorithms are provided. 5 refs., 33 figs., 1 tab.

  4. The double-beta decay: Theoretical challenges

    SciTech Connect (OSTI)

    Horoi, Mihai

    2012-11-20

Neutrinoless double beta decay is a unique process that could reveal physics beyond the Standard Model of particle physics: namely, if observed, it would prove that neutrinos are Majorana particles. In addition, it could provide information regarding the neutrino masses and their hierarchy, provided that reliable nuclear matrix elements can be obtained. Two-neutrino double beta decay is an associated process that is allowed by the Standard Model, and it has been observed in about ten nuclei. The present contribution gives a brief review of the theoretical challenges associated with these two processes, emphasizing the reliable calculation of the associated nuclear matrix elements.

  5. Theoretical Screening of Mixed Solid Sorbent for

    Office of Scientific and Technical Information (OSTI)

Extended Abstract of 2014 AIChE Spring Meeting, New Orleans, LA, Mar. 30-Apr. 2, 2014. Theoretical Screening of Mixed Solid Sorbent for Applications to CO2 Capture Technology. Yuhua Duan, National Energy Technology Laboratory, United States Department of Energy, Pittsburgh, Pennsylvania 15236, USA. Abstract: Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for development of new materials

  6. THEORETICAL STUDIES OF HADRONS AND NUCLEI

    SciTech Connect (OSTI)

    STEPHEN R COTANCH

    2007-03-20

This report details final research results obtained during the 9 year period from June 1, 1997 through July 15, 2006. The research project, entitled "Theoretical Studies of Hadrons and Nuclei", was supported by grant DE-FG02-97ER41048 between North Carolina State University [NCSU] and the U. S. Department of Energy [DOE]. In compliance with grant requirements the Principal Investigator [PI], Professor Stephen R. Cotanch, conducted a theoretical research program investigating hadrons and nuclei and devoted to this program 50% of his time during the academic year and 100% of his time in the summer. Highlights of new, significant research results are briefly summarized in the following three sections corresponding to the respective sub-programs of this project (hadron structure, probing hadrons and hadron systems electromagnetically, and many-body studies). Recent progress is also discussed in a recent renewal/supplemental grant proposal submitted to DOE. Finally, full detailed descriptions of completed work can be found in the publications listed at the end of this report.

  7. Structural basis for the antibody neutralization of Herpes simplex virus

    Office of Scientific and Technical Information (OSTI)

    (Journal Article) | SciTech Connect Structural basis for the antibody neutralization of Herpes simplex virus Citation Details In-Document Search Title: Structural basis for the antibody neutralization of Herpes simplex virus The gD-E317-Fab complex crystal revealed the conformational epitope of human mAb E317 on HSV gD, providing a molecular basis for understanding the viral neutralization mechanism. Glycoprotein D (gD) of Herpes simplex virus (HSV) binds to a host cell surface receptor,

  8. Structural basis for ubiquitin-mediated antiviral signal activation...

    Office of Scientific and Technical Information (OSTI)

    Title: Structural basis for ubiquitin-mediated antiviral signal activation by RIG-I Authors: Peisley, Alys ; Wu, Bin ; Xu, Hui ; Chen, Zhijian J. ; Hur , Sun 1 ; HHMI) 2 ; ...

  9. The Three-Dimensional Structural Basis of Type II Hyperprolinemia...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: The Three-Dimensional Structural Basis of Type II Hyperprolinemia Citation Details In-Document Search ... Here, we report the first structure of human P5CDH (HsP5CDH) ...

  10. CRAD, Review of Safety Basis Development- October 11, 2012

    Broader source: Energy.gov [DOE]

    Review of Safety Basis Development for the Y-12 National Security Complex Uranium Processing Facility Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-55, Rev. 0)

  11. Enterprise Assessments Targeted Review of the Safety Basis at...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... in SAR 3.3.2.3.1 does not address the effect of an earthquake followed by a fire. ... Although the accident analysis is generally sound, the analysis of the design basis ...

  12. Structural basis for the antibody neutralization of Herpes simplex...

    Office of Scientific and Technical Information (OSTI)

    of Herpes simplex virus Citation Details In-Document Search Title: Structural basis for the antibody neutralization of Herpes simplex virus The gD-E317-Fab complex ...

  13. General Engineer/Physical Scientist (Safety Basis Engineer/Scientist)

    Broader source: Energy.gov [DOE]

    A successful candidate in this position will serve as an authority in the safety basis functional area. The incumbent is responsible for managing, coordinating, and authorizing work in the context...

  14. Computational and Theoretical Chemistry | U.S. DOE Office of...

    Office of Science (SC) Website

    Computational and Theoretical Chemistry Chemical Sciences, Geosciences, & Biosciences ... Molecular Sciences and Gas Phase Chemical Physics programs-which together comprise ...

  15. Advanced Test Reactor Design Basis Reconstitution Project Issue Resolution Process

    SciTech Connect (OSTI)

    Steven D. Winter; Gregg L. Sharp; William E. Kohn; Richard T. McCracken

    2007-05-01

The Advanced Test Reactor (ATR) Design Basis Reconstitution Program (DBRP) is a structured assessment and reconstitution of the design basis for the ATR. The DBRP is designed to establish and document the ties between the Documented Safety Analysis (DSA), design basis, and actual system configurations. Where the DBRP assessment team cannot establish a link between these three major elements, a gap is identified. Resolutions to identified gaps represent configuration management and design basis recovery actions. The proposed paper discusses the process being applied to define, evaluate, report, and address gaps that are identified through the ATR DBRP. Design basis verification may be performed or required for a nuclear facility safety basis on various levels. The process is applicable to large-scale design basis reconstitution efforts, such as the ATR DBRP, or may be scaled for application on smaller projects. The concepts are applicable to long-term maintenance of a nuclear facility safety basis and recovery of degraded safety basis components. The ATR DBRP assessment team has observed numerous examples where a clear and accurate link between the DSA, design basis, and actual system configuration was not immediately identifiable in supporting documentation. As a result, a systematic approach to effectively document, prioritize, and evaluate each observation is required. The DBRP issue resolution process provides direction for consistent identification, documentation, categorization, and evaluation, and where applicable, entry into the determination process for a potential inadequacy in the safety analysis (PISA). The issue resolution process is a key element for execution of the DBRP. Application of the process facilitates collection, assessment, and reporting of issues identified by the DBRP team. Application of the process results in an organized database of safety basis gaps and prioritized corrective action planning and resolution.
The DBRP team follows the ATR DBRP issue resolution process, which provides a method for the team to promptly sort and prioritize questions and issues between those that can be addressed as a normal part of the reconstitution project and those that are to be handled as PISAs. Presentation of the DBRP issue resolution process provides an example for similar activities that may be required at other facilities within the Department of Energy complex.

  16. Structural and Functional Basis for Inhibition of Erythrocyte Invasion by

    Office of Scientific and Technical Information (OSTI)

    Antibodies that Target Plasmodium falciparum EBA-175 (Journal Article) | SciTech Connect Journal Article: Structural and Functional Basis for Inhibition of Erythrocyte Invasion by Antibodies that Target Plasmodium falciparum EBA-175 Citation Details In-Document Search Title: Structural and Functional Basis for Inhibition of Erythrocyte Invasion by Antibodies that Target Plasmodium falciparum EBA-175 Authors: Chen, Edwin ; Paing, May M. ; Salinas, Nichole ; Sim, B. Kim Lee ; Tolia, Niraj H.

  17. Structural basis for biomolecular recognition in overlapping binding sites

    Office of Scientific and Technical Information (OSTI)

    in a diiron enzyme system (Journal Article) | SciTech Connect Structural basis for biomolecular recognition in overlapping binding sites in a diiron enzyme system Citation Details In-Document Search Title: Structural basis for biomolecular recognition in overlapping binding sites in a diiron enzyme system Authors: Acheson, Justin F. ; Bailey, Lucas J. ; Elsen, Nathaniel L. ; Fox, Brian G. [1] + Show Author Affiliations UW Publication Date: 2016-01-22 OSTI Identifier: 1229904 Resource Type:

  18. Structural basis of JAZ repression of MYC transcription factors in

    Office of Scientific and Technical Information (OSTI)

    jasmonate signalling (Journal Article) | SciTech Connect Journal Article: Structural basis of JAZ repression of MYC transcription factors in jasmonate signalling Citation Details In-Document Search Title: Structural basis of JAZ repression of MYC transcription factors in jasmonate signalling Authors: Zhang, Feng ; Yao, Jian ; Ke, Jiyuan ; Zhang, Li ; Lam, Vinh Q. ; Xin, Xiu-Fang ; Zhou, X. Edward ; Chen, Jian ; Brunzelle, Joseph ; Griffin, Patrick R. ; Zhou, Mingguo ; Xu, H. Eric ; Melcher,

  19. Structural basis for Tetrahymena telomerase processivity factor Teb1

    Office of Scientific and Technical Information (OSTI)

    binding to single-stranded telomeric-repeat DNA (Journal Article) | SciTech Connect Journal Article: Structural basis for Tetrahymena telomerase processivity factor Teb1 binding to single-stranded telomeric-repeat DNA Citation Details In-Document Search Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 binding to single-stranded telomeric-repeat DNA Authors: Zeng, Zhixiong ; Min, Bosun ; Huang, Jing ; Hong, Kyungah ; Yang, Yuting ; Collins, Kathleen ; Lei, Ming [1]

  20. Structural basis for substrate specificity in the Escherichia coli maltose

    Office of Scientific and Technical Information (OSTI)

    transport system (Journal Article) | SciTech Connect Structural basis for substrate specificity in the Escherichia coli maltose transport system Citation Details In-Document Search Title: Structural basis for substrate specificity in the Escherichia coli maltose transport system Authors: Oldham, Michael L. ; Chen, Shanshuang ; Chen, Jue [1] ; HHMI) [2] + Show Author Affiliations (Purdue) [Purdue ( Publication Date: 2013-11-11 OSTI Identifier: 1105053 Resource Type: Journal Article Resource

  1. Atomic substitution reveals the structural basis for substrate adenine

    Office of Scientific and Technical Information (OSTI)

    recognition and removal by adenine DNA glycosylase (Journal Article) | SciTech Connect Atomic substitution reveals the structural basis for substrate adenine recognition and removal by adenine DNA glycosylase Citation Details In-Document Search Title: Atomic substitution reveals the structural basis for substrate adenine recognition and removal by adenine DNA glycosylase Adenine DNA glycosylase catalyzes the glycolytic removal of adenine from the promutagenic A {center_dot} oxoG base pair in

  2. Advanced Imaging Algorithms for Radiation Imaging Systems

    SciTech Connect (OSTI)

    Marleau, Peter

    2015-10-01

The intent of the proposed work, in collaboration with the University of Michigan, is to develop the algorithms that will bring the analysis from qualitative images to quantitative attributes of objects containing SNM. The first step to achieving this is to develop an in-depth understanding of the intrinsic errors associated with the deconvolution and MLEM algorithms. A significant new effort will be undertaken to relate the image data to a posited three-dimensional model of geometric primitives that can be adjusted to get the best fit. In this way, parameters of the model such as sizes, shapes, and masses can be extracted for both radioactive and non-radioactive materials. This model-based algorithm will need the integrated response of a hypothesized configuration of material to be calculated many times. As such, both the MLEM and the model-based algorithm require significant increases in calculation speed in order to converge to solutions in practical amounts of time.
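The MLEM (maximum-likelihood expectation-maximization) iteration mentioned above can be sketched in a few lines; the system matrix, toy sizes, and iteration count below are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    # Multiplicative MLEM update: x_j <- x_j * (A^T (y / A x))_j / (A^T 1)_j
    # A: system matrix mapping source pixels to detector counts; y: counts.
    x = np.ones(A.shape[1])            # flat initial estimate
    sens = A.sum(axis=0)               # per-pixel sensitivity, A^T 1
    for _ in range(n_iter):
        proj = A @ x                   # forward projection of current estimate
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# toy problem: 2-pixel source imaged by 3 detectors, noise-free counts
A = np.array([[0.8, 0.1],
              [0.1, 0.8],
              [0.1, 0.1]])
x_true = np.array([4.0, 1.0])
y = A @ x_true
x_est = mlem(A, y, n_iter=200)         # converges toward x_true
```

The multiplicative form keeps the estimate nonnegative, which is why MLEM is popular for emission imaging; its slow convergence is one reason the report calls for significant increases in calculation speed.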

  3. Advanced CHP Control Algorithms: Scope Specification

    SciTech Connect (OSTI)

    Katipamula, Srinivas; Brambley, Michael R.

    2006-04-28

    The primary objective of this multiyear project is to develop algorithms for combined heat and power systems to ensure optimal performance, increase reliability, and lead to the goal of clean, efficient, reliable and affordable next generation energy systems.

  4. Tracking Algorithm for Multi- Dimensional Array Transposition

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Yun (Helen) He, SC2002. MPI and OpenMP Paradigms on Cluster of SMP Architectures: the Vacancy Tracking Algorithm for Multi-Dimensional Array Transposition. Yun (Helen) He...

  5. Drainage Algorithm for Geospatial Knowledge

    Energy Science and Technology Software Center (OSTI)

    2006-08-15

The Pacific Northwest National Laboratory (PNNL) has developed a prototype stream extraction algorithm that semi-automatically extracts and characterizes streams using a variety of multisensor imagery and digital terrain elevation data (DTED). The system is currently optimized for three types of single-band imagery: radar, visible, and thermal. Method of Solution: DRAGON (1) classifies pixels into clumps of water objects based on the classification of water pixels by spectral signatures and neighborhood relationships, (2) uses the morphology operations erosion and dilation to separate out large lakes (or embayments), isolated lakes, ponds, wide rivers, and narrow rivers, and (3) translates the river objects into vector objects. In detail, the process can be broken down into the following steps. A. Water pixels are initially identified using the expected value range and slope values (if an optional DEM file is available). B. Erode to the distance that defines a large water body and then dilate back. The resulting mask can be used to identify large lake and embayment objects, which are then removed from the image. Since this operation can be time consuming, it is only performed if a simple test (i.e., a large box containing only water pixels can be found somewhere in the image) indicates a large water body is present. C. All water pixels are 'clumped' (in Imagine terminology, clumping connects touching pixels of a common classification), and clumps that do not contain pure water pixels (e.g., dark cloud shadows) are removed. D. The resulting true water pixels are clumped, and water objects that are too small (e.g., ponds) or isolated lakes (i.e., isolated objects with a small compactness ratio) are removed. Note that at this point lakes have been identified as a byproduct of the filtering process and can be output as vector layers if needed. E. At this point only river pixels are left in the image.
To separate out wide rivers, all objects in the image are eroded by the half-width of narrow rivers. This removes all narrow rivers and leaves only the core of wide rivers. This core is dilated out by the same distance to create a mask that is used with the original river image to separate the rivers into two images: narrow rivers and wide rivers. F. If the wide-river image contains small, isolated, short segments (less than 300 meters if the NGA criterion is used), these segments are transferred to the narrow-river file to be treated as parts of single-line rivers. G. The narrow-river file is optionally dilated and eroded. This 'closing' removes small islands, fills small gaps, and smooths the outline. H. The user also has the option of 'closing' objects in the wide-river file; this depends on the degree to which the user wants to remove small islands in the large rivers. I. To make the translation from raster to single vector easier, the objects in the narrow-river image are reduced to a single center line (i.e., thinned) with binary morphology operations.
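The erosion/dilation trick in step E can be sketched in a few lines. This is an illustrative reconstruction, not PNNL's DRAGON code: eroding a binary water mask by the half-width of narrow rivers removes them entirely, and dilating the survivors back yields a mask covering only the wide rivers.

```python
# Hedged sketch of step E: erode away narrow rivers, dilate the remaining
# wide-river core back, and use it as a mask to split the two classes.

def erode(mask, steps=1):
    """Binary erosion with a 3x3 (8-neighbor) structuring element."""
    for _ in range(steps):
        rows, cols = len(mask), len(mask[0])
        out = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                # A pixel survives only if itself and all 8 neighbors are set.
                out[r][c] = int(all(
                    0 <= r + dr < rows and 0 <= c + dc < cols and mask[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)))
        mask = out
    return mask

def dilate(mask, steps=1):
    """Binary dilation with the same 3x3 structuring element."""
    for _ in range(steps):
        rows, cols = len(mask), len(mask[0])
        out = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                out[r][c] = int(any(
                    0 <= r + dr < rows and 0 <= c + dc < cols and mask[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)))
        mask = out
    return mask

# Toy river mask: rows 0-1 form a narrow (2-pixel) river, rows 4-8 a wide one.
river = [[1] * 12 if r in (0, 1) or 4 <= r <= 8 else [0] * 12 for r in range(10)]
wide_core = erode(river, steps=1)       # narrow river vanishes, wide core remains
wide_mask = dilate(wide_core, steps=1)  # dilate back to cover the wide river
narrow = [[river[r][c] and not wide_mask[r][c] for c in range(12)] for r in range(10)]
```

The erosion distance (here one step) stands in for the half-width of narrow rivers in the real workflow.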

  6. Theoretical crystallography with the Advanced Visualization System

    SciTech Connect (OSTI)

    Younkin, C.R.; Thornton, E.N.; Nicholas, J.B.; Jones, D.R.; Hess, A.C.

    1993-05-01

    Space is an Application Visualization System (AVS) graphics module designed for crystallographic and molecular research. The program can handle molecules, two-dimensional periodic systems, and three-dimensional periodic systems, all referred to in the paper as models. Using several methods, the user can select atoms, groups of atoms, or entire molecules. Selections can be moved, copied, deleted, and merged. An important feature of Space is the crystallography component. The program allows the user to generate the unit cell from the asymmetric unit, manipulate the unit cell, and replicate it in three dimensions. Space includes the Buerger reduction algorithm which determines the asymmetric unit and the space group of highest symmetry of an input unit cell. Space also allows the user to display planes in the lattice based on Miller indices, and to cleave the crystal to expose the surface. The user can display important precalculated volumetric data in Space, such as electron densities and electrostatic surfaces. With a variety of methods, Space can compute the electrostatic potential of any chemical system based on input point charges.
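One of the crystallography operations described above, replicating the unit cell in three dimensions, reduces to adding integer lattice translations to fractional atomic coordinates. The sketch below is illustrative only (the function name and atom representation are assumptions, not the AVS module's API):

```python
# Hedged sketch of unit-cell replication: each atom, given in fractional
# coordinates, is copied once per cell of an na x nb x nc supercell.

def replicate(atoms, na, nb, nc):
    """atoms: list of (symbol, (x, y, z)) in fractional coordinates."""
    cells = []
    for i in range(na):
        for j in range(nb):
            for k in range(nc):
                for sym, (x, y, z) in atoms:
                    cells.append((sym, (x + i, y + j, z + k)))
    return cells

# A two-atom basis replicated into a 2x2x2 supercell: 2 atoms x 8 cells = 16.
basis = [("Na", (0.0, 0.0, 0.0)), ("Cl", (0.5, 0.5, 0.5))]
supercell = replicate(basis, 2, 2, 2)
```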

  7. New Design Methods and Algorithms for Multi-component Distillation...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Design Methods and Algorithms for Multi-component Distillation Processes New Design Methods and Algorithms for Multi-component Distillation Processes PDF icon multicomponent.pdf ...

  8. Theoretical priors on modified growth parametrisations

    SciTech Connect (OSTI)

Song, Yong-Seon; Hollenstein, Lukas; Caldera-Cabral, Gabriela; Koyama, Kazuya

    2010-04-01

Next generation surveys will observe the large-scale structure of the Universe with unprecedented accuracy. This will enable us to test the relationships between matter over-densities, the curvature perturbation, and the Newtonian potential. Any large-distance modification of gravity or exotic nature of dark energy modifies these relationships as compared to those predicted in the standard smooth dark energy model based on General Relativity. In the linear theory of structure growth, such modifications are often parameterised by two functions of space and time that enter the relation of the curvature perturbation to, first, the matter over-density and, second, the Newtonian potential. We investigate the predictions for these functions in Brans-Dicke theory, clustering dark energy models, and interacting dark energy models. We find that each theory has a distinct path in the parameter space of modified growth. Understanding these theoretical priors on the parameterisations of modified growth is essential to reveal the nature of cosmic acceleration with the help of upcoming observations of structure formation.

  9. Theoretical Model for Nanoporous Carbon Supercapacitors

    SciTech Connect (OSTI)

    Sumpter, Bobby G; Meunier, Vincent; Huang, Jingsong

    2008-01-01

The unprecedented anomalous increase in capacitance of nanoporous carbon supercapacitors at pore sizes smaller than 1 nm [Science 2006, 313, 1760] challenges the long-held presumption that pores smaller than the size of solvated electrolyte ions do not contribute to energy storage. We propose a heuristic model to replace the commonly used model of an electric double-layer capacitor (EDLC): an electric double-cylinder capacitor (EDCC) for mesopores (2-50 nm pore size), which becomes an electric wire-in-cylinder capacitor (EWCC) for micropores (<2 nm pore size). Our analysis of the available experimental data in the micropore regime is confirmed by first-principles density functional theory calculations and reveals significant curvature effects on carbon capacitance. The EDCC (and/or EWCC) model allows the supercapacitor properties to be correlated with pore size, specific surface area, Debye length, electrolyte concentration and dielectric constant, and solute ion size. The new model not only explains the experimental data, but also offers a practical direction for optimizing the properties of carbon supercapacitors through experiments.
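The geometric content of the model can be stated compactly. The area-normalized capacitances below are a sketch of the three geometries; the exact symbols (pore radius $b$, double-layer thickness $d$, effective ion radius $a_0$) follow the authors' related publications and should be treated as an assumption here:

```latex
% Planar EDLC (the commonly used model):
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{d}

% EDCC for mesopores: counter-ions form a cylinder of radius b-d
% inside a pore of radius b:
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln\!\bigl[b/(b-d)\bigr]}

% EWCC for micropores: ions of effective radius a_0 line up along the axis:
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln(b/a_0)}
```

In the large-pore limit $b \to \infty$, $b\ln[b/(b-d)] \to d$, so the EDCC expression reduces to the planar EDLC; the deviation at small $b$ is the curvature effect the abstract describes.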

  10. Parallelism of the SANDstorm hash algorithm.

    SciTech Connect (OSTI)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree

    2009-09-01

Mainstream cryptographic hashing algorithms are not parallelizable. This limits their speed, prevents them from taking advantage of the current trend toward multi-core platforms, and in turn limits their usefulness as an authentication mechanism in secure communications. Sandia researchers have created a new cryptographic hashing algorithm, SANDstorm, which was specifically designed to take advantage of multi-core processing and to be parallelizable on a wide range of platforms. This report describes a late-start LDRD effort to verify the parallelizability claims of the SANDstorm designers. We have shown, with operating code and bench testing, that the SANDstorm algorithm may be trivially parallelized on a wide range of hardware platforms. Implementations using OpenMP demonstrate a linear speedup with multiple cores. We have also shown significant performance gains from optimized C code and the use of assembly instructions to exploit particular platform capabilities.
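SANDstorm's actual mode of operation is not reproduced here, but the general way a hash is made parallelizable can be sketched with a Merkle-style tree: leaf chunks are hashed independently (the parallel step), then the digests are combined serially. The chunking and combining rules below are illustrative assumptions.

```python
# Generic sketch of tree hashing: the leaf hashes are independent, so they
# can be farmed out to a worker pool; only the short root combine is serial.
import hashlib
from concurrent.futures import ThreadPoolExecutor

def leaf_hash(chunk: bytes) -> bytes:
    return hashlib.sha256(b"leaf" + chunk).digest()

def tree_hash(data: bytes, chunk_size: int = 1 << 16, workers: int = 4) -> bytes:
    chunks = [data[i:i + chunk_size] for i in range(0, max(len(data), 1), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        digests = list(pool.map(leaf_hash, chunks))   # parallelizable step
    return hashlib.sha256(b"root" + b"".join(digests)).digest()  # serial combine

msg = b"x" * 300_000
parallel = tree_hash(msg, workers=4)
serial = tree_hash(msg, workers=1)  # same digest regardless of worker count
```

Domain separation (the `b"leaf"`/`b"root"` prefixes) keeps leaf and root inputs distinct, a standard precaution in tree-hash constructions.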

  11. Nonlinear Global Optimization Using Curdling Algorithm

    Energy Science and Technology Software Center (OSTI)

    1996-03-01

An algorithm for curdling optimization, a derivative-free, grid-refinement approach to nonlinear optimization, was developed and implemented in software. This approach overcomes a number of deficiencies in existing approaches. Most notably, it finds extremal regions rather than only single extremal points. The program is interactive and collects information on control parameters and constraints using menus. For up to four dimensions, function convergence is displayed graphically. Because the algorithm does not compute derivatives, gradients, or vectors, it is numerically stable. It can find all the roots of a polynomial in one pass. It is an inherently parallel algorithm. Constraints are handled as being initially fuzzy, but they become tighter with each iteration.
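The grid-refinement idea can be sketched in one dimension. This is an illustrative reconstruction, not the ESTSC code: the function name, the keep-within-tolerance rule, and the parameters are assumptions. The key point it demonstrates is that whole extremal regions survive refinement, not just a single best point.

```python
# Hedged sketch of derivative-free grid refinement: evaluate the objective on
# a coarse grid, keep every cell whose value is within a tolerance of the
# current best (so extremal *regions* survive), and refine only kept cells.

def refine_minima(f, lo, hi, levels=3, cells=8, keep_tol=0.05):
    intervals = [(lo, hi)]
    for _ in range(levels):
        sub = []
        for a, b in intervals:
            w = (b - a) / cells
            sub.extend((a + i * w, a + (i + 1) * w) for i in range(cells))
        scored = [(f((a + b) / 2), a, b) for a, b in sub]
        best = min(v for v, _, _ in scored)
        intervals = [(a, b) for v, a, b in scored if v <= best + keep_tol]
    return intervals

# This objective is flat (and minimal) on the whole region [1, 2]; the
# surviving intervals cover that region rather than collapsing to one point.
f = lambda x: max(abs(x - 1.5) - 0.5, 0.0) ** 2
regions = refine_minima(f, 0.0, 4.0)
```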

  12. Bootstrap performance profiles in stochastic algorithms assessment

    SciTech Connect (OSTI)

Costa, Lino; Espírito Santo, Isabel A.C.P.; Oliveira, Pedro

    2015-03-10

    Optimization with stochastic algorithms has become a relevant research field. Due to its stochastic nature, its assessment is not straightforward and involves integrating accuracy and precision. Performance profiles for the mean do not show the trade-off between accuracy and precision, and parametric stochastic profiles require strong distributional assumptions and are limited to the mean performance for a large number of runs. In this work, bootstrap performance profiles are used to compare stochastic algorithms for different statistics. This technique allows the estimation of the sampling distribution of almost any statistic even with small samples. Multiple comparison profiles are presented for more than two algorithms. The advantages and drawbacks of each assessment methodology are discussed.
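The bootstrap step itself is simple to sketch. The code below is a generic illustration, not the authors' implementation: it resamples an algorithm's run results with replacement to estimate the sampling distribution of a statistic (here the median) without distributional assumptions; the run values are hypothetical.

```python
# Hedged sketch of the bootstrap: resample run results with replacement and
# recompute the statistic to approximate its sampling distribution.
import random
import statistics

def bootstrap(stat, sample, reps=2000, rng=None):
    rng = rng or random.Random(0)   # fixed seed for a reproducible sketch
    n = len(sample)
    return sorted(stat([sample[rng.randrange(n)] for _ in range(n)])
                  for _ in range(reps))

# Final objective values from 12 runs of a (hypothetical) stochastic solver.
runs = [3.1, 2.9, 3.4, 3.0, 2.8, 3.2, 3.3, 2.7, 3.0, 3.1, 2.9, 3.5]
dist = bootstrap(statistics.median, runs)
lo, hi = dist[int(0.025 * len(dist))], dist[int(0.975 * len(dist))]  # ~95% interval
```

Because `stat` is an argument, the same machinery yields profiles for the mean, quartiles, or any other statistic, which is the flexibility the abstract emphasizes.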

  13. Berkeley Algorithms Help Researchers Understand Dark Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Berkeley Algorithms Help Researchers Understand Dark Energy. November 24, 2014. Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov. Scientists believe that dark energy, the mysterious force that is accelerating cosmic expansion, makes up about 70 percent of the mass and energy of the universe. But because they don't know what it is, they cannot observe it directly. To unlock the mystery of dark energy and its influence on the universe,

  14. Graph algorithms in the titan toolkit.

    SciTech Connect (OSTI)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  15. Theoretical Description of the Fission Process

    SciTech Connect (OSTI)

    Witold Nazarewicz

    2009-10-25

Advanced theoretical methods and high-performance computers may finally unlock the secrets of nuclear fission, a fundamental nuclear decay that is of great relevance to society. In this work, we studied the phenomenon of spontaneous fission using the symmetry-unrestricted nuclear density functional theory (DFT). Our results show that many observed properties of fissioning nuclei can be explained in terms of pathways in multidimensional collective space corresponding to different geometries of fission products. From the calculated collective potential and collective mass, we estimated spontaneous fission half-lives, and good agreement with experimental data was found. We also predicted a new phenomenon of trimodal spontaneous fission for some transfermium isotopes. Our calculations demonstrate that fission barriers of excited superheavy nuclei vary rapidly with particle number, pointing to the importance of shell effects even at large excitation energies. The results are consistent with recent experiments in which superheavy elements were created by bombarding an actinide target with calcium-48; even at high excitation energies, sizable fission barriers remained. Not only does this reveal clues about the conditions for creating new elements, it also provides a wider context for understanding other types of fission. Understanding of the fission process is crucial for many areas of science and technology. Fission governs the existence of many transuranium elements, including the predicted long-lived superheavy species. In nuclear astrophysics, fission influences the formation of heavy elements in the final stages of the r-process in a very-high-neutron-density environment. Fission applications are numerous. Improved understanding of the fission process will enable scientists to enhance the safety and reliability of the nation's nuclear stockpile and nuclear reactors.
The deployment of a fleet of safe and efficient advanced reactors, which will also minimize radiotoxic waste and be proliferation-resistant, is a goal for the advanced nuclear fuel cycles program. While in the past the design, construction, and operation of reactors were supported through empirical trials, this new phase in nuclear energy production is expected to heavily rely on advanced modeling and simulation capabilities.

  16. Theoretical Studies of Hydrogen Storage Alloys.

    SciTech Connect (OSTI)

    Jonsson, Hannes

    2012-03-22

    Theoretical calculations were carried out to search for lightweight alloys that can be used to reversibly store hydrogen in mobile applications, such as automobiles. Our primary focus was on magnesium based alloys. While MgH{sub 2} is in many respects a promising hydrogen storage material, there are two serious problems which need to be solved in order to make it useful: (i) the binding energy of the hydrogen atoms in the hydride is too large, causing the release temperature to be too high, and (ii) the diffusion of hydrogen through the hydride is so slow that loading of hydrogen into the metal takes much too long. In the first year of the project, we found that the addition of ca. 15% of aluminum decreases the binding energy to the hydrogen to the target value of 0.25 eV which corresponds to release of 1 bar hydrogen gas at 100 degrees C. Also, the addition of ca. 15% of transition metal atoms, such as Ti or V, reduces the formation energy of interstitial H-atoms making the diffusion of H-atoms through the hydride more than ten orders of magnitude faster at room temperature. In the second year of the project, several calculations of alloys of magnesium with various other transition metals were carried out and systematic trends in stability, hydrogen binding energy and diffusivity established. Some calculations of ternary alloys and their hydrides were also carried out, for example of Mg{sub 6}AlTiH{sub 16}. It was found that the binding energy reduction due to the addition of aluminum and increased diffusivity due to the addition of a transition metal are both effective at the same time. This material would in principle work well for hydrogen storage but it is, unfortunately, unstable with respect to phase separation. A search was made for a ternary alloy of this type where both the alloy and the corresponding hydride are stable. Promising results were obtained by including Zn in the alloy.

  17. CRAD, Safety Basis- Idaho MF-628 Drum Treatment Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a May 2007 readiness assessment of the Safety Basis at the Advanced Mixed Waste Treatment Project.

  18. CRAD, Safety Basis- Idaho Accelerated Retrieval Project Phase II

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2006 Commencement of Operations assessment of the Safety Basis at the Idaho Accelerated Retrieval Project Phase II.

  19. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  20. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provided the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  1. Canister storage building design basis accident analysis documentation

    SciTech Connect (OSTI)

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  2. Solar Power Tower Design Basis Document, Revision 0

    SciTech Connect (OSTI)

    ZAVOICO,ALEXIS B.

    2001-07-01

This report contains the design basis for a generic molten-salt solar power tower. A solar power tower uses a field of tracking mirrors (heliostats) that redirect sunlight onto a centrally located receiver mounted on top of a tower, which absorbs the concentrated sunlight. Molten nitrate salt, pumped from a tank at ground level, absorbs the sunlight, heating it to 565 °C. The heated salt flows back to ground level into another tank where it is stored, then is pumped through a steam generator to produce steam and make electricity. This report establishes a set of criteria upon which the next generation of solar power towers will be designed. The report contains detailed criteria for each of the major systems: Collector System, Receiver System, Thermal Storage System, Steam Generator System, Master Control System, and Electric Heat Tracing System. The Electric Power Generation System and Balance of Plant discussions are limited to interface requirements. This design basis builds on the extensive experience gained from the Solar Two project and includes potential design innovations that will improve reliability and lower technical risk. This design basis document is a living document and contains several areas that require trade studies and design analysis to fully complete the design basis. Project- and site-specific conditions and requirements will also resolve open To Be Determined issues.

  3. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect (OSTI)

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support ''HNF-3553, Spent Nuclear Fuel Project Final Safety, Analysis Report, Annex A,'' ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  4. ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Conditions, March 2000 | Department of Energy. theoretical_minimum_energies.pdf. More Documents & Publications: Ironmaking Process Alternatives Screening Study; ITP Steel: Steel Industry Marginal Opportunity Study, September 2005; ITP Steel: Steel Industry Energy Bandwidth Study, October 2004

  5. Theoretical Predictions of the thermodynamic Properties of Solid Sorbents

    Office of Scientific and Technical Information (OSTI)

Capture CO2 Applications (Conference) | SciTech Connect. We are establishing a theoretical procedure to identify the most promising candidate CO{sub 2} solid sorbents from a large solid-material databank to meet the DOE programmatic goal for

  6. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling Talou, Patrick Los Alamos National Laboratory; Nazarewicz, Witold University of Tennessee, Knoxville,...

  7. Theoretical study of Ag- and Au-filled skutterudites.

    Broader source: Energy.gov [DOE]

    Uses ab initio atomistic DFT modeling as implemented in VASP to determine theoretical values of thermoelectric properties for Ag-filled skutterudites.

  8. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling Citation Details In-Document Search Title: Improvements of Nuclear Data and Its Uncertainties by ...

  9. Improvements to Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling Citation Details In-Document Search Title: Improvements to Nuclear Data and Its Uncertainties by ...

  10. PDES. FIPS Standard Data Encryption Algorithm

    SciTech Connect (OSTI)

    Nessett, D.N.

    1992-03-03

PDES performs the National Bureau of Standards FIPS Pub. 46 data encryption/decryption algorithm used for the cryptographic protection of computer data. The DES algorithm is designed to encipher and decipher blocks of data consisting of 64 bits under control of a 64-bit key. The key is generated in such a way that each of the 56 bits used directly by the algorithm is random and the remaining 8 error-detecting bits are set to make the parity of each 8-bit byte of the key odd, i.e., there is an odd number of 1 bits in each 8-bit byte. Each member of a group of authorized users of encrypted computer data must have the key that was used to encipher the data in order to use it. Data can be recovered from cipher only by using exactly the same key used to encipher it, but with the schedule of addressing the key bits altered so that the deciphering process is the reverse of the enciphering process. A block of data to be enciphered is subjected to an initial permutation, then to a complex key-dependent computation, and finally to a permutation that is the inverse of the initial permutation. Two PDES routines are included; both perform the same calculation. One, identified as FDES.MAR, is designed for speed of execution, while the other, identified as PDES.MAR, presents a clearer view of how the algorithm is executed.
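The key-parity rule described above is concrete enough to sketch. This is an illustration of the FIPS 46 convention (the parity bit is the least significant bit of each key byte), not the PDES source; the helper names are assumptions.

```python
# Hedged sketch of the DES key-parity rule: set the low bit of each byte so
# that every 8-bit byte of the 64-bit key has odd parity.

def set_odd_parity(key: bytes) -> bytes:
    out = bytearray()
    for b in key:
        b &= 0xFE                               # clear the parity (low) bit
        ones = bin(b).count("1")
        out.append(b | (0 if ones % 2 else 1))  # force the byte's parity odd
    return bytes(out)

def has_odd_parity(key: bytes) -> bool:
    return all(bin(b).count("1") % 2 == 1 for b in key)

key = set_odd_parity(bytes(range(8)))  # any 8 bytes of key material
```

Odd parity per byte is what makes single-bit corruption of a stored key detectable, which is the "error-detecting" role the abstract mentions.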

  11. Gamma-ray spectral analysis algorithm library

    Energy Science and Technology Software Center (OSTI)

    2013-05-06

    The routines of the Gauss Algorithms library are used to implement special purpose products that need to analyze gamma-ray spectra from Ge semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.
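The Gauss Algorithms API is not reproduced here, but the peak-search step it provides can be sketched generically: flag channels that are local maxima and stand a given number of estimated noise sigmas above the local background. The window, threshold, and synthetic spectrum below are assumptions for illustration.

```python
# Hedged sketch of gamma-ray peak search: a channel is reported when it is
# the maximum of its window and exceeds the straight-line background by
# nsigma times the estimated Poisson noise.

def find_peaks(counts, window=3, nsigma=4.0):
    peaks = []
    for i in range(window, len(counts) - window):
        region = counts[i - window:i + window + 1]
        if counts[i] != max(region):
            continue                                  # not a local maximum
        background = (counts[i - window] + counts[i + window]) / 2.0
        noise = max(background, 1.0) ** 0.5           # Poisson noise estimate
        if counts[i] - background > nsigma * noise:
            peaks.append(i)
    return peaks

# Synthetic spectrum: flat ~100-count background with a peak at channel 20.
spectrum = [100] * 40
for ch, extra in ((19, 60), (20, 150), (21, 55)):
    spectrum[ch] += extra
```

A production library would follow this with region identification and a Gaussian fit to refine each peak's centroid and width, as the abstract lists.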

  12. Gamma-ray Spectral Analysis Algorithm Library

    Energy Science and Technology Software Center (OSTI)

    1997-09-25

The routines of the Gauss Algorithms library are used to implement special purpose products that need to analyze gamma-ray spectra from Ge semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.

  13. Control algorithms for autonomous robot navigation

    SciTech Connect (OSTI)

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  14. Theoretical minimum energies to produce steel for selected conditions

    SciTech Connect (OSTI)

    Fruehan, R. J.; Fortini, O.; Paxton, H. W.; Brindle, R.

    2000-03-01

An ITP study has determined the theoretical minimum energy requirements for producing steel from ore, scrap, and direct reduced iron. Dr. Richard Fruehan's report, Theoretical Minimum Energies to Produce Steel for Selected Conditions, provides insight into the potential energy savings (and associated reductions in carbon dioxide emissions) for ironmaking, steelmaking, and rolling processes (PDF, 459 KB).

  15. Basis for NGNP Reactor Design Down-Selection

    SciTech Connect (OSTI)

    L.E. Demick

    2011-11-01

    The purpose of this paper is to identify the extent of technology development, design and licensing maturity anticipated to be required to credibly identify differences that could make a technical choice practical between the prismatic and pebble bed reactor designs. This paper does not address a business decision based on the economics, business model and resulting business case since these will vary based on the reactor application. The selection of the type of reactor, the module ratings, the number of modules, the configuration of the balance of plant and other design selections will be made on the basis of optimizing the Business Case for the application. These are not decisions that can be made on a generic basis.

  16. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    SciTech Connect (OSTI)

    Craig G. Rieger

    2014-08-01

“Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability in mitigating these risks. Terminology and a metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system-integrity and business-case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system-integrity metrics can be applied to establish performance, and

  17. Design-Load Basis for LANL Structures, Systems, and Components

    SciTech Connect (OSTI)

    I. Cuesta

    2004-09-01

This document supports the recommendations in the Los Alamos National Laboratory (LANL) Engineering Standard Manual (ESM), Chapter 5 (Structural), providing the basis for the loads, analysis procedures, and codes to be used in the ESM. It also provides the justification for eliminating certain loads from consideration in design, and evidence that the design basis loads are appropriate and consistent with the graded approach required by the Department of Energy (DOE) Code of Federal Regulations, Title 10, Part 830, Nuclear Safety Management. This document focuses on (1) the primary and secondary natural phenomena hazards listed in DOE-G-420.1-2, Appendix C, (2) additional loads not related to natural phenomena hazards, and (3) the design loads on structures during construction.

  18. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect (OSTI)

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. 
Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from weaning the U.S. from energy imports (e.g., measures of energy self-sufficiency), and minimization of future high level waste (HLW) repositories world-wide.

  19. Online Monitoring Technical Basis and Analysis Framework for Emergency

    Energy Savers [EERE]

Diesel Generators: Interim Report for FY 2013 | Department of Energy. The Light Water Reactor Sustainability Program is a research, development, and deployment program sponsored by the U.S. Department of Energy Office of Nuclear Energy. The program is operated in collaboration with the Electric Power Research Institute's

  20. Online Monitoring Technical Basis and Analysis Framework for Large Power

    Energy Savers [EERE]

Transformers: Interim Report for FY 2012 | Department of Energy. The Light Water Reactor Sustainability Program is a research, development, and deployment program sponsored by the U.S. Department of Energy Office of Nuclear Energy. The program is operated in collaboration with the Electric Power Research Institute's (EPRI's)

  1. Interim Safety Basis for Fuel Supply Shutdown Facility

    SciTech Connect (OSTI)

    BENECKE, M.W.

    2000-09-07

This ISB, in conjunction with the IOSR, provides the required basis for interim operation, or restrictions on interim operations, and administrative controls for the facility until a SAR is prepared in accordance with the new requirements or the facility is shut down. It is concluded that the risks associated with the current and anticipated modes of the facility (uranium disposition, cleanup, and transition activities required for permanent closure) are within risk guidelines.

  2. Semi-Implicit Reversible Algorithms for Rigid Body Rotational Dynamics

    SciTech Connect (OSTI)

    Nukala, Phani K; Shelton Jr, William Allison

    2006-09-01

    This paper presents two semi-implicit algorithms based on splitting methodology for rigid body rotational dynamics. The first algorithm is a variation of partitioned Runge-Kutta (PRK) methodology that can be formulated as a splitting method. The second algorithm is akin to a multiple time stepping scheme and is based on modified Crouch-Grossman (MCG) methodology, which can also be expressed as a splitting algorithm. These algorithms are second-order accurate and time-reversible; however, they are not Poisson integrators, i.e., they are non-symplectic. These algorithms conserve some of the first integrals of motion; the others are not conserved exactly, but their fluctuations remain bounded over exponentially long time intervals. These algorithms exhibit excellent long-term behavior because of their reversibility and their (approximate) Poisson-structure-preserving property. The numerical results indicate that the proposed algorithms exhibit superior performance compared to some of the currently well-known algorithms, such as the Simo-Wong, Newmark, discrete Moser-Veselov, Lewis-Simo, and LIEMID[EA] algorithms.
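    The PRK and MCG schemes themselves are not reproduced in this record, but the splitting idea they share can be sketched. The following is a minimal, illustrative second-order Strang splitting for the free rigid body (Euler equations): each partial Hamiltonian L_i^2/(2 I_i) generates an exactly solvable rotation of the body angular momentum, and composing them symmetrically gives a time-reversible step. The moments of inertia and initial momentum below are arbitrary illustrative values.

```python
import math

def axis_flow(L, i, dt, I):
    """Exact flow of the partial Hamiltonian H_i = L_i^2 / (2 I_i):
    rotates the other two components of the body angular momentum L
    about axis i at the (constant) rate L_i / I_i."""
    j, k = (i + 1) % 3, (i + 2) % 3
    th = (L[i] / I[i]) * dt
    c, s = math.cos(th), math.sin(th)
    L[j], L[k] = c * L[j] + s * L[k], -s * L[j] + c * L[k]

def strang_step(L, dt, I):
    """One time-reversible step: half-flows about axes x and y,
    a full flow about z, then the half-flows in reverse order."""
    for i in (0, 1):
        axis_flow(L, i, dt / 2, I)
    axis_flow(L, 2, dt, I)
    for i in (1, 0):
        axis_flow(L, i, dt / 2, I)

I = [1.0, 2.0, 3.0]    # principal moments of inertia (illustrative)
L = [1.0, 0.5, -0.2]   # body-frame angular momentum
norm0 = math.sqrt(sum(x * x for x in L))
for _ in range(1000):
    strang_step(L, 0.01, I)
norm1 = math.sqrt(sum(x * x for x in L))
# each sub-flow is an exact rotation, so |L| is conserved to round-off
```

    Because every sub-flow is an exact rotation, the Casimir |L| is preserved to machine precision, while the energy oscillates within a bounded band, which is the qualitative behavior the abstract describes.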

  3. A garbage collection algorithm for shared memory parallel processors

    SciTech Connect (OSTI)

    Crammond, J.

    1988-12-01

    This paper describes a technique for adapting the Morris sliding garbage collection algorithm to execute on parallel machines with shared memory. The algorithm is described within the framework of an implementation of the parallel logic language Parlog. However, the algorithm is a general one and can easily be adapted to parallel Prolog systems and to other languages. The performance of the algorithm executing a few simple Parlog benchmarks is analyzed. Finally, it is shown how the technique for parallelizing the sequential algorithm can be adapted for a semi-space copying algorithm.
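    The parallel adaptation is not detailed in this record, but the sequential sliding (order-preserving) compaction it builds on can be sketched. The heap model below (cells holding lists of reference indices plus a mark array) is a hypothetical simplification for illustration, not the Parlog implementation.

```python
# Heap modeled as a list of cells; each cell holds a list of references
# (indices of other cells).  A sliding collector preserves allocation
# order: live cells keep their relative order and slide toward the
# bottom of the heap via a forwarding-address table.
def compact(heap, marks):
    # pass 1: compute forwarding addresses for live cells, in heap order
    forward, new_addr = {}, 0
    for addr in range(len(heap)):
        if marks[addr]:
            forward[addr] = new_addr
            new_addr += 1
    # pass 2: rewrite references through the forwarding table
    for addr, refs in enumerate(heap):
        if marks[addr]:
            heap[addr] = [forward[r] for r in refs]
    # pass 3: slide live cells down into their new slots
    new_heap = [heap[a] for a in sorted(forward)]
    return new_heap, forward

# cells 0 and 3 are live; 3 points to 0 and 0 points to 3
heap = [[3], [0], [], [0]]
marks = [True, False, False, True]
new_heap, forward = compact(heap, marks)
```

    The parallel version described in the abstract must additionally synchronize the address-computation and sliding passes across processors; that machinery is omitted here.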

  4. Fast computation algorithms for speckle pattern simulation

    SciTech Connect (OSTI)

    Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru

    2013-11-13

    We present our development of a series of efficient computation algorithms, generally usable to calculate light diffraction and particularly for speckle pattern simulation. We use mainly the scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They are able to evaluate the diffraction formula much faster than direct computation, and we have circumvented the restrictions on the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be shifted off-axis.
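    The core idea the abstract leans on is the convolution theorem: a (circular) convolution of the field with the diffraction kernel can be evaluated as a pointwise product in Fourier space. As a minimal illustration, the sketch below verifies the theorem with a naive O(N²) DFT standing in for the FFT; the actual Rayleigh-Sommerfeld and Fresnel kernels are not included.

```python
import cmath

def dft(x, sign=-1):
    """Naive discrete Fourier transform (an FFT would be used in practice)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def conv_via_dft(f, h):
    """Circular convolution through the convolution theorem:
    conv(f, h) = IDFT(DFT(f) * DFT(h))."""
    n = len(f)
    prod = [a * b for a, b in zip(dft(f), dft(h))]
    return [v / n for v in dft(prod, sign=+1)]  # inverse transform

def conv_direct(f, h):
    """Direct O(N^2) circular convolution, for comparison."""
    n = len(f)
    return [sum(f[k] * h[(j - k) % n] for k in range(n)) for j in range(n)]

f = [1.0, 2.0, 0.0, -1.0]   # sample "field" (illustrative values)
h = [0.5, 0.0, 1.0, 0.0]    # sample "kernel"
err = max(abs(x - y) for x, y in zip(conv_via_dft(f, h), conv_direct(f, h)))
```

    With an FFT in place of the naive DFT, the Fourier route costs O(N log N) versus O(N²) for direct evaluation, which is the speedup the abstract refers to.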

  5. 5th International REAC/TS Symposium: The Medical Basis for Radiation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The Medical Basis for Radiation Accident Preparedness Skip site navigation and ... The Medical Basis for Radiation Accident Preparedness Sept. 27-29, 2011 | Miami, ...

  6. CRAD, Safety Basis Upgrade Review (DOE-STD-3009-2014) - May 15...

    Office of Environmental Management (EM)

    1) provides objectives, criteria, and approaches for establishing and maintaining the safety basis at nuclear facilities. CRAD, Safety Basis Upgrade Review (DOE-STD-3009-2014)...

  7. Automated Algorithm for MFRSR Data Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Automated Algorithm for MFRSR Data Analysis M. D. Alexandrov and B. Cairns Columbia University and National Aeronautics and Space Administration Goddard Institute for Space Studies New York, New York A. A. Lacis and B. E. Carlson National Aeronautics and Space Administration Goddard Institute for Space Studies New York, New York A. Marshak National Aeronautics and Space Administration Goddard Space Flight Center Greenbelt, Maryland We present a substantial upgrade of our previously developed

  8. RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT

    SciTech Connect (OSTI)

    KOZLOWSKI, S.D.

    2007-05-30

    This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA) and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.

  9. Guidance For Preparation of Basis For Interim Operation (BIO) Documents

    Energy Savers [EERE]

    3011-2002 December 2002 Superseding DOE-STD-3011-94 November 1994 DOE STANDARD GUIDANCE FOR PREPARATION OF BASIS FOR INTERIM OPERATION (BIO) DOCUMENTS U.S. Department of Energy AREA SAFT Washington, D.C. 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. NOT MEASUREMENT SENSITIVE DOE-STD-3011-2002 ii This document has been reproduced directly from the best available copy. Available to DOE and DOE contractors from ES&H Technical Information Services, U.S.

  10. The Bender-Dunne basis operators as Hilbert space operators

    SciTech Connect (OSTI)

    Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph

    2014-02-15

    The Bender-Dunne basis operators, T_{-m,n} = 2^{-n} Σ_{k=0}^{n} C(n,k) q^k p^{-m} q^{n-k}, where q and p are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line R for positive integers n and m. We show, by explicit construction of a dense domain, that the operators T_{-m,n} are densely defined operators in the Hilbert space L^2(R).

  11. Scaling Up Coordinate Descent Algorithms for Large ℓ1 Regularization Problems

    SciTech Connect (OSTI)

    Scherrer, Chad; Halappanavar, Mahantesh; Tewari, Ambuj; Haglin, David J.

    2012-07-03

    We present a generic framework for parallel coordinate descent (CD) algorithms that has as special cases the original sequential algorithms of Cyclic CD and Stochastic CD, as well as the recent parallel Shotgun algorithm of Bradley et al. We introduce two novel parallel algorithms that are also special cases---Thread-Greedy CD and Coloring-Based CD---and give performance measurements for an OpenMP implementation of these.
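    The parallel Thread-Greedy and Coloring-Based variants are not reproduced here, but the sequential Cyclic CD baseline they generalize can be sketched for the lasso problem min_w 0.5·||y − Xw||² + λ·||w||₁: each coordinate update is a one-dimensional soft-thresholding step. The data below are illustrative.

```python
def soft(z, g):
    """Soft-thresholding operator: the exact coordinate-wise lasso solution."""
    return (z - g) if z > g else (z + g) if z < -g else 0.0

def lasso_cd(X, y, lam, sweeps=100):
    """Cyclic coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # residual with coordinate j's contribution removed
            r = [y[i] - sum(X[i][k] * w[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft(rho, lam) / z if z > 0 else 0.0
    return w

# with an orthonormal design the solution is exact soft-thresholding of y
X = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
w = lasso_cd(X, [3.0, 0.5, -2.0], lam=1.0)
```

    The parallel variants in the paper update many coordinates concurrently (with greedy or graph-coloring rules to limit conflicting updates); the one-dimensional subproblem each thread solves is the same soft-thresholding step shown above.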

  12. Neutron-Antineutron Oscillations: Theoretical Status and Experimental Prospects

    SciTech Connect (OSTI)

    Phillips, D. G.; Snow, W. M.; Babu, K.; Banerjee, S.; Baxter, D. V.; Berezhiani, Z.; Bergevin, M.; Bhattacharya, S.; Brooijmans, G.; Castellanos, L.; et al.,

    2014-10-04

    This paper summarizes the relevant theoretical developments, outlines some ideas to improve experimental searches for free neutron-antineutron oscillations, and suggests avenues for future improvement in the experimental sensitivity.

  13. EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY...

    Office of Scientific and Technical Information (OSTI)

    EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY UNDER RESERVOIR CONDITIONS FINAL PROGRESS REPORT PERIOD: OCT 1999-MAY 2003 CONTRACT NUMBER: DE-FG26-99FT40615 ...

  14. Final Report. Research in Theoretical High Energy Physics

    SciTech Connect (OSTI)

    Greensite, Jeffrey P.; Golterman, Maarten F.L.

    2015-04-30

    Grant-supported research in theoretical high-energy physics conducted in the period 1992-2015 is briefly described, and a full listing of published articles resulting from those research activities is supplied.

  15. Theoretical/best practice energy use in metalcasting operations

    SciTech Connect (OSTI)

    Schifo, J. F.; Radia, J. T.

    2004-05-01

    This study determined the theoretical minimum energy requirements of melting processes for all ferrous and nonferrous engineering alloys. The report also details best-practice energy consumption for the industry.

  16. Neutron-antineutron oscillations: Theoretical status and experimental prospects

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Phillips, D. G.; Snow, W. M.; Babu, K.; Banerjee, S.; Baxter, D. V.; Berezhiani, Z.; Bergevin, M.; Bhattacharya, S.; Brooijmans, G.; Castellanos, L.; et al

    2016-02-01

    This paper summarizes the relevant theoretical developments, outlines some ideas to improve experimental searches for free neutron-antineutron oscillations, and suggests avenues for future improvement in the experimental sensitivity.

  17. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect (OSTI)

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  18. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2005-02-25

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL’s Hanford External Dosimetry Program which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee, which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL’s Electronic Records & Information Capture Architecture (ERICA) database.

  19. Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling

    Office of Scientific and Technical Information (OSTI)

    (Technical Report) | SciTech Connect Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling Citation Details In-Document Search Title: Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling Authors: Talou, Patrick [1]; Nazarewicz, Witold [2]; Prinja, Anil [3]; Danon, Yaron [4]. Affiliations: Los Alamos National Laboratory; University of Tennessee, Knoxville, TN 37996, USA; University of New Mexico, USA; Rensselaer Polytechnic Institute, USA

  20. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    Office of Scientific and Technical Information (OSTI)

    (Technical Report) | SciTech Connect Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling Citation Details In-Document Search Title: Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop

  1. Theoretical Study on Catalysis by Protein Enzymes and Ribozyme

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical Study on Catalysis by Protein Enzymes and Ribozyme Theoretical Study on Catalysis by Protein Enzymes and Ribozyme 2000 NERSC Annual Report 17shkarplus.jpg The energetics were determined for three mechanisms proposed for TIM catalyzed reactions. Results from reaction path calculations suggest that the two mechanisms that involve an enediol intermediate are likely to occur, while the direct intra-substrate proton transfer mechanism (in green) is energetically unfavorable due to the

  2. Theoretical atomic physics code development I: CATS: Cowan Atomic Structure

    Office of Scientific and Technical Information (OSTI)

    Code (Technical Report) | SciTech Connect Technical Report: Theoretical atomic physics code development I: CATS: Cowan Atomic Structure Code Citation Details In-Document Search Title: Theoretical atomic physics code development I: CATS: Cowan Atomic Structure Code

  3. Theoretical energy release of thermites, intermetallics, and combustible

    Office of Scientific and Technical Information (OSTI)

    metals (Technical Report) | SciTech Connect Theoretical energy release of thermites, intermetallics, and combustible metals Citation Details In-Document Search Title: Theoretical energy release of thermites, intermetallics, and combustible metals

  4. Research in theoretical nuclear and neutrino physics. Final report

    Office of Scientific and Technical Information (OSTI)

    (Technical Report) | SciTech Connect Technical Report: Research in theoretical nuclear and neutrino physics. Final report Citation Details In-Document Search Title: Research in theoretical nuclear and neutrino physics. Final report The main focus of the research supported by the nuclear theory grant DE-FG02-04ER41319 was on studying parton dynamics in high-energy heavy ion collisions, perturbative approach to charm production and its contribution to atmospheric neutrinos, application of

  5. The Geometry Of Disorder: Theoretical Investigations Of Quasicrystals And

    Office of Scientific and Technical Information (OSTI)

    Frustrated Magnets: Quasi-Crystals And Quasi-Equivalence: Symmetries And Energies In Alloys And Biological Materials (Technical Report) | SciTech Connect The Geometry Of Disorder: Theoretical Investigations Of Quasicrystals And Frustrated Magnets: Quasi-Crystals And Quasi-Equivalence: Symmetries And Energies In Alloys And Biological Materials Citation Details In-Document Search Title: The Geometry Of Disorder: Theoretical Investigations Of Quasicrystals And Frustrated Magnets: Quasi-Crystals

  6. Experimental and theoretical investigations of non-centrosymmetric

    Office of Scientific and Technical Information (OSTI)

    8-hydroxyquinolinium dibenzoyl-(L)-tartrate methanol monohydrate single crystal (Journal Article) | SciTech Connect Experimental and theoretical investigations of non-centrosymmetric 8-hydroxyquinolinium dibenzoyl-(L)-tartrate methanol monohydrate single crystal Citation Details In-Document Search Title: Experimental and theoretical investigations of non-centrosymmetric 8-hydroxyquinolinium dibenzoyl-(L)-tartrate methanol monohydrate single crystal Graphical abstract: ORTEP diagram of HQDBT.

  7. Theoretical Synthesis of Mixed Materials for CO2 Capture Applications

    Office of Scientific and Technical Information (OSTI)

    (Conference) | SciTech Connect Theoretical Synthesis of Mixed Materials for CO2 Capture Applications Citation Details In-Document Search Title: Theoretical Synthesis of Mixed Materials for CO2 Capture Applications These pages provide an example of the layout and style required for the preparation of four-page papers for the TechConnect World 2015 technical proceedings. Documents must be submitted in electronic (Adobe PDF file) format. Please study the enclosed materials before beginning the

  8. Theoretical and experimental studies of electrified interfaces relevant to

    Office of Scientific and Technical Information (OSTI)

    energy storage. (Technical Report) | SciTech Connect Technical Report: Theoretical and experimental studies of electrified interfaces relevant to energy storage. Citation Details In-Document Search Title: Theoretical and experimental studies of electrified interfaces relevant to energy storage. Advances in technology for electrochemical energy storage require increased understanding of electrolyte/electrode interfaces, including the electric double layer structure, and processes involved in

  9. Theoretical calculating the thermodynamic properties of solid sorbents for

    Office of Scientific and Technical Information (OSTI)

    CO{sub 2} capture applications (Technical Report) | SciTech Connect Technical Report: Theoretical calculating the thermodynamic properties of solid sorbents for CO{sub 2} capture applications Citation Details In-Document Search Title: Theoretical calculating the thermodynamic properties of solid sorbents for CO{sub 2} capture applications Since current technologies for capturing CO{sub 2} to fight global climate change are still too energy intensive, there is a critical need for development

  10. Toward Catalyst Design from Theoretical Calculations (464th Brookhaven

    Office of Scientific and Technical Information (OSTI)

    Lecture) (Conference) | SciTech Connect Conference: Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) Citation Details In-Document Search Title: Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) Catalysts have been used to speed up chemical reactions as long as yeast has been used to make bread rise. Today, catalysts are used everywhere from home kitchens to industrial chemical factories. In the near future, new catalysts being

  11. Modeling and Algorithmic Approaches to Constitutively-Complex, Micro-structured Fluids

    SciTech Connect (OSTI)

    Forest, Mark Gregory [University of North Carolina at Chapel Hill]

    2014-05-06

    The team for this Project made significant progress on modeling and algorithmic approaches to hydrodynamics of fluids with complex microstructure. Our advances are broken down into modeling and algorithmic approaches. In experiments, a driven magnetic bead in a complex fluid accelerates out of the Stokes regime and settles into another apparent linear response regime. The modeling explains the take-off as a deformation of entanglements, and the long-time behavior is a nonlinear, far-from-equilibrium property. Furthermore, the model has predictive value, as we can tune microstructural properties relative to the magnetic force applied to the bead to exhibit all possible behaviors. Wave-theoretic probes of complex fluids have been extended in two significant directions, to small volumes and the nonlinear regime. Heterogeneous stress and strain features that lie beyond experimental capability were studied. It was shown that nonlinear penetration of boundary stress in confined viscoelastic fluids is not monotone, indicating the possibility of interlacing layers of linear and nonlinear behavior, and thus layers of variable viscosity. Models, algorithms, and codes were developed and simulations performed leading to phase diagrams of nanorod dispersion hydrodynamics in parallel shear cells and confined cavities representative of film and membrane processing conditions. Hydrodynamic codes for polymeric fluids are extended to include coupling between microscopic and macroscopic models, and to the strongly nonlinear regime.

  12. A Faster Parallel Algorithm and Efficient Multithreaded Implementations for Evaluating Betweenness Centrality on Massive Datasets

    SciTech Connect (OSTI)

    Madduri, Kamesh; Ediger, David; Jiang, Karl; Bader, David A.; Chavarria-Miranda, Daniel

    2009-02-15

    We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in HPCS SSCA#2, a benchmark extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the Threadstorm processor, and a single-socket Sun multicore server with the UltraSPARC T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
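    The lock-free parallel formulation is not reproduced here, but the sequential Brandes algorithm it parallelizes can be sketched: one BFS per source to count shortest paths, then a reverse-order accumulation of dependencies. The tiny path graph at the end is an illustrative input.

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted, undirected graphs.
    adj maps each vertex to its list of neighbors."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS phase: shortest-path counts (sigma) and predecessor lists
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        pred = {v: [] for v in adj}
        order, q = [], deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        # accumulation phase: dependencies in reverse BFS order
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: c / 2 for v, c in bc.items()}  # undirected pairs counted twice

# path graph 0 - 1 - 2: only vertex 1 lies on a shortest path between others
scores = betweenness({0: [1], 1: [0, 2], 2: [1]})
```

    The parallel versions in the paper run the per-source loop concurrently and merge the per-source contributions; the approximate variant samples a subset of sources instead of iterating over all of them.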

  13. US-VISIT Identity Matching Algorithm Evaluation Program: ADIS Algorithm Evaluation Project Plan Update

    SciTech Connect (OSTI)

    Grant, C W; Lenderman, J S; Gansemer, J D

    2011-02-24

    This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect modified deliverables reflecting delays in obtaining a database refresh. This document describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).

  14. Algorithmic crystal chemistry: A cellular automata approach

    SciTech Connect (OSTI)

    Krivovichev, S. V.

    2012-01-15

    Atomic-molecular mechanisms of crystal growth can be modeled based on crystallochemical information using cellular automata (a particular case of finite deterministic automata). In particular, the formation of heteropolyhedral layered complexes in uranyl selenates can be modeled applying a one-dimensional three-colored cellular automaton. The use of the theory of calculations (in particular, the theory of automata) in crystallography allows one to interpret crystal growth as a computational process (the realization of an algorithm or program with a finite number of steps).
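    The specific uranyl-selenate rule table is not given in this record, so the sketch below uses a generic one-dimensional three-colored (three-state) nearest-neighbor cellular automaton with an illustrative totalistic rule; it shows only the computational form such a growth model takes, not the crystallochemical rule from the paper.

```python
def step(row, rule):
    """One synchronous update of a 1-D three-state CA with periodic
    boundaries; `rule` maps each (left, center, right) triple to a state."""
    n = len(row)
    return [rule[(row[(i - 1) % n], row[i], row[(i + 1) % n])]
            for i in range(n)]

# illustrative totalistic rule: new state = (left + center + right) mod 3
rule = {(a, b, c): (a + b + c) % 3
        for a in range(3) for b in range(3) for c in range(3)}

row = [0, 0, 1, 0, 0]   # a single seed "unit" in an empty row
history = [row]
for _ in range(3):       # each step adds one "layer" of the structure
    row = step(row, rule)
    history.append(row)
```

    Interpreting each row as a layer of polyhedra and each color as a polyhedron type gives the "crystal growth as computation" picture the abstract describes: the finite rule table plays the role of the growth algorithm.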

  15. A Monte Carlo algorithm for degenerate plasmas

    SciTech Connect (OSTI)

    Turrell, A.E.; Sherlock, M.; Rose, S.J.

    2013-09-15

    A procedure for performing Monte Carlo calculations of plasmas with an arbitrary level of degeneracy is outlined. It has possible applications in inertial confinement fusion and astrophysics. Degenerate particles are initialised according to the Fermi-Dirac distribution function, and scattering is via a Pauli-blocked binary collision approximation. The algorithm is tested against degenerate electron-ion equilibration, and the degenerate resistivity transport coefficient from unmagnetised first order transport theory. The code is applied to the cold fuel shell and alpha particle equilibration problem of inertial confinement fusion.
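    The two ingredients named in the abstract, Fermi-Dirac occupation and Pauli-blocked acceptance, can be sketched in isolation; the full binary-collision scheme of the paper is not reproduced, and the energies, chemical potential, and temperature below are illustrative dimensionless values.

```python
import math, random

def fermi_dirac(E, mu, T):
    """Occupation probability f(E) at chemical potential mu, temperature T."""
    return 1.0 / (math.exp((E - mu) / T) + 1.0)

def pauli_blocked_accept(E_final, mu, T, rng):
    """A trial scattering into a final state of energy E_final is accepted
    with probability 1 - f(E_final): moves into occupied states are blocked."""
    return rng.random() < 1.0 - fermi_dirac(E_final, mu, T)

rng = random.Random(1)
# strongly degenerate limit: states far below mu are full, far above are empty
f_low = fermi_dirac(0.1, mu=1.0, T=0.01)
f_high = fermi_dirac(2.0, mu=1.0, T=0.01)
```

    In the classical limit (T much larger than mu) the blocking factor 1 - f approaches 1 everywhere and the acceptance test reduces to the ordinary unblocked binary-collision Monte Carlo, which is how such a scheme handles "an arbitrary level of degeneracy".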

  16. Algorithms for Contact in a Mulitphysics Environment

    Energy Science and Technology Software Center (OSTI)

    2001-12-19

    Many codes require either a contact capability or a need to determine geometric proximity of non-connected topological entities (which is a subset of what contact requires). ACME is a library to provide services to determine contact forces and/or geometric proximity interactions. This includes generic capabilities such as determining points in Cartesian volumes, finding faces in Cartesian volumes, etc. ACME can be run in single or multi-processor mode (the basic algorithms have been tested up to 4500 processors).

  17. A Unified Differential Evolution Algorithm for Global Optimization

    SciTech Connect (OSTI)

    Qiang, Ji; Mitchell, Chad

    2014-06-24

    In this paper, we propose a new unified differential evolution (uDE) algorithm for single objective global optimization. Instead of selecting among multiple mutation strategies as in the conventional differential evolution algorithm, this algorithm employs a single equation as the mutation strategy. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of different mutation strategies. Numerical tests using twelve basic unimodal and multimodal functions show promising performance of the proposed algorithm in comparison to conventional differential evolution algorithms.
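    The unified mutation equation itself is not given in this record, so the sketch below shows the conventional DE/rand/1/bin baseline the paper compares against: mutate three distinct random vectors, apply binomial crossover, and keep the trial if it is no worse. The test function and all parameter values are illustrative.

```python
import random

def de_minimize(f, bounds, pop_size=20, gens=200, F=0.8, CR=0.9, seed=0):
    """Classic DE/rand/1/bin: mutant = a + F*(b - c), binomial crossover,
    greedy selection between target and trial vectors."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees one mutated coordinate
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)   # simple unimodal test function
x_best, f_best = de_minimize(sphere, [(-5.0, 5.0)] * 2)
```

    The uDE idea, as described in the abstract, replaces the choice among several such mutation formulas with a single parameterized equation, so a user explores strategies by varying its parameters rather than switching formulas.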

  18. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
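    The workflow the abstract describes, training on labeled points from canonical flows and then classifying new points as high or low uncertainty, can be sketched with a deliberately simple stand-in model. The nearest-centroid rule and the synthetic two-feature points below are hypothetical illustrations; the paper uses SVMs, Adaboost trees, and random forests on real flow features.

```python
# Each flow-field point is reduced to a feature vector; training points are
# labeled 1 (high RANS uncertainty) or 0 (low).  A nearest-centroid rule
# stands in for the tree ensembles used in the paper.
def centroid(points):
    n, dim = len(points), len(points[0])
    return [sum(p[d] for p in points) / n for d in range(dim)]

def train(features, labels):
    hi = centroid([f for f, y in zip(features, labels) if y == 1])
    lo = centroid([f for f, y in zip(features, labels) if y == 0])
    return hi, lo

def classify(model, f):
    """Label a point 1 if it is closer to the high-uncertainty centroid."""
    hi, lo = model
    d2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return 1 if d2(f, hi) < d2(f, lo) else 0

# synthetic features, e.g. (normalized strain rate, eddy-viscosity ratio)
X = [(0.9, 0.8), (1.1, 0.7), (0.1, 0.2), (0.2, 0.1)]
y = [1, 1, 0, 0]
model = train(X, y)
preds = [classify(model, f) for f in X]
```

    The key design point carried over from the paper is that classification is pointwise: every grid point gets its own feature vector and its own uncertainty label, so the output is a field of flags rather than a single verdict on the whole simulation.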

  19. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    SciTech Connect (OSTI)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  20. Electronic structure basis for the extraordinary magnetoresistance in WTe2

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pletikosić, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe₂ was identified.

  1. Electronic structure basis for the titanic magnetoresistance in WTe₂

    SciTech Connect (OSTI)

    Pletikosic, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe₂ was identified.

  2. Electronic structure basis for the titanic magnetoresistance in WTe₂

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pletikosic, I.; Ali, Mazhar N.; Fedorov, A. V.; Cava, R. J.; Valla, T.

    2014-11-19

    The electronic structure basis of the extremely large magnetoresistance in layered non-magnetic tungsten ditelluride has been investigated by angle-resolved photoelectron spectroscopy. Hole and electron pockets of approximately the same size were found at the Fermi level, suggesting that carrier compensation should be considered the primary source of the effect. The material exhibits a highly anisotropic, quasi one-dimensional Fermi surface from which the pronounced anisotropy of the magnetoresistance follows. A change in the Fermi surface with temperature was found and a high-density-of-states band that may take over conduction at higher temperatures and cause the observed turn-on behavior of the magnetoresistance in WTe₂ was identified.

  3. Draft Geologic Disposal Requirements Basis for STAD Specification

    SciTech Connect (OSTI)

    Ilgen, Anastasia G.; Bryan, Charles R.; Hardin, Ernest

    2015-03-25

    This document provides the basis for requirements in the current version of Performance Specification for Standardized Transportation, Aging, and Disposal Canister Systems, (FCRD-NFST-2014-0000579) that are driven by storage and geologic disposal considerations. Performance requirements for the Standardized Transportation, Aging, and Disposal (STAD) canister are given in Section 3.1 of that report. Here, the requirements are reviewed and the rationale for each provided. Note that, while FCRD-NFST-2014-0000579 provides performance specifications for other components of the STAD storage system (e.g. storage overpack, transfer and transportation casks, and others), these have no impact on the canister performance during disposal, and are not discussed here.

  4. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2009-08-28

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL’s Hanford External Dosimetry Program (HEDP) which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC) which is chartered and chaired by DOE-RL and serves as means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL’s Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.

  5. An Adaptive Unified Differential Evolution Algorithm for Global Optimization

    SciTech Connect (OSTI)

    Qiang, Ji; Mitchell, Chad

    2014-11-03

    In this paper, we propose a new adaptive unified differential evolution algorithm for single-objective global optimization. Instead of the multiple mutation strategies proposed in conventional differential evolution algorithms, this algorithm employs a single equation unifying multiple strategies into one expression. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of the space of mutation operators. By making all control parameters in the proposed algorithm self-adaptively evolve during the process of optimization, it frees the application users from the burden of choosing appropriate control parameters and also improves the performance of the algorithm. In numerical tests using thirteen basic unimodal and multimodal functions, the proposed adaptive unified algorithm shows promising performance in comparison to several conventional differential evolution algorithms.
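The abstract's unified mutation equation is not reproduced in this record, so the sketch below only illustrates the general idea: a single donor expression that blends a current-to-best term with random difference terms. The weights F1-F3, the crossover scheme, and the test function are assumptions, not the authors' formulation, and the self-adaptive evolution of the control parameters is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    """Classic single-objective test function: global minimum 0 at the origin."""
    return float(np.sum(x * x))

dim, pop_size, gens = 5, 20, 200
F1, F2, F3, CR = 0.5, 0.5, 0.5, 0.9          # assumed, fixed (not self-adapted)

pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
fit = np.array([sphere(x) for x in pop])

for _ in range(gens):
    best = pop[np.argmin(fit)]
    for i in range(pop_size):
        # Three mutually distinct random partners, none equal to i.
        r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                size=3, replace=False)
        # One unified donor expression blending current-to-best and
        # random-difference contributions (illustrative form).
        donor = (pop[i]
                 + F1 * (best - pop[i])
                 + F2 * (pop[r1] - pop[i])
                 + F3 * (pop[r2] - pop[r3]))
        # Binomial crossover with at least one donor component.
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True
        trial = np.where(cross, donor, pop[i])
        # Greedy selection: keep the trial only if it is no worse.
        f = sphere(trial)
        if f <= fit[i]:
            pop[i], fit[i] = trial, f

print(f"best objective after {gens} generations: {fit.min():.3e}")
```

Setting one of the weights to zero recovers familiar special cases (e.g. F1 = 0 leaves a purely random-difference strategy), which is the sense in which a single expression can span several conventional mutation operators.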

  6. Derivation of Seasonal Cloud Properties at ARM-NSA from Multispectral...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    In this paper, the operational Clouds and the Earth's Radiant Energy System (CERES) cloud ... In Clouds and the Earth's Radiant Energy System (CERES) Algorithm Theoretical Basis ...

  7. Daylighting simulation: methods, algorithms, and resources

    SciTech Connect (OSTI)

    Carroll, William L.

    1999-12-01

    This document presents work conducted as part of Subtask C, ''Daylighting Design Tools'', Subgroup C2, ''New Daylight Algorithms'', of the IEA SHC Task 21 and the ECBCS Program Annex 29 ''Daylight in Buildings''. The search for and collection of daylighting analysis methods and algorithms led to two important observations. First, there is a wide range of needs for different types of methods to produce a complete analysis tool. These include: Geometry; Light modeling; Characterization of the natural illumination resource; Materials and components properties, representations; and Usability issues (interfaces, interoperability, representation of analysis results, etc). Second, very advantageously, there have been rapid advances in many basic methods in these areas, due to other forces. They are in part driven by: The commercial computer graphics community (commerce, entertainment); The lighting industry; Architectural rendering and visualization for projects; and Academia: Course materials, research. This has led to a very rich set of information resources that have direct applicability to the small daylighting analysis community. Furthermore, much of this information is in fact available online. Because much of the information about methods and algorithms is now online, an innovative reporting strategy was used: the core formats are electronic, and used to produce a printed form only secondarily. The electronic forms include both online WWW pages and a downloadable .PDF file with the same appearance and content. Both electronic forms include live primary and indirect links to actual information sources on the WWW. In most cases, little additional commentary is provided regarding the information links or citations that are provided. This in turn allows the report to be very concise. The links are expected to speak for themselves. The report consists of only about 10+ pages, with about 100+ primary links, but with potentially thousands of indirect links.
For purposes of the printed version, a list of the links is explicitly provided. This document exists in HTML form at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.html. An equivalent downloadable PDF version, also with live links, is available at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.pdf. A printed report can be derived directly from either of the electronic versions by simply printing either of them. In addition to the live links in the electronic forms, all report forms, electronic and paper, also have explicitly listed link addresses so that they can be followed up or referenced manually.

  8. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2011-04-04

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at the U.S. Department of Energy (DOE) Hanford site. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with requirements of 10 CFR 835, the DOE Laboratory Accreditation Program, the DOE Richland Operations Office, DOE Office of River Protection, DOE Pacific Northwest Office of Science, and Hanford’s DOE contractors. The dosimetry system is operated by the Pacific Northwest National Laboratory (PNNL) Hanford External Dosimetry Program which provides dosimetry services to PNNL and all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee which is chartered and chaired by DOE-RL and serves as means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL’s Electronic Records & Information Capture Architecture database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document. 
Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2.

  9. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2007-03-12

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL’s Hanford External Dosimetry Program (HEDP) which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC) which is chartered and chaired by DOE-RL and serves as means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL’s Electronic Records & Information Capture Architecture (ERICA) database. Revision numbers that are whole numbers reflect major revisions typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Revision Log: Rev. 0 (2/25/2005) Major revision and expansion. Rev. 
0.1 (3/12/2007) Minor revision. Updated Chapters 5, 6 and 9 to reflect change in default ring calibration factor used in HEDP dose calculation software. Factor changed from 1.5 to 2.0 beginning January 1, 2007. Pages on which changes were made are as follows: 5.23, 5.69, 5.78, 5.80, 5.82, 6.3, 6.5, 6.29, 9.2.

  10. Hanford External Dosimetry Technical Basis Manual PNL-MA-842

    SciTech Connect (OSTI)

    Rathbone, Bruce A.

    2010-04-01

    The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at the U.S. Department of Energy (DOE) Hanford site. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with requirements of 10 CFR 835, the DOE Laboratory Accreditation Program, the DOE Richland Operations Office, DOE Office of River Protection, DOE Pacific Northwest Office of Science, and Hanford’s DOE contractors. The dosimetry system is operated by the Pacific Northwest National Laboratory (PNNL) Hanford External Dosimetry Program which provides dosimetry services to PNNL and all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee which is chartered and chaired by DOE-RL and serves as means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL’s Electronic Records & Information Capture Architecture database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions typically involving significant changes to all chapters in the document. 
Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the manual by PNNL was discontinued beginning with Revision 0.2.

  11. A practical and theoretical definition of very small field size for radiotherapy output factor measurements

    SciTech Connect (OSTI)

    Charles, P. H.; Crowe, S. B.; Langton, C. M.; Trapp, J. V.; Cranmer-Sargison, G.; Thwaites, D. I.; Kairn, T.; Knight, R. T.; Kenny, J.

    2014-04-15

    Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology including the measurement of dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs and setting acceptable uncertainties on OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom, and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 to 100 mm, using a nominal photon energy of 6 MV. Results: According to the practical definition established in this project, field sizes ≤15 mm were considered to be very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties are 0.5 mm, field sizes ≤12 mm were considered to be very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effects. This was found to occur at field sizes ≤12 mm. Source occlusion also caused a large change in OPF for field sizes ≤8 mm.
Based on the results of this study, field sizes ≤12 mm were considered to be theoretically very small for 6 MV beams. Conclusions: Extremely careful experimental methodology, including the measurement of dosimetric field size at the same time as output factor measurement for each field size setting and also very precise detector alignment, is required at field sizes at least ≤12 mm and more conservatively ≤15 mm for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.

  12. Theoretical solution of the minimum charge problem for gaseous detonations

    SciTech Connect (OSTI)

    Ostensen, R.W.

    1990-12-01

    A theoretical model was developed for the minimum charge to trigger a gaseous detonation in spherical geometry as a generalization of the Zeldovich model. Careful comparisons were made between the theoretical predictions and experimental data on the minimum charge to trigger detonations in propane-air mixtures. The predictions are an order of magnitude too high, and there is no apparent resolution to the discrepancy. A dynamic model, which takes into account the experimentally observed oscillations in the detonation zone, may be necessary for reliable predictions. 27 refs., 9 figs.

  13. Theoretical and experimental investigation of heat pipe solar collector

    SciTech Connect (OSTI)

    Azad, E.

    2008-09-15

    A heat pipe solar collector was designed and constructed at IROST and its performance was measured on an outdoor test facility. The thermal behavior of a gravity-assisted heat pipe solar collector was investigated theoretically and experimentally. A theoretical model based on the effectiveness-NTU method was developed for evaluating the thermal efficiency of the collector, the inlet and outlet water temperatures, and the heat pipe temperature. The optimum value of the evaporator length to condenser length ratio is also determined. The model predictions were validated using experimental data, showing good agreement between measured and predicted results. (author)
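A minimal sketch of the effectiveness-NTU relation such a model rests on, treating the heat pipe condenser section as an isothermal source (capacity ratio zero, so ε = 1 − exp(−NTU)). All numerical values are assumptions for illustration, not data from the paper.

```python
import math

m_dot = 0.02      # water mass flow rate, kg/s (assumed)
cp = 4186.0       # specific heat of water, J/(kg K)
UA = 60.0         # condenser conductance, W/K (assumed)
T_hp = 80.0       # heat pipe condenser wall temperature, deg C (assumed)
T_in = 25.0       # inlet water temperature, deg C (assumed)

NTU = UA / (m_dot * cp)
eps = 1.0 - math.exp(-NTU)              # effectiveness for an isothermal source
Q = eps * m_dot * cp * (T_hp - T_in)    # heat delivered to the water, W
T_out = T_in + Q / (m_dot * cp)         # outlet water temperature, deg C

print(f"NTU={NTU:.3f}  eps={eps:.3f}  Q={Q:.0f} W  T_out={T_out:.1f} C")
```

Sweeping UA (which depends on the evaporator/condenser length split) against the collector heat balance is one way such a model can locate an optimum length ratio.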

  14. Theoretical investigation of thermodynamic stability and mobility of the

    Office of Scientific and Technical Information (OSTI)

    oxygen vacancy in ThO2-UO2 solid solutions (Journal Article) | DOE PAGES. The thermodynamic stability and the migration energy barriers of oxygen vacancies in ThO2-UO2 solid solutions are investigated by density functional theory.

  15. Theoretical investigations of two Si-based spintronic materials

    Office of Scientific and Technical Information (OSTI)

    (Conference) | SciTech Connect. Two Si-based spintronic materials, a Mn-Si digital ferromagnetic heterostructure ({delta}-layer of Mn doped in Si) with defects and a dilutely doped Mn{sub x}Si{sub 1-x} alloy, are investigated using a density-functional based approach. We model the heterostructure and alloy with a

  16. Component evaluation testing and analysis algorithms.

    SciTech Connect (OSTI)

    Hart, Darren M.; Merchant, Bion John

    2011-10-01

    The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up Seismic and Infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), Seismic Sensor, and Infrasound Sensor. In order to guarantee consistency, traceability, and visibility into the results of the testing process, it is necessary to document the test and analysis procedures that are in place. Other reports document the testing procedures that are in place (Kromer, 2007). This document serves to provide a comprehensive overview of the analysis and the algorithms that are applied to the Component Evaluation testing. A brief summary of each test is included to provide the context for the analysis that is to be performed.

  17. Neurons to algorithms LDRD final report.

    SciTech Connect (OSTI)

    Rothganger, Fredrick H.; Aimone, James Bradley; Warrender, Christina E.; Trumbo, Derek

    2013-09-01

    Over the last three years the Neurons to Algorithms (N2A) LDRD project team has built infrastructure to discover computational structures in the brain. This consists of a modeling language, a tool that enables model development and simulation in that language, and initial connections with the Neuroinformatics community, a group working toward similar goals. The approach of N2A is to express large complex systems like the brain as populations of discrete part types that have specific structural relationships with each other, along with internal and structural dynamics. Such an evolving mathematical system may be able to capture the essence of neural processing, and ultimately of thought itself. This final report is a cover for the actual products of the project: the N2A Language Specification, the N2A Application, and a journal paper summarizing our methods.

  18. Automated DNA Base Pair Calling Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-07-07

    The procedure solves the problem of calling the DNA base pair sequence from two channel electropherogram separations in an automated fashion. The core of the program involves a peak picking algorithm based upon first, second, and third derivative spectra for each electropherogram channel, signal levels as a function of time, peak spacing, base pair signal to noise sequence patterns, frequency vs ratio of the two channel histograms, and confidence levels generated during the run. The ratios of the two channels at peak centers can be used to accurately and reproducibly determine the base pair sequence. A further enhancement is a novel Gaussian deconvolution used to determine the peak heights used in generating the ratio.
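A toy rendition of the two-channel ratio idea described above, with invented peak shapes and ratio assignments (this is a sketch, not the released software): peaks are located from the sign change of the first derivative of the summed trace, and the channel ratio at each peak center selects the base.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 1001)

def gauss(center, width=0.15):
    """Unit-height Gaussian peak on the time axis."""
    return np.exp(-((t - center) ** 2) / (2.0 * width ** 2))

# Assumed channel-1/channel-2 ratio for each base (invented values).
ratios = {"A": 4.0, "C": 2.0, "G": 0.5, "T": 0.25}
truth = ["A", "C", "G", "T", "A"]
centers = [1.5, 3.0, 4.5, 6.0, 7.5]

# Synthetic two-channel electropherogram: channel 2 has unit peaks,
# channel 1 carries the base-dependent amplitude.
ch1 = sum(ratios[b] * gauss(c) for b, c in zip(truth, centers))
ch2 = sum(gauss(c) for c in centers)

# Peak picking: sign change of the first derivative of the summed trace,
# gated by a signal threshold to reject the flat baseline.
sig = ch1 + ch2
d = np.gradient(sig, t)
peaks = np.where((d[:-1] > 0) & (d[1:] <= 0) & (sig[:-1] > 0.5))[0]

# Base call: nearest assumed ratio at each detected peak center.
calls = []
for p in peaks:
    r = ch1[p] / ch2[p]
    calls.append(min(ratios, key=lambda b: abs(ratios[b] - r)))
print("".join(calls))
```

The real procedure additionally uses second and third derivatives, peak spacing, and Gaussian deconvolution of overlapping peaks, none of which this clean synthetic trace requires.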

  19. Development of engineering technology basis for industrialization of pyrometallurgical reprocessing

    SciTech Connect (OSTI)

    Koyama, Tadafumi; Hijikata, Takatoshi; Yokoo, Takeshi; Inoue, Tadashi

    2007-07-01

    Development of the engineering technology basis of pyrometallurgical reprocessing is a key issue for industrialization. For development of the transport technologies for molten salt and liquid cadmium at around 500 deg. C, a salt transport test rig and a metal transport test rig were installed in an Ar glove box. The function of a centrifugal pump and 1/2' declined tubing was confirmed with LiCl-KCl molten salt. The transport behavior of the molten salt was found to follow that of water. The function of a centrifugal pump, vacuum sucking, and 1/2' declined tubing was confirmed with liquid Cd. Employing these transport technologies, an industrialization-applicable electro-refiner was newly designed and an engineering-scale model was fabricated in an Ar glove box. The electro-refiner has a semi-continuous liquid Cd cathode instead of the conventional one used in small-scale tests. Using actinide-simulating elements, demonstration of industrial-scale throughput will be carried out in this electro-refiner for a more precise evaluation of the industrialization potential of pyrometallurgical reprocessing. (authors)

  20. Hanford Technical Basis for Multiple Dosimetry Effective Dose Methodology

    SciTech Connect (OSTI)

    Hill, Robin L.; Rathbone, Bruce A.

    2010-08-01

    The current method at Hanford for dealing with the results from multiple dosimeters worn during non-uniform irradiation is to use a compartmentalization method to calculate the effective dose (E). The method, as documented in the current version of Section 6.9.3 in the 'Hanford External Dosimetry Technical Basis Manual, PNL-MA-842,' is based on the compartmentalization method presented in the 1997 ANSI/HPS N13.41 standard, 'Criteria for Performing Multiple Dosimetry.' With the adoption of the ICRP 60 methodology in the 2007 revision to 10 CFR 835 came changes that have a direct effect on the compartmentalization method described in the 1997 ANSI/HPS N13.41 standard, and, thus, on the method used at Hanford. The ANSI/HPS N13.41 standard committee is in the process of updating the standard, but the changes to the standard have not yet been approved. And, the drafts of the revision of the standard tend to align more with ICRP 60 than with the changes specified in the 2007 revision to 10 CFR 835. Therefore, a revised method for calculating effective dose from non-uniform external irradiation using a compartmental method was developed using the tissue weighting factors and remainder organs specified in 10 CFR 835 (2007).
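The compartmental calculation reduces to a weighted sum: each tissue is assigned the dose measured by the dosimeter covering its body region, weighted by its tissue weighting factor. The sketch below uses the ICRP 60 tissue weighting factors adopted in the 2007 revision of 10 CFR 835; the region-to-tissue assignment and the dose readings are illustrative assumptions, not Hanford's actual compartment map.

```python
# ICRP 60 tissue weighting factors (as adopted in 10 CFR 835, 2007).
w_T = {
    "gonads": 0.20, "bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
    "stomach": 0.12, "bladder": 0.05, "breast": 0.05, "liver": 0.05,
    "esophagus": 0.05, "thyroid": 0.05, "skin": 0.01,
    "bone_surface": 0.01, "remainder": 0.05,
}

# Assumed compartment map: which dosimeter covers which tissues.
region_of = {
    "head_neck": ["thyroid", "esophagus"],
    "torso": ["gonads", "bone_marrow", "colon", "lung", "stomach",
              "bladder", "breast", "liver", "skin", "bone_surface",
              "remainder"],
}

# Illustrative non-uniform exposure: Hp(10) readings in mSv per dosimeter.
Hp10 = {"head_neck": 12.0, "torso": 3.0}

# Effective dose: each tissue gets the dose of the dosimeter covering it.
E = sum(w_T[tissue] * Hp10[region]
        for region, tissues in region_of.items()
        for tissue in tissues)
print(f"E = {E:.2f} mSv")
```

Under uniform irradiation (all readings equal) the weights sum to one and E collapses to the single dosimeter reading, which is the sanity check a compartmental method must satisfy.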

  1. Climate Change: The Physical Basis and Latest Results

    ScienceCinema (OSTI)

    None

    2011-10-06

    The 2007 Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) concludes: "Warming in the climate system is unequivocal." Without the contribution of Physics to climate science over many decades, such a statement would not have been possible. Experimental physics enables us to read climate archives such as polar ice cores and so provides the context for the current changes. For example, today the concentration of CO2 in the atmosphere, the second most important greenhouse gas, is 28% higher than at any time during the last 800,000 years. Classical fluid mechanics and numerical mathematics are the basis of climate models from which estimates of future climate change are obtained. But major instabilities and surprises in the Earth System are still unknown. These are also to be considered when the climatic consequences of proposals for geo-engineering are estimated. Only Physics will permit us to further improve our understanding in order to provide the foundation for policy decisions facing the global climate change challenge.

  2. New Design Methods and Algorithms for Multi-component Distillation

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Processes | Department of Energy. PDF: multicomponent.pdf. Related documents: CX-100137 Categorical Exclusion Determination; DEVELOPMENT OF METHOD AND ALGORITHMS TO IDENTIFY EASILY IMPLEMENTABLE ENERGY-EFFICIENT LOW-COST MULTICOMPONENT DISTILLATION COLUMN TRAINS WITH LARGE ENERGY SAVINGS FOR WIDE NUMBER OF SEPARATIONS (ITP)

  3. The Structural Basis for Tight Control of PP2A Methylation and...

    Office of Scientific and Technical Information (OSTI)

    The Structural Basis for Tight Control of PP2A Methylation and Function by LCMT-1

  4. Researcher, Los Alamos National Laboratory - Methods and Algorithms...

    National Nuclear Security Administration (NNSA)

    Lowell Brown of the Methods and Algorithms Group in the Applied Physics Division has made many contributions to physics, from quantum field theory, particle and nuclear physics, ...

  5. NREL: Awards and Honors - Current Interrupt Charging Algorithm...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Current Interrupt Charging Algorithm for Lead-Acid Batteries Developers: Matthew A. Keyser, Ahmad A. Pesaran, and Mark M. Mihalic, National Renewable Energy Laboratory; Robert F....

  6. Development of an Outdoor Temperature-Based Control Algorithm...

    Office of Scientific and Technical Information (OSTI)

    Development of an Outdoor Temperature-Based Control Algorithm for Residential Mechanical Ventilation Control

  7. DEVELOPMENT OF METHOD AND ALGORITHMS TO IDENTIFY EASILY IMPLEMENTABLE...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    A state-of-the-art optimization algorithm is being developed to apply low-energy distillation processes that allow chemical manufacturers to reduce energy consumption between 10% ...

  8. Algorithm for Finding Similar Shapes in Large Molecular Structures Libraries

    Energy Science and Technology Software Center (OSTI)

    1994-10-19

    The SHAPES software consists of methods and algorithms for representing and rapidly comparing molecular shapes. Molecular shape algorithms are a class of algorithms for recognizing when two three-dimensional shapes share common features. They proceed from the notion that the shapes to be compared are regions in three-dimensional space. The algorithms allow recognition of when localized subregions from two or more different shapes could never be superimposed by any rigid-body motion. Rigid-body motions are arbitrary combinations of translations and rotations.
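A rigid-body motion preserves every pairwise distance, so a quick necessary test for whether two point sets could ever be superimposed is to compare their sorted distance multisets. The sketch below illustrates that principle in Python; it is a hypothetical illustration, not the actual SHAPES implementation.

```python
from itertools import combinations
import math

def distance_signature(points):
    """Sorted multiset of all pairwise distances; invariant under any
    rigid-body motion (translation + rotation)."""
    return sorted(math.dist(a, b) for a, b in combinations(points, 2))

def could_superimpose(shape_a, shape_b, tol=1e-6):
    """Necessary (not sufficient) test: if the distance multisets differ,
    no rigid-body motion can superimpose the two point sets."""
    sig_a, sig_b = distance_signature(shape_a), distance_signature(shape_b)
    return len(sig_a) == len(sig_b) and all(
        abs(x - y) <= tol for x, y in zip(sig_a, sig_b))

# A unit square, the same square rotated and translated, and a stretched one:
square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
moved = [(5, 5, 0), (5, 6, 0), (4, 6, 0), (4, 5, 0)]
stretched = [(0, 0, 0), (2, 0, 0), (2, 1, 0), (0, 1, 0)]
```

Matching signatures do not guarantee superimposability (mirror images share the same distances), so a filter of this kind would be followed by an explicit alignment step in a production code.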

  9. Solar and Moon Position Algorithm (SAMPA) - Energy Innovation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Contact NREL About This Technology Technology Marketing Summary This algorithm calculates ... Solar/Lunar Tracking Device Manufacturing Solar/Lunar Tracking Simulation Software More ...

  10. Numerical Analysis of Fixed Point Algorithms in the Presence...

    Office of Scientific and Technical Information (OSTI)

    in the Presence of Hardware Faults Citation Details In-Document Search Title: Numerical Analysis of Fixed Point Algorithms in the Presence of Hardware Faults You are ...

  11. Use of a Radon Stripping Algorithm for Retrospective Assessment...

    Office of Scientific and Technical Information (OSTI)

    and beta spectroscopy system employing a passive implanted planar silicon (PIPS) detector. ... MODIFICATIONS; PROGENY; RADON; SILICON air monitoring, radon, algorithm, PIPS, ...

  12. Problems Found Using a Radon Stripping Algorithm for Retrospective...

    Office of Scientific and Technical Information (OSTI)

    and beta spectroscopy system employing a passive implanted planar silicon (PIPS) detector. ... MODIFICATIONS; PROGENY; RADON; SILICON air monitoring, radon, algorithm, PIPS, ...

  13. A sequential implicit algorithm of chemo-thermo-poro-mechanics...

    Office of Scientific and Technical Information (OSTI)

    A sequential implicit algorithm of chemo-thermo-poro-mechanics for fractured geothermal reservoirs Citation Details In-Document Search This content will become publicly available ...

  14. A modern solver framework to manage solution algorithms in the...

    Office of Scientific and Technical Information (OSTI)

    A modern solver framework to manage solution algorithms in the Community Earth System Model Citation Details In-Document Search Title: A modern solver framework to manage solution ...

  15. Evaluation of Monte Carlo Electron-Transport Algorithms in the...

    Office of Scientific and Technical Information (OSTI)

    Series Codes for Stochastic-Media Simulations. Citation Details In-Document Search Title: Evaluation of Monte Carlo Electron-Transport Algorithms in the Integrated Tiger Series ...

  16. EnPI V4.0 Algorithm

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Advanced Manufacturing Office EnPI V4.0 Tool Algorithm, updated September 11th, 2014. Contents: Definition of Symbols ......

  17. EnPI V4.0 Tool Algorithm

    Broader source: Energy.gov [DOE]

    This document provides background information and detail about the algorithms and calculations that drive the Energy Performance Indicator (EnPI) Tool.

  18. Use of a Radon Stripping Algorithm for Retrospective Assessment...

    Office of Scientific and Technical Information (OSTI)

    using a commercial alpha and beta spectroscopy system employing a passive implanted ... FLOW; ALGORITHMS; BETA SOURCES; BETA SPECTROSCOPY; EVALUATION; MODIFICATIONS; PROGENY; ...

  19. Safety evaluation of MHTGR licensing basis accident scenarios

    SciTech Connect (OSTI)

    Kroeger, P.G.

    1989-04-01

    The safety potential of the Modular High-Temperature Gas Reactor (MHTGR) was evaluated, based on the Preliminary Safety Information Document (PSID) as submitted by the US Department of Energy to the US Nuclear Regulatory Commission. The relevant reactor safety codes were extended for this purpose and applied to this new reactor concept, searching primarily for potential accident scenarios that might lead to fuel failures due to excessive core temperatures and/or to vessel damage due to excessive vessel temperatures. The design basis accident scenario leading to the highest vessel temperatures is the depressurized core heatup scenario without any forced cooling and with decay heat rejection to the passive Reactor Cavity Cooling System (RCCS). This scenario was evaluated, including numerous parametric variations of input parameters such as material properties and decay heat. It was found that significant safety margins exist, but that high confidence levels in the core effective thermal conductivity, the reactor vessel and RCCS thermal emissivities, and the decay heat function are required to maintain this safety margin. Severe accident extensions of this depressurized core heatup scenario included cases of complete RCCS failure, cases of massive air ingress, core heatup without scram, and cases of degraded RCCS performance due to absorbing gases in the reactor cavity. Except for no-scram scenarios extending beyond 100 hr, the fuel never reached the limiting temperature of 1600 °C, below which measurable fuel failures are not expected. In some of the scenarios, excessive vessel and concrete temperatures could lead to investment losses but are not expected to lead to any source term beyond that from the circulating inventory. 19 refs., 56 figs., 11 tabs.

  20. Theoretical investigations of defects in a Si-based digital ferromagne...

    Office of Scientific and Technical Information (OSTI)

    Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material Citation Details In-Document Search Title: Theoretical...

  1. Can we derive Tully's surface-hopping algorithm from the semiclassical quantum Liouville equation? Almost, but only with decoherence

    SciTech Connect (OSTI)

    Subotnik, Joseph E. Ouyang, Wenjun; Landry, Brian R.

    2013-12-07

    In this article, we demonstrate that Tully's fewest-switches surface hopping (FSSH) algorithm approximately obeys the mixed quantum-classical Liouville equation (QCLE), provided that several conditions, some major and some minor, are satisfied. The major conditions are: (1) nuclei must be moving quickly with large momenta; (2) there cannot be explicit recoherences or interference effects between nuclear wave packets; (3) force-based decoherence must be added to the FSSH algorithm, and the trajectories can no longer rigorously be independent (though approximations for independent trajectories are possible). We furthermore expect that FSSH (with decoherence) will be most robust when nonadiabatic transitions in an adiabatic basis are dictated primarily by derivative couplings that are presumably localized to crossing regions, rather than by small but pervasive off-diagonal force matrix elements. In the end, our results emphasize the strengths of and possibilities for the FSSH algorithm when decoherence is included, while also demonstrating the limitations of the FSSH algorithm and its inherent inability to follow the QCLE exactly.
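For reference, the fewest-switches hopping probability at the heart of FSSH (Tully, 1990) can be sketched for a two-state system with a diagonal adiabatic Hamiltonian. The function below is a minimal illustration without the force-based decoherence the article argues must be added; the variable names are ours, not the paper's.

```python
def fssh_hop_probability(c, current, target, v_dot_d, dt):
    """Tully's fewest-switches hopping probability g_{k->j} for one step.

    c        : list of complex electronic amplitudes
    current  : index k of the occupied state
    target   : index j of the candidate state
    v_dot_d  : nuclear velocity dotted with the derivative coupling d_{jk}
               (antisymmetric: d_{jk} = -d_{kj})
    dt       : time step

    g = max(0, dt * b_{jk} / a_{kk}) with b_{jk} = -2 Re(c_j* c_k (v . d_{jk})).
    """
    a_kk = abs(c[current]) ** 2
    if a_kk == 0.0:
        return 0.0
    b_jk = -2.0 * (c[target].conjugate() * c[current] * v_dot_d).real
    return max(0.0, dt * b_jk / a_kk)
```

In an actual FSSH trajectory this probability is compared against a uniform random number each step, and a successful hop is accompanied by a velocity rescaling along the coupling direction, which is omitted here.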

  2. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    SciTech Connect (OSTI)

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for the high-efficiency particulate air (HEPA) filter limits for Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical bases and provides clarifications, limit revisions, or justifications as necessary. This document provides an updated technical basis for tank farm ventilation systems related to Operating Specification Documents (OSDs) for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  3. Establishing the Technical Basis for Disposal of Heat-generating Waste in

    Energy Savers [EERE]

    Salt | Department of Energy Establishing the Technical Basis for Disposal of Heat-generating Waste in Salt Establishing the Technical Basis for Disposal of Heat-generating Waste in Salt The report summarizes available historic tests and the developed technical basis for disposal of heat-generating waste in salt, and the means by which a safety case for disposal of heat generating waste at a generic salt site can be initiated from the existing technical basis. Though the basis for a salt

  4. Theoretical model for plasma expansion generated by hypervelocity impact

    SciTech Connect (OSTI)

    Ju, Yuanyuan; Zhang, Qingming Zhang, Dongjiang; Long, Renrong; Chen, Li; Huang, Fenglei; Gong, Zizheng

    2014-09-15

    Hypervelocity impact experiments firing a spherical LY12 aluminum projectile (diameter 6.4 mm) at an LY12 aluminum target (thickness 23 mm) have been conducted using a two-stage light gas gun. The impact velocity of the projectile was 5.2, 5.7, and 6.3 km/s, respectively. The experimental results show that the plasma phase transition appears under the current experimental conditions, and that the plasma expansion consists of accumulation, equilibrium, and attenuation stages. The plasma characteristic parameters decrease as the plasma expands outward and are proportional to the third power of the impact velocity, i.e., (T{sub e}, n{sub e}) ∝ v{sub p}{sup 3}. Based on the experimental results, a theoretical model of the plasma expansion is developed, and the theoretical results are consistent with the experimental data.

  5. Theoretical evaluation of the optimal performance of a thermoacoustic refrigerator

    SciTech Connect (OSTI)

    Minner, B.L.; Braun, J.E.; Mongeau, L.G.

    1997-12-31

    Theoretical models were integrated with a design optimization tool to allow estimates of the maximum coefficient of performance for thermoacoustic cooling systems. The system model was validated using experimental results for a well-documented prototype. The optimization tool was then applied to this prototype to demonstrate the benefits of systematic optimization. A twofold increase in performance was predicted through the variation of component dimensions alone, while a threefold improvement was estimated when the working fluid parameters were also considered. Devices with a similar configuration were optimized for operating requirements representative of a home refrigerator. The results indicate that the coefficients of performance are comparable to those of existing vapor-compression equipment for this application. In addition to the choice of working fluid, the heat exchanger configuration was found to be a critical design factor affecting performance. Further experimental work is needed to confirm the theoretical predictions presented in this paper.

  6. Particle Communication and Domain Neighbor Coupling: Scalable Domain Decomposed Algorithms for Monte Carlo Particle Transport

    SciTech Connect (OSTI)

    O'Brien, M J; Brantley, P S

    2015-01-20

    In order to run Monte Carlo particle transport calculations on new supercomputers with hundreds of thousands or millions of processors, care must be taken to implement scalable algorithms. This means that the algorithms must continue to perform well as the processor count increases. In this paper, we examine the scalability of: (1) globally resolving the particle locations on the correct processor, (2) deciding that particle streaming communication has finished, and (3) efficiently coupling neighbor domains together with different replication levels. We have run domain decomposed Monte Carlo particle transport on up to 2{sup 21} = 2,097,152 MPI processes on the IBM BG/Q Sequoia supercomputer and observed scalable results that agree with our theoretical predictions. These calculations were carefully constructed to have the same amount of work on every processor, i.e., the calculation is already load balanced. We also examine load imbalanced calculations where each domain's replication level is proportional to its particle workload. In this case we show how to efficiently couple together adjacent domains to maintain within-workgroup load balance and minimize memory usage.
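Point (2), deciding that particle streaming has finished, reduces to a termination-detection condition: every rank is idle and no particle messages remain in flight (global sent count equals global received count). The toy serial sketch below illustrates that global test; a real implementation would evaluate it with non-blocking MPI reductions, and the counter names here are assumptions of ours.

```python
def streaming_finished(ranks):
    """Termination test for a particle-streaming phase.

    ranks: list of per-rank states, each a dict with
      'idle'     : True if the rank has no local particles left to track
      'sent'     : number of particles this rank has sent to neighbors
      'received' : number of particles this rank has received

    The phase is complete only when every rank is idle AND every particle
    sent has been received, i.e. no messages are still in flight.
    """
    all_idle = all(r["idle"] for r in ranks)
    total_sent = sum(r["sent"] for r in ranks)
    total_received = sum(r["received"] for r in ranks)
    return all_idle and total_sent == total_received
```

The subtlety on a real machine is that both conditions must hold simultaneously: a rank that looks idle can be reactivated by a particle still in flight, which is exactly what the sent/received comparison guards against.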

  7. Theoretical Spectroscopy of Low Dimensional Systems | MIT-Harvard Center

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    for Excitonics Theoretical Spectroscopy of Low Dimensional Systems November 11, 2009 at 2pm / Pfizer Hall - Mb-23, Harvard University, 12 Oxford Street, Cambridge. Angel Rubio, Universidad del Pais Vasco UPV/EHU and Centro Mixto CSIC-UPV/EHU. Abstract: There has been much progress in the synthesis and characterization of nanostructures; however, there remain immense challenges in understanding their properties and interactions with external probes in order to realize their tremendous potential

  8. COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Dynamics | Princeton Plasma Physics Lab March 25, 2015, 4:15pm to 5:30pm MBG Auditorium COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum Dynamics Professor Herschel Rabitz Princeton University Abstract: PDF icon COLL.03.25.15.pdf Controlling quantum dynamics phenomena spans a wide range of applications and potential technologies. Although some experiments are far more demanding than others, the experiments are collectively proving to be remarkably successful considering

  9. Materials for electrochemical capacitors: Theoretical and experimental constraints

    SciTech Connect (OSTI)

    Sarangapani, S.; Tilak, B.V.; Chen, C.P.

    1996-11-01

    Electrochemical capacitors, also called supercapacitors, are unique devices exhibiting 20 to 200 times greater capacitance than conventional capacitors. The large capacitance exhibited by these systems has been demonstrated to arise from a combination of the double-layer capacitance and pseudocapacitance associated with surface redox-type reactions. The purpose of this review is to survey the published data of available electrode materials possessing high specific double-layer or pseudocapacitance and examine their reported performance data in relation to their theoretical expectations.

  10. Alamos National Laboratory theoretical biologists Bette Korber, Will Fischer, Sydeaka

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    strategy expands immune responses March 3, 2010 Mosaic vaccines show promise in reducing the spread of deadly virus LOS ALAMOS, New Mexico, March 3, 2010-Two teams of researchers-including Los Alamos National Laboratory theoretical biologists Bette Korber, Will Fischer, Sydeaka Watson, and James Szinger-have announced an HIV vaccination strategy that has been shown to expand the breadth and depth of immune responses in rhesus monkeys. Rhesus monkeys provide the best animal model currently

  11. Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies

    Broader source: Energy.gov (indexed) [DOE]

    of Emission Treatment Catalyst | Department of Energy Poster presented at the 16th Directions in Engine-Efficiency and Emissions Research (DEER) Conference in Detroit, MI, September 27-30, 2010. PDF icon p-08_narula.pdf More Documents & Publications Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Oxidation Catalyst for Diesel Engine Emission Treatment Catalysts via First Principles Catalysts via First Principles

  12. Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies

    Broader source: Energy.gov (indexed) [DOE]

    of Oxidation Catalyst for Diesel Engine Emission Treatment | Department of Energy The overlap among theory, structure, and fully formed catalysts form the foundation of this study PDF icon deer09_narula.pdf More Documents & Publications Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Emission Treatment Catalyst Catalysis by Design: Bridging the Gap Between Theory and Experiments at Nanoscale Level Catalysts via First Principles (Agreement ID:10635)

  13. Experimental and Theoretical Investigation of Lubricant and Additive

    Broader source: Energy.gov (indexed) [DOE]

    Effects on Engine Friction | Department of Energy Combining data from motored engine friction, a theoretical engine model, a line friction contact rig, and a fired engine can provide better insight to lube oil and additive performance. PDF icon p-02_rohr.pdf More Documents & Publications Validation of a Small Engine Based Procedure for Studying Performance of Engine Lube Oils, Ionic Liquids as Lubricants and/or Lubricant Additives, Opportunities for Engine Friction Reduction and Durable

  14. An efficient parallel algorithm for matrix-vector multiplication

    SciTech Connect (OSTI)

    Hendrickson, B.; Leland, R.; Plimpton, S.

    1993-03-01

    The multiplication of a vector by a matrix is the kernel computation of many algorithms in scientific computation. A fast parallel algorithm for this calculation is therefore necessary if one is to make full use of the new generation of parallel supercomputers. This paper presents a high-performance, parallel matrix-vector multiplication algorithm that is particularly well suited to hypercube multiprocessors. For an n x n matrix on p processors, the communication cost of this algorithm is O(n/√p + log(p)), independent of the matrix sparsity pattern. The performance of the algorithm is demonstrated by employing it as the kernel in the well-known NAS conjugate gradient benchmark, where a run time of 6.09 seconds was observed. This is the best published performance on this benchmark achieved to date using a massively parallel supercomputer.
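The row-wise decomposition underlying parallel matrix-vector multiplication can be simulated serially: each of p "processors" owns a contiguous strip of rows plus the full vector, and concatenating the partial results stands in for the gather communication. This is a minimal sketch of the decomposition idea, not the paper's hypercube algorithm, which also partitions columns to reach the O(n/√p + log(p)) communication bound.

```python
def matvec_serial(A, x):
    """Reference dense mat-vec: y_i = sum_j A[i][j] * x[j]."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def matvec_block_rows(A, x, p):
    """Simulated block-row parallel mat-vec on p 'processors'.

    Each processor owns a contiguous strip of rows and computes its partial
    product locally; concatenating the strips plays the role of the gather
    step that would be communication on a real machine.
    """
    n = len(A)
    step = (n + p - 1) // p  # rows per processor, rounded up
    result = []
    for proc in range(p):
        rows = A[proc * step:(proc + 1) * step]
        result.extend(matvec_serial(rows, x))
    return result
```

Because each strip touches the whole vector, this naive 1D layout communicates O(n) values per processor; the 2D (row and column) partitioning in the paper is what reduces that to O(n/√p + log(p)).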

  15. A brief comparison between grid based real space algorithms andspectrum algorithms for electronic structure calculations

    SciTech Connect (OSTI)

    Wang, Lin-Wang

    2006-12-01

    Quantum mechanical ab initio calculation constitutes the biggest portion of the computer time in materials science and chemical science simulations. For a computer center like NERSC, to better serve these communities, it is very useful to have a prediction of the future trends of ab initio calculations in these areas. Such a prediction can help us to decide what future computer architecture can be most useful for these communities, and what should be emphasized in future supercomputer procurement. As the size of the computer and the size of the simulated physical systems increase, there is a renewed interest in using the real-space grid method in electronic structure calculations. This is fueled by two factors. First, it is generally assumed that the real-space grid method is more suitable for parallel computation because of its limited communication requirement, compared with the spectrum method, where a global FFT is required. Second, as the size N of the calculated system increases together with the computer power, O(N) scaling approaches become more favorable than the traditional direct O(N{sup 3}) scaling methods. These O(N) methods are usually based on localized orbitals in real space, which can be described more naturally by a real-space basis. In this report, the author compares the real-space methods with the traditional plane-wave (PW) spectrum methods, examining their technical pros and cons and possible future trends. For the real-space methods, the author focuses on the regular-grid finite difference (FD) method and the finite element (FE) method. These are the methods used mostly in materials science simulation. As for chemical science, the predominant methods are still the Gaussian basis method and sometimes the atomic orbital basis method. These two basis sets are localized in real space, and there is no indication that their roles in quantum chemical simulation will change anytime soon. The author focuses on density functional theory (DFT), which is the most used method for quantum mechanical materials science simulation.

  16. CRITICALITY SAFETY CONTROLS AND THE SAFETY BASIS AT PFP

    SciTech Connect (OSTI)

    Kessler, S

    2009-04-21

    With the implementation of DOE Order 420.1B, Facility Safety, and DOE-STD-3007-2007, 'Guidelines for Preparing Criticality Safety Evaluations at Department of Energy Non-Reactor Nuclear Facilities', a new requirement was imposed that all criticality safety controls be evaluated for inclusion in the facility Documented Safety Analysis (DSA) and that the evaluation process be documented in the site Criticality Safety Program Description Document (CSPDD). At the Hanford site in Washington State the CSPDD, HNF-31695, 'General Description of the FH Criticality Safety Program', requires each facility develop a linking document called a Criticality Control Review (CCR) to document performance of these evaluations. Chapter 5, Appendix 5B of HNF-7098, Criticality Safety Program, provided an example of a format for a CCR that could be used in lieu of each facility developing its own CCR. Since the Plutonium Finishing Plant (PFP) is presently undergoing Deactivation and Decommissioning (D&D), new procedures are being developed for cleanout of equipment and systems that have not been operated in years. Existing Criticality Safety Evaluations (CSE) are revised, or new ones written, to develop the controls required to support D&D activities. Other Hanford facilities, including PFP, had difficulty using the basic CCR out of HNF-7098 when first implemented. Interpretation of the new guidelines indicated that many of the controls needed to be elevated to TSR level controls. Criterion 2 of the standard, requiring that the consequence of a criticality be examined for establishing the classification of a control, was not addressed. Upon in-depth review by PFP Criticality Safety staff, it was not clear that the programmatic interpretation of criterion 8C could be applied at PFP. Therefore, the PFP Criticality Safety staff decided to write their own CCR. 
    The PFP CCR provides additional guidance for the evaluation team to use by clarifying the evaluation criteria in DOE-STD-3007-2007. In reviewing documents used in classifying controls for Nuclear Safety, it was noted that DOE-HDBK-1188, 'Glossary of Environment, Health, and Safety Terms', defines an Administrative Control (AC) in terms different from those typically used in Criticality Safety. As part of this CCR, a new term, Criticality Administrative Control (CAC), was defined to clarify the difference between an AC used for criticality safety and an AC used for nuclear safety. In Nuclear Safety terms, an AC is a provision relating to organization and management, procedures, recordkeeping, assessment, and reporting necessary to ensure safe operation of a facility. A CAC was defined as an administrative control derived in a criticality safety analysis that is implemented to ensure double contingency. According to criterion 2 of Section IV, 'Linkage to the Documented Safety Analysis', of DOE-STD-3007-2007, the consequence of a criticality should be examined for the purposes of classifying the significance of a control or component. HNF-PRO-700, 'Safety Basis Development', provides control selection criteria based on consequence and risk that may be used in the development of a Criticality Safety Evaluation (CSE) to establish the classification of a component as a design feature, as safety class or safety significant, i.e., an Engineered Safety Feature (ESF), or as equipment important to safety; or merely provides defense-in-depth. Similar logic is applied to the CACs. Criterion 8C of DOE-STD-3007-2007, as written, added to the confusion of using the basic CCR from HNF-7098.
The PFP CCR attempts to clarify this criterion by revising it to say 'Programmatic commitments or general references to control philosophy (e.g., mass control or spacing control or concentration control as an overall control strategy for the process without specific quantification of individual limits) is included in the PFP DSA'. Table 1 shows the PFP methodology for evaluating CACs. This evaluation process has been in use since February of 2008 and has proven to be simple and effective. Each control identified i

  17. Theoretical hot methane line lists up to T = 2000 K for astrophysical applications

    SciTech Connect (OSTI)

    Rey, M.; Tyuterev, Vl. G.; Nikitin, A. V.

    2014-07-01

    The paper describes the construction of complete sets of hot methane lines based on accurate ab initio potential and dipole moment surfaces and extensive first-principles calculations. Four line lists spanning the [0-5000] cm{sup -1} infrared region were built at T = 500, 1000, 1500, and 2000 K. For each of these four temperatures, we have constructed two versions of line lists: a version for high-resolution applications containing strong and medium lines, and a full version appropriate for low-resolution opacity calculations. A comparison with available empirical databases is discussed in detail for both cold and hot bands, giving very good agreement for line positions, typically <0.1-0.5 cm{sup -1}, and ~5% for intensities of strong lines. Together with numerical tests using various basis sets, this confirms the computational convergence of our results for the most important lines, which is the major issue for theoretical spectra predictions. We showed that transitions with lower-state energies up to 14,000 cm{sup -1} could give significant contributions to the methane opacity and have to be systematically taken into account. Our list at 2000 K, calculated up to J = 50, contains 11.5 billion transitions for I > 10{sup -29} cm molecule{sup -1}. These new lists are expected to be quantitatively accurate with respect to the precision of available and currently planned observations of astrophysical objects with improved spectral resolution.
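The reason hot bands matter follows from the Boltzmann factor exp(-c2 E''/T): a lower state at 14,000 cm{sup -1} is essentially unpopulated at room temperature but carries appreciable population at 2000 K, so its transitions contribute to the opacity only in hot spectra. A small sketch of that scaling (the constant and function names are ours):

```python
import math

# Second radiation constant c2 = h*c/k_B in cm*K, used to convert a
# lower-state energy in cm^-1 into a dimensionless Boltzmann exponent.
C2 = 1.4387769

def boltzmann_weight(e_lower_cm1, temperature_k):
    """Relative lower-state population factor exp(-c2 * E'' / T).

    Normalization by the partition function Q(T) is omitted, so only
    ratios of these weights at a fixed temperature are meaningful.
    """
    return math.exp(-C2 * e_lower_cm1 / temperature_k)

# A level at E'' = 14,000 cm^-1: negligible at 296 K, significant at 2000 K.
cold = boltzmann_weight(14000.0, 296.0)   # ~1e-30
hot = boltzmann_weight(14000.0, 2000.0)   # ~1e-5
```

The roughly 25-orders-of-magnitude gap between the two weights is why room-temperature line lists can ignore such levels while a 2000 K list must include them systematically.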

  18. Generalizing the self-healing diffusion Monte Carlo approach to finite temperature: a path for the optimization of low-energy many-body basis expansions

    SciTech Connect (OSTI)

    Kim, Jeongnim; Reboredo, Fernando A

    2014-01-01

    The self-healing diffusion Monte Carlo method for complex functions [F. A. Reboredo, J. Chem. Phys. 136, 204101 (2012)] and some ideas of the correlation function Monte Carlo approach [D. M. Ceperley and B. Bernu, J. Chem. Phys. 89, 6316 (1988)] are blended to obtain a method for the calculation of thermodynamic properties of many-body systems at low temperatures. In order to allow the evolution in imaginary time to describe the density matrix, we remove the fixed-node restriction using complex antisymmetric trial wave functions. A statistical method is derived for the calculation of finite-temperature properties of many-body systems near the ground state. In the process we also obtain a parallel algorithm that optimizes the many-body basis of a small subspace of the many-body Hilbert space. This small subspace is optimized to have maximum overlap with the one spanned by the lower-energy eigenstates of a many-body Hamiltonian. We show in a model system that the Helmholtz free energy is minimized within this subspace as the iteration number increases. We show that the subspace spanned by the small basis systematically converges towards the subspace spanned by the lowest-energy eigenstates. Possible applications of this method to calculate the thermodynamic properties of many-body systems near the ground state are discussed. The resulting basis can also be used to accelerate the calculation of the ground or excited states with quantum Monte Carlo.

  19. Two linear time, low overhead algorithms for graph layout

    Energy Science and Technology Software Center (OSTI)

    2008-01-10

    The software comprises two algorithms designed to perform a 2D layout of a graph structure in time linear with respect to the vertices and edges in the graph, whereas most other layout algorithms have a running time that is quadratic with respect to the number of vertices or greater. Although these layout algorithms run in a fraction of the time of their competitors, they provide competitive results when applied to most real-world graphs. These algorithms also have a low constant running time and small memory footprint, making them useful for small to large graphs.
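As a point of comparison, any layout that visits each vertex and edge exactly once runs in O(V+E); one simple example is a BFS layering that assigns y by depth and x by arrival order within a layer. The sketch below illustrates that linear-time idea and is not the packaged algorithms themselves.

```python
from collections import deque

def layered_layout(adjacency, root=0):
    """O(V+E) layout sketch via a single BFS.

    Each vertex gets coordinates (x, y): y is its BFS depth (layer) and x is
    its arrival order within that layer. Every vertex and edge is examined
    once, so the running time is linear, with no quadratic all-pairs force
    computation as in many force-directed layouts.
    """
    depth = {root: 0}
    counts = {}          # vertices placed so far in each layer
    pos = {}
    queue = deque([root])
    while queue:
        v = queue.popleft()
        d = depth[v]
        x = counts.get(d, 0)
        counts[d] = x + 1
        pos[v] = (x, d)
        for w in adjacency.get(v, []):
            if w not in depth:
                depth[w] = d + 1
                queue.append(w)
    return pos
```

For disconnected graphs one would repeat the BFS from an unvisited root per component, which preserves the linear bound.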

  20. Genetic Algorithm Based Neural Networks for Nonlinear Optimization

    Energy Science and Technology Software Center (OSTI)

    1994-09-28

    This software develops a novel approach to nonlinear optimization using genetic algorithm based neural networks. To the best of our knowledge, this approach represents the first attempt at applying both neural network and genetic algorithm techniques to solve a nonlinear optimization problem. The approach constructs a neural network structure and an appropriately shaped energy surface whose minima correspond to optimal solutions of the problem. A genetic algorithm is employed to perform a parallel and powerful search of the energy surface.
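A generic real-coded genetic algorithm minimizing a one-dimensional function shows the search loop (selection, crossover, mutation) that such an approach builds on. This sketch does not include the neural-network energy-surface construction described in the abstract, and all parameter values are illustrative.

```python
import random

def genetic_minimize(f, lo, hi, pop_size=40, generations=60, seed=1):
    """Minimal real-coded GA: tournament selection of size 2, blend
    crossover, Gaussian mutation, candidates clamped to [lo, hi]."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    sigma = 0.01 * (hi - lo)  # mutation scale, illustrative choice
    for _ in range(generations):
        def pick():
            # Tournament of two: the fitter (lower f) individual wins.
            a, b = rng.sample(pop, 2)
            return a if f(a) < f(b) else b
        children = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            alpha = rng.random()
            child = alpha * p1 + (1 - alpha) * p2  # blend crossover
            child += rng.gauss(0.0, sigma)         # Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = children
    return min(pop, key=f)

# Minimize (x - 2)^2 on [-10, 10]; the minimum is at x = 2.
best = genetic_minimize(lambda x: (x - 2.0) ** 2, -10.0, 10.0)
```

In the abstract's formulation the role of `f` is played by the constructed energy surface of the neural network, so the same search loop applies with a different objective.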

  1. A divide-conquer-recombine algorithmic paradigm for large spatiotemporal quantum molecular dynamics simulations

    SciTech Connect (OSTI)

    Shimojo, Fuyuki; Hattori, Shinnosuke; Kalia, Rajiv K.; Mou, Weiwei; Nakano, Aiichiro; Nomura, Ken-ichi; Rajak, Pankaj; Vashishta, Priya; Kunaseth, Manaschai; Ohmura, Satoshi; Shimamura, Kohei (Collaboratory for Advanced Computing and Simulations, Department of Physics and Astronomy, Department of Computer Science, and Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, California 90089-0242, United States; additional affiliations: Department of Physics, Kumamoto University, Kumamoto 860-8555, Japan; Department of Physics, Kyoto University, Kyoto 606-8502, Japan; Department of Applied Quantum Physics and Nuclear Engineering, Kyushu University, Fukuoka 819-0395, Japan; National Nanotechnology Center, Pathumthani 12120, Thailand)

    2014-05-14

    We introduce an extension of the divide-and-conquer (DC) algorithmic paradigm called divide-conquer-recombine (DCR) to perform large quantum molecular dynamics (QMD) simulations on massively parallel supercomputers, in which interatomic forces are computed quantum mechanically in the framework of density functional theory (DFT). In DCR, the DC phase constructs globally informed, overlapping local-domain solutions, which in the recombine phase are synthesized into a global solution encompassing large spatiotemporal scales. For the DC phase, we design a lean divide-and-conquer (LDC) DFT algorithm, which significantly reduces the prefactor of the O(N) computational cost for N electrons by applying a density-adaptive boundary condition at the peripheries of the DC domains. Our globally scalable and locally efficient solver is based on a hybrid real-reciprocal space approach that combines: (1) a highly scalable real-space multigrid to represent the global charge density; and (2) a numerically efficient plane-wave basis for local electronic wave functions and charge density within each domain. Hybrid space-band decomposition is used to implement the LDC-DFT algorithm on parallel computers. A benchmark test on an IBM Blue Gene/Q computer exhibits an isogranular parallel efficiency of 0.984 on 786,432 cores for a 50.3×10{sup 6}-atom SiC system. As a test of production runs, an LDC-DFT-based QMD simulation involving 16,661 atoms is performed on the Blue Gene/Q to study on-demand production of hydrogen gas from water using LiAl alloy particles. As an example of the recombine phase, LDC-DFT electronic structures are used as a basis set to describe global photoexcitation dynamics with nonadiabatic QMD (NAQMD) and kinetic Monte Carlo (KMC) methods. The NAQMD simulations are based on linear-response time-dependent density functional theory to describe electronic excited states and a surface-hopping approach to describe transitions between the excited states. A series of techniques is employed for efficiently calculating the long-range exact exchange correction and excited-state forces. The NAQMD trajectories are analyzed to extract the rates of various excitonic processes, which are then used in KMC simulations to study the dynamics of the global exciton flow network. This has allowed the study of large-scale photoexcitation dynamics in a 6400-atom amorphous molecular solid, reaching experimental time scales.
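    The recombine phase above feeds NAQMD-derived rates into kinetic Monte Carlo. As an illustration of the KMC machinery (a minimal Gillespie-style sketch, not the authors' code; the excitonic process rates below are hypothetical):

```python
import math
import random

def kmc_step(rates, rng=random.random):
    """One Gillespie-style kinetic Monte Carlo step: choose a process with
    probability proportional to its rate, then draw an exponentially
    distributed waiting time with mean 1/(total rate)."""
    total = sum(rates)
    r = rng() * total
    acc = 0.0
    chosen = len(rates) - 1  # fallback guards against floating-point edge cases
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    # 1 - rng() lies in (0, 1], so the log is always defined
    dt = -math.log(1.0 - rng()) / total
    return chosen, dt

# Hypothetical excitonic process rates (arbitrary units):
# exciton hopping, radiative decay, dissociation at an interface
rates = [5.0, 0.1, 0.5]
random.seed(7)
t, counts = 0.0, [0, 0, 0]
for _ in range(1000):
    i, dt = kmc_step(rates)
    counts[i] += 1
    t += dt
```

    In a real exciton-flow simulation the rate list would depend on the current lattice state and be rebuilt after each event; this sketch only shows the event-selection and time-advance step.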

  2. Theoretical and experimental research on multi-beam klystron

    SciTech Connect (OSTI)

    Ding Yaogen; Peng Jun; Zhu Yunshu; Shi Shaoming [Institute of Electronics, Chinese Academy of Sciences, Beijing 100080 (China)

    1999-05-07

    Theoretical and experimental research on the multi-beam klystron (MBK) conducted at the Institute of Electronics, Chinese Academy of Sciences (IECAS) is described in this paper. Research progress on the interaction between multiple electron beams and the microwave electric field, multi-beam cavities, a filter-loaded double-gap cavity broadband output circuit, multi-beam electron guns, and periodic reversal focusing systems is presented. Performance and measurement results for five types of MBK are also given. The key technical problems for present MBKs are discussed.

  3. Enterprise Assessments Targeted Review of the Safety Basis at the Savannah River Site F-Area Central Laboratory Facility - January 2016

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    January 2016 review of the safety basis for the F-Area Central Laboratory Facility at the Savannah River Site, conducted by the Office of Nuclear Safety and Environmental Assessments within the U.S. Department of Energy.

  4. Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review Plan

    Broader source: Energy.gov [DOE]

    This SRP, Nuclear Safety Basis Program Review, consists of five volumes. It provides information to help strengthen the technical rigor of line management oversight and federal monitoring of DOE nuclear facilities. It provides a primer on the safety basis development and documentation process used by the DOE. It also provides a set of LOIs for the review of safety basis programs and documents of nuclear facilities at various stages of the facility life cycle.

  5. Report to the Secretary of Energy on Beyond Design Basis Event Pilot Evaluations, Results and Recommendations for Improvements

    Broader source: Energy.gov (indexed) [DOE]


  6. Technical Basis and Considerations for DOE M 435.1-1 (Appendix A)

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1999-07-09

    This appendix establishes the technical basis of the order revision process and of each of the requirements included in the revised radioactive waste management order.

  7. Improving Department of Energy Capabilities for Mitigating Beyond Design Basis Events

    Broader source: Energy.gov [DOE]

    This is a level 1 operating experience document providing direction for Improving Department of Energy Capabilities for Mitigating Beyond Design Basis Events. [OE-1: 2013-01]

  8. Structural basis for the prion-like MAVS filaments in antiviral...

    Office of Scientific and Technical Information (OSTI)

    Title: Structural basis for the prion-like MAVS filaments in antiviral innate immunity. Authors: Xu, Hui; He, ...

  9. 2010 DOE National Science Bowl® Photos - Basis Charter School...

    Office of Science (SC) Website


  10. CRAD, Review of Safety Basis Development - May 6, 2013

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    May 6, 2013. Review of Safety Basis Development for the Los Alamos National Laboratory Transuranic Waste Facility (HSS CRAD 45-59, Rev. 0). The review will consider selected aspects of the development of the safety basis for the Transuranic Waste Facility (TWF) to assess the extent to which safety is integrated into the design of the TWF in accordance with DOE directives.

  11. Sandia Energy - Genetic Algorithm for Innovative Device Designs...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Genetic Algorithm for Innovative Device Designs in High-Efficiency III-V Nitride Light-Emitting Diodes

  12. Gacs quantum algorithmic entropy in infinite dimensional Hilbert spaces

    SciTech Connect (OSTI)

    Benatti, Fabio; Oskouei, Samad Khabbazi; Shafiei Deh Abad, Ahmad

    2014-08-15

    We extend the notion of Gacs quantum algorithmic entropy, originally formulated for finitely many qubits, to infinite dimensional quantum spin chains and investigate the relation of this extension with two quantum dynamical entropies that have been proposed in recent years.

  13. THE LEVENBERG-MARQUARDT ALGORITHM: IMPLEMENTATION AND THEORY

    Office of Scientific and Technical Information (OSTI)

    ... Quart. Appl. Math. 2, 164-168. 8. Marquardt, D. W. 1963. An algorithm for least squares estimation of nonlinear parameters, SIAM J. Appl. Math. 11, 431-441. 9. Osborne, M. R. ...

  14. Theoretical Minimum Energies to Produce Steel for Selected Conditions

    SciTech Connect (OSTI)

    Fruehan, R.J.; Fortini, O.; Paxton, H.W.; Brindle, R.

    2000-05-01

    The energy used to produce liquid steel in today's integrated and electric arc furnace (EAF) facilities is significantly higher than the theoretical minimum energy requirements. This study presents the absolute minimum energy required to produce steel from ore and from mixtures of scrap and scrap alternatives. Additional cases in which the assumptions are changed to more closely approximate actual operating conditions are also analyzed. The results, summarized in Table E-1, should give insight into the theoretical and practical potentials for reducing steelmaking energy requirements. The energy values have also been converted to carbon dioxide (CO{sub 2}) emissions in order to indicate the potential for reduction in emissions of this greenhouse gas (Table E-2). The study showed that increasing scrap melting has the largest impact on energy consumption. However, scrap should be viewed as having ''invested'' energy, since at one time it was produced by reducing ore. Increasing scrap melting in the BOF may or may not decrease energy if the ''invested'' energy in scrap is considered.

  15. The magnetic flywheel flow meter: Theoretical and experimental contributions

    SciTech Connect (OSTI)

    Buchenau, D.; Galindo, V.; Eckert, S.

    2014-06-02

    The development of contactless flow meters is an important issue for monitoring and controlling processes in various application fields, such as metallurgy, liquid metal casting, and cooling systems for nuclear reactors and transmutation machines. Shercliff, in his book “The Theory of Electromagnetic Flow Measurement” (Cambridge University Press, 1962), described a simple and robust device for contactless measurement of liquid metal flow rates known as the magnetic flywheel. The sensor consists of several permanent magnets attached to a rotatable soft iron plate. This arrangement is placed close to the liquid metal flow to be measured, so that the field of the permanent magnets penetrates into the fluid volume. The flywheel is accelerated by the Lorentz force arising from the interaction between the magnetic field and the moving liquid. The steady rotation rate of the flywheel can be taken as a measure of the mean flow rate inside the fluid channel. The present paper provides a detailed theoretical description of the sensor in order to gain better insight into the functional principle of the magnetic flywheel. Theoretical predictions are confirmed by corresponding laboratory experiments. For that purpose, a laboratory model of such a flow meter was built and tested on a GaInSn loop under various test conditions.

  16. An Experimental and Theoretical High Energy Physics Program

    SciTech Connect (OSTI)

    Shipsey, Ian

    2012-07-31

    The Purdue High Energy Physics Group conducts research in experimental and theoretical elementary particle physics and experimental high energy astrophysics. Our goals, which we share with high energy physics colleagues around the world, are to understand at the most fundamental level the nature of matter, energy, space and time, in order to explain the birth, evolution and fate of the Universe. The experiments in which we are currently involved are: CDF, CLEO-c, CMS, LSST, and VERITAS. We have been instrumental in establishing two major in-house facilities: the Purdue Particle Physics Microstructure Detector Facility (P3MD) in 1995 and the CMS Tier-2 center in 2005. The research efforts of the theory group span phenomenological and theoretical aspects of the Standard Model as well as many of its possible extensions. Recent work includes phenomenological consequences of supersymmetric models, string theory and applications of gauge/gravity duality, the cosmological implications of massive gravitons, and the physics of extra dimensions.

  17. Generation of Simulated Wind Data using an Intelligent Algorithm

    Office of Scientific and Technical Information (OSTI)

    Conference: Generation of Simulated Wind Data using an Intelligent Algorithm. Authors: Weissbach, R.; Wang, W. L.; Hodge, B. M.; Tang, M. H.; Sonnenmeier, J. Publication Date: 2014-01-01. OSTI Identifier: 1176733. DOE Contract Number: AC36-08GO28308. Resource Relation: Proceedings of the 2014 North American Power

  18. Library Event Matching event classification algorithm for electron neutrino interactions in the NOνA detectors

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Library Event Matching event classification algorithm for electron neutrino interactions in the NOνA detectors. Authors: Backhouse, C.; Patterson, R. B. Publication Date: 2015-04-01. OSTI Identifier: 1245665

  19. New Algorithm Enables Faster Simulations of Ultrafast Processes

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    New Algorithm Enables Faster Simulations of Ultrafast Processes: Opens the Door for Real-Time Simulations in Atomic-Level Materials Research. February 20, 2015. Contact: Rachel Berkowitz, 510-486-7254, rberkowitz@lbl.gov. Figure: model of an ion (Cl) collision with an atomically thin semiconductor (MoSe2); the collision region is shown in blue and zoomed in, and red points show the initial positions of Cl.

  20. Visualizing and improving the robustness of phase retrieval algorithms

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Tripathi, Ashish; Leyffer, Sven; Munson, Todd; Wild, Stefan M.

    2015-06-01

    Coherent x-ray diffractive imaging is a novel imaging technique that utilizes phase retrieval and nonlinear optimization methods to image matter at nanometer scales. We explore the convergence properties of a popular phase retrieval algorithm, Fienup's HIO, by introducing a reduced-dimensionality problem that allows us to visualize and quantify convergence to local minima and to the globally optimal solution. We then introduce generalizations of HIO that improve upon the original algorithm's ability to converge to the globally optimal solution.
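    Fienup's HIO alternates between enforcing the measured Fourier magnitudes and applying a support constraint with feedback. A minimal 1D sketch of one iteration follows (a toy illustration, not the paper's implementation; 1D phase retrieval is ambiguous in general, so the example only runs the iteration without asserting recovery):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (adequate for a tiny toy signal)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def hio_step(g, mags, support, beta=0.9):
    """One hybrid input-output (HIO) iteration: impose the measured Fourier
    magnitudes while keeping the current phases, then keep the Fourier-domain
    estimate inside the support and apply HIO feedback outside it."""
    G = dft(g)
    Gp = [m * Gk / abs(Gk) if abs(Gk) > 1e-12 else complex(m)
          for m, Gk in zip(mags, G)]
    gp = [z.real for z in idft(Gp)]
    return [gp[n] if support[n] else g[n] - beta * gp[n] for n in range(len(g))]

# Toy problem: magnitudes measured from a signal supported on the first two samples
true = [1.0, 2.0, 0.0, 0.0]
mags = [abs(z) for z in dft(true)]
support = [True, True, False, False]
g = [0.5, 0.5, 0.5, 0.5]  # arbitrary starting guess
for _ in range(200):
    g = hio_step(g, mags, support)
```

    The paper's generalizations modify the feedback outside the support; this sketch shows only the baseline update rule.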

  1. DEVELOPMENT OF METHOD AND ALGORITHMS TO IDENTIFY EASILY IMPLEMENTABLE ENERGY-EFFICIENT LOW-COST MULTICOMPONENT DISTILLATION COLUMN TRAINS WITH LARGE ENERGY SAVINGS FOR WIDE NUMBER OF SEPARATIONS

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DEVELOPMENT OF METHOD AND ALGORITHMS TO IDENTIFY EASILY IMPLEMENTABLE ENERGY-EFFICIENT LOW-COST MULTICOMPONENT DISTILLATION COLUMN TRAINS WITH LARGE ENERGY SAVINGS FOR WIDE NUMBER OF SEPARATIONS | Department of Energy

  2. The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation

    Office of Scientific and Technical Information (OSTI)

    Journal Article: The differential algebra based multiple level fast multipole algorithm for 3D space charge field calculation and photoemission simulation

  3. The theoretical study of passive and active optical devices via planewave based transfer (scattering) matrix method and other approaches

    SciTech Connect (OSTI)

    Zhuo, Ye

    2011-05-15

    In this thesis, we theoretically study electromagnetic wave propagation in several passive and active optical components and devices, including 2-D photonic crystals, straight and curved waveguides, and organic light emitting diodes (OLEDs). Several optical designs are also presented, such as organic photovoltaic (OPV) cells and solar concentrators. The first part of the thesis focuses on theoretical investigation. First, the plane-wave-based transfer (scattering) matrix method (TMM) is briefly described, with a short review of photonic crystals and of other numerical methods used to study them (Chapters 1 and 2). Next, the TMM itself is investigated in detail and further developed to deal with more complex optical systems. In Chapter 3, the TMM is extended to curvilinear coordinates to study curved nanoribbon waveguides. The problem of a curved structure is transformed into an equivalent one of a straight structure with spatially dependent tensors of dielectric constant and magnetic permeability. In Chapter 4, a new set of localized basis orbitals is introduced to locally represent the electromagnetic field in photonic crystals as an alternative to the plane-wave basis. The second part of the thesis focuses on the design of optical devices. First, two examples of TMM applications are given. The first example is the design of metal grating structures as replacements for ITO to enhance the optical absorption in OPV cells (Chapter 6). The second is the design of the same structure to enhance the light extraction of OLEDs (Chapter 7). Next, two design examples using the ray tracing method are given: applying a microlens array to enhance the light extraction of OLEDs (Chapter 5) and an all-angle, wide-wavelength design of a solar concentrator (Chapter 8). In summary, this dissertation has extended the TMM, making it capable of treating complex optical systems. Several optical designs by TMM and the ray tracing method are also given to complement this work.

  4. Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas; Jiang, Wei

    2008-03-31

    This document provides the algorithms for CHP system performance monitoring and commissioning verification (CxV). It starts by presenting system-level and component-level performance metrics, followed by descriptions of algorithms for performance monitoring and commissioning verification using the metrics presented earlier. Verification of commissioning is accomplished essentially by comparing actual measured performance to benchmarks for performance provided by the system integrator and/or component manufacturers. The results of these comparisons are then automatically interpreted to provide conclusions regarding whether the CHP system and its components have been properly commissioned and, where problems are found, guidance is provided for corrections. A discussion of uncertainty handling is then provided, followed by a description of how simulation models can be used to generate data for testing the algorithms. A model is described for simulating a CHP system consisting of a micro-turbine, an exhaust-gas heat recovery unit that produces hot water, an absorption chiller, and a cooling tower. The process for using this model to generate data for testing the algorithms on a selected set of faults is described. The next section applies the algorithms to CHP laboratory and field data to illustrate their use. The report concludes with a discussion of the need for laboratory testing of the algorithms on physical CHP systems and identification of the recommended next steps.
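    The benchmark-comparison logic at the heart of CxV can be sketched in a few lines. The metric names, values, and tolerance below are hypothetical, not taken from the report:

```python
def verify_commissioning(measured, benchmark, tolerance=0.05):
    """Flag components whose measured performance metric falls more than
    `tolerance` (as a fraction) below its benchmark value."""
    findings = {}
    for name, bench in benchmark.items():
        meas = measured.get(name)
        if meas is None:
            findings[name] = "no measurement"
        elif meas < bench * (1.0 - tolerance):
            findings[name] = f"below benchmark: {meas:.3f} < {bench:.3f}"
        else:
            findings[name] = "ok"
    return findings

# Hypothetical benchmarks from a system integrator, and measured values
benchmark = {"turbine_efficiency": 0.28, "heat_recovery_effectiveness": 0.80}
measured = {"turbine_efficiency": 0.24, "heat_recovery_effectiveness": 0.81}
result = verify_commissioning(measured, benchmark)
```

    A production CxV implementation would also propagate measurement uncertainty into the comparison, as the report's uncertainty-handling discussion describes.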

  5. Incremental k-core decomposition: Algorithms and evaluation

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Sariyuce, Ahmet Erdem; Gedik, Bugra; Jacques-SIlva, Gabriela; Wu, Kun -Lung; Catalyurek, Umit V.

    2016-02-01

    A k-core of a graph is a maximal connected subgraph in which every vertex is connected to at least k vertices in the subgraph. k-core decomposition is often used in large-scale network analysis, such as community detection, protein function prediction, visualization, and solving NP-hard problems on real networks efficiently, like maximal clique finding. In many real-world applications, networks change over time, so it is essential to develop efficient incremental algorithms for dynamic graph data. In this paper, we propose a suite of incremental k-core decomposition algorithms for dynamic graph data. These algorithms locate a small subgraph that is guaranteed to contain the list of vertices whose maximum k-core values have changed and efficiently process this subgraph to update the k-core decomposition. We present incremental algorithms for both insertion and deletion operations, and propose auxiliary vertex state maintenance techniques that can further accelerate these operations. Our results show a significant reduction in runtime compared to non-incremental alternatives. We illustrate the efficiency of our algorithms on different types of real and synthetic graphs at varying scales. Furthermore, for a graph of 16 million vertices, we observe relative throughputs reaching a million times that of the non-incremental algorithms.
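    For reference, the non-incremental baseline that these incremental algorithms improve on is the classic peeling algorithm; a minimal sketch (not the authors' implementation):

```python
from collections import defaultdict

def core_numbers(edges):
    """Classic peeling algorithm for k-core decomposition: repeatedly remove a
    minimum-degree vertex; the running maximum of removal degrees gives each
    removed vertex's core number."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    remaining = set(adj)
    core, k = {}, 0
    while remaining:
        v = min(remaining, key=lambda u: degree[u])
        k = max(k, degree[v])
        core[v] = k
        remaining.remove(v)
        for w in adj[v]:
            if w in remaining:
                degree[w] -= 1
    return core

# Triangle {1, 2, 3} plus a pendant vertex 4 attached to 3
cores = core_numbers([(1, 2), (2, 3), (1, 3), (3, 4)])
# cores[4] == 1 (pendant); cores[1] == cores[2] == cores[3] == 2 (triangle)
```

    The incremental algorithms in the paper avoid rerunning this whole peeling on every edge insertion or deletion by updating only a small affected subgraph.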

  6. Impacts of Time Delays on Distributed Algorithms for Economic Dispatch

    SciTech Connect (OSTI)

    Yang, Tao; Wu, Di; Sun, Yannan; Lian, Jianming

    2015-07-26

    The economic dispatch problem (EDP) is an important problem in power systems. It can be formulated as an optimization problem with the objective of minimizing the total generation cost subject to the power balance constraint and generator capacity limits. Recently, several consensus-based algorithms have been proposed to solve the EDP in a distributed manner. However, the impacts of communication time delays on these distributed algorithms are not fully understood, especially for the case where the communication network is directed, i.e., the information exchange is unidirectional. This paper investigates communication time delay effects on a distributed algorithm for directed communication networks. The algorithm has been tested by applying time delays to different types of information exchange. Several case studies are carried out to evaluate the effectiveness and performance of the algorithm in the presence of time delays in communication networks. It is found that time delays negatively affect the convergence rate, and can even cause the algorithm to converge to an incorrect value or fail to converge at all.
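    The underlying optimization can be illustrated with a centralized equal-incremental-cost (lambda-iteration) sketch; the paper's consensus-based algorithms solve the same problem in a distributed way, which is not reproduced here. The generator cost coefficients and limits below are hypothetical:

```python
def dispatch(gens, demand, tol=1e-9):
    """Equal-incremental-cost solution of the economic dispatch problem:
    bisect on the marginal cost lambda until total output meets demand.
    Each generator is (a, b, pmin, pmax) with cost a*p**2 + b*p on [pmin, pmax]."""
    def output(lam):
        total = 0.0
        for a, b, pmin, pmax in gens:
            p = (lam - b) / (2.0 * a)          # unconstrained optimum at this lambda
            total += min(max(p, pmin), pmax)   # clip to capacity limits
        return total

    lo, hi = 0.0, 1000.0  # bracket for lambda; output(lam) is nondecreasing
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if output(mid) < demand:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return [min(max((lam - b) / (2.0 * a), pmin), pmax) for a, b, pmin, pmax in gens]

# Two hypothetical units: (a, b, pmin, pmax)
gens = [(0.01, 2.0, 0.0, 100.0), (0.02, 1.5, 0.0, 100.0)]
p = dispatch(gens, demand=120.0)
```

    In the distributed setting studied in the paper, each generator runs a local update and agrees on lambda through consensus over the communication network, which is exactly where time delays enter.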

  7. Visual Empirical Region of Influence (VERI) Pattern Recognition Algorithms

    SciTech Connect (OSTI)

    2002-05-01

    We developed new pattern recognition (PR) algorithms based on a human visual perception model. We named these algorithms Visual Empirical Region of Influence (VERI) algorithms. To compare the new algorithms' effectiveness against other PR algorithms, we benchmarked their clustering capabilities with a standard set of two-dimensional data that is well known in the PR community. The VERI algorithm succeeded in clustering all the data correctly. No existing algorithm had previously clustered all the patterns in the data set successfully. The commands to execute the VERI algorithms are quite difficult to master when executed from a DOS command line, and the algorithm requires several parameters to operate correctly. From our own experience we realized that if we wanted to provide a new data analysis tool to the PR community, we would have to make the tool powerful, yet easy and intuitive to use. That was our motivation for developing graphical user interfaces (GUIs) to the VERI algorithms. We developed GUIs to control the VERI algorithm in a single-pass mode and in an optimization mode. We also developed a visualization technique that allows users to graphically animate and visually inspect multi-dimensional data after it has been classified by the VERI algorithms. The visualization package is integrated into the single-pass interface. Both the single-pass interface and the optimization interface are part of the PR software package we have developed and make available to other users. The single-pass mode only finds PR results for the sets of features in the data set that are manually requested by the user. 
The optimization mode uses a brute-force method of searching through the combinations of features in a data set for the features that produce the best pattern recognition results. With a small number of features in a data set, an exact solution can be determined. However, the number of possible combinations increases exponentially with the number of features, and an alternate means of finding a solution must be found. We developed and implemented a technique for finding solutions in data sets with both small and large numbers of features. The VERI interface tools were written using the Tcl/Tk GUI programming language, version 8.1. Although Tcl/Tk packages are designed to run on multiple computer platforms, we have concentrated our efforts on developing a user interface for the ubiquitous DOS environment. The VERI algorithms are compiled, executable programs. The interfaces run the VERI algorithms in Leave-One-Out mode using the Euclidean metric.
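    The Leave-One-Out evaluation with the Euclidean metric mentioned above can be sketched as follows; a nearest-neighbor classifier stands in for the VERI region-of-influence test, which is not reproduced here, and the points and labels are hypothetical:

```python
import math

def loo_nearest_neighbor_accuracy(points, labels):
    """Leave-one-out evaluation with the Euclidean metric: classify each point
    by its nearest neighbor among the remaining points, and report the fraction
    classified with the correct label."""
    correct = 0
    for i, p in enumerate(points):
        best_j = min((j for j in range(len(points)) if j != i),
                     key=lambda j: math.dist(p, points[j]))
        if labels[best_j] == labels[i]:
            correct += 1
    return correct / len(points)

# Two well-separated hypothetical clusters
points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
labels = ["a", "a", "b", "b"]
acc = loo_nearest_neighbor_accuracy(points, labels)  # every point's neighbor shares its label
```

    The optimization mode described above would wrap a loop like this around each candidate feature subset and keep the subset with the best score.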

  8. Visual Empirical Region of Influence (VERI) Pattern Recognition Algorithms

    Energy Science and Technology Software Center (OSTI)

    2002-05-01

    We developed new pattern recognition (PR) algorithms based on a human visual perception model. We named these algorithms Visual Empirical Region of Influence (VERI) algorithms. To compare the new algorithms' effectiveness against other PR algorithms, we benchmarked their clustering capabilities with a standard set of two-dimensional data that is well known in the PR community. The VERI algorithm succeeded in clustering all the data correctly. No existing algorithm had previously clustered all the patterns in the data set successfully. The commands to execute the VERI algorithms are quite difficult to master when executed from a DOS command line, and the algorithm requires several parameters to operate correctly. From our own experience we realized that if we wanted to provide a new data analysis tool to the PR community, we would have to make the tool powerful, yet easy and intuitive to use. That was our motivation for developing graphical user interfaces (GUIs) to the VERI algorithms. We developed GUIs to control the VERI algorithm in a single-pass mode and in an optimization mode. We also developed a visualization technique that allows users to graphically animate and visually inspect multi-dimensional data after it has been classified by the VERI algorithms. The visualization package is integrated into the single-pass interface. Both the single-pass interface and the optimization interface are part of the PR software package we have developed and make available to other users. The single-pass mode only finds PR results for the sets of features in the data set that are manually requested by the user. 
The optimization mode uses a brute-force method of searching through the combinations of features in a data set for the features that produce the best pattern recognition results. With a small number of features in a data set, an exact solution can be determined. However, the number of possible combinations increases exponentially with the number of features, and an alternate means of finding a solution must be found. We developed and implemented a technique for finding solutions in data sets with both small and large numbers of features. The VERI interface tools were written using the Tcl/Tk GUI programming language, version 8.1. Although Tcl/Tk packages are designed to run on multiple computer platforms, we have concentrated our efforts on developing a user interface for the ubiquitous DOS environment. The VERI algorithms are compiled, executable programs. The interfaces run the VERI algorithms in Leave-One-Out mode using the Euclidean metric.

  9. Criteria Document for B-plant Surveillance and Maintenance Phase Safety Basis Document

    SciTech Connect (OSTI)

    SCHWEHR, B.A.

    1999-08-31

    This document is required by the Project Hanford Managing Contractor (PHMC) procedure, HNF-PRO-705, Safety Basis Planning, Documentation, Review, and Approval. This document specifies the criteria that shall be in the B Plant surveillance and maintenance phase safety basis in order to obtain approval of the DOE-RL. This CD describes the criteria to be addressed in the S&M Phase safety basis for the deactivated Waste Fractionization Facility (B Plant) on the Hanford Site in Washington state. This criteria document describes: the document type and format that will be used for the S&M Phase safety basis, the requirements documents that will be invoked for the document development, the deactivated condition of the B Plant facility, and the scope of issues to be addressed in the S&M Phase safety basis document.

  10. Structural and Electronic Properties of Isolated Nanodiamonds: A Theoretical Perspective

    SciTech Connect (OSTI)

    Raty, J; Galli, G

    2004-09-09

    Nanometer-sized diamond has been found in meteorites, proto-planetary nebulae and interstellar dusts, as well as in detonation residues and in diamond films. Remarkably, the size distribution of diamond nanoparticles appears to be peaked around 2-5 nm, and to be largely independent of preparation conditions. Using ab-initio calculations, we have shown that in this size range nanodiamond has a fullerene-like surface and, unlike silicon and germanium, exhibits very weak quantum confinement effects. We called these carbon nanoparticles bucky-diamonds: their atomic structure, predicted by simulations, is consistent with many experimental findings. In addition, we carried out calculations of the stability of nanodiamond which provided a unifying explanation of its size distribution in extra-terrestrial samples and in ultrananocrystalline diamond films. Here we present a summary of our theoretical results and briefly outline work in progress on doping of nanodiamond with nitrogen.

  11. Experimental And Theoretical High Energy Physics Research At UCLA

    SciTech Connect (OSTI)

    Cousins, Robert D.

    2013-07-22

    This is the final report of the UCLA High Energy Physics DOE Grant No. DE-FG02-91ER40662. This report covers the last grant project period, namely the three years beginning January 15, 2010, plus extensions through April 30, 2013. The report describes the broad range of our experimental research, spanning direct dark matter detection searches using both liquid xenon (XENON) and liquid argon (DARKSIDE); present (ICARUS) and R&D for future (LBNE) neutrino physics; ultra-high-energy neutrino and cosmic ray detection (ANITA); and the highest-energy accelerator-based physics with the CMS experiment and CERN's Large Hadron Collider. For our theory group, the report describes frontier activities including particle astrophysics and cosmology; neutrino physics; LHC interaction cross section calculations now feasible due to breakthroughs in theoretical techniques; and advances in the formal theory of supergravity.

  12. Theoretical collapse pressures for two pressurized torispherical heads

    SciTech Connect (OSTI)

    Kalnins, A.; Updike, D.P.; Rana, M.D.

    1995-12-01

    In order to determine the pressures at which real torispherical heads fail upon a single application of pressure, two heads were pressurized in recent Praxair tests, and displacements and strains were recorded at various locations. In this paper, theoretical results for the two test heads are presented in the form of curves of pressure versus crown deflection, using the available geometry and material parameters. From these curves, limit and collapse pressures are calculated, using procedures permitted by the ASME B&PV Code, Section VIII, Div. 2. These pressures are shown to vary widely, depending on the method and model used to calculate them. The effect of the absence of stress relief on the behavior of the Praxair test heads is also evaluated and found to be of no significance for either the objectives of the tests or the objectives of this paper. The results of this paper are submitted as an enhancement to the experimental results recorded during the Praxair tests.

  13. Modeling an Application's Theoretical Minimum and Average Transactional Response Times

    SciTech Connect (OSTI)

    Paiz, Mary Rose

    2015-04-01

    The theoretical minimum transactional response time of an application serves as a basis for the expected response time. The lower threshold for the minimum response time represents the minimum amount of time that the application should take to complete a transaction. Knowing the lower threshold is beneficial in detecting anomalies that are results of unsuccessful transactions. Conversely, when an application's response time falls above an upper threshold, there is likely an anomaly in the application that is causing unusual performance issues in the transaction. This report explains how the non-stationary Generalized Extreme Value distribution is used to estimate the lower threshold of an application's daily minimum transactional response time. It also explains how the seasonal Autoregressive Integrated Moving Average time series model is used to estimate the upper threshold for an application's average transactional response time.
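    The lower-threshold idea above can be sketched numerically. The report fits a non-stationary GEV distribution to daily minima; as a simplified, stationary stand-in, the sketch below takes a low empirical quantile of the observed block minima. The function name, quantile level, and synthetic timing data are all illustrative assumptions, not the report's method or values.

```python
import numpy as np

def lower_threshold(daily_minima, q=0.01):
    # Simplified stand-in for the report's non-stationary GEV fit:
    # take a low empirical quantile of the observed daily minima.
    # Transactions completing faster than this are flagged as
    # likely-unsuccessful.
    return float(np.quantile(daily_minima, q))

rng = np.random.default_rng(0)
# Hypothetical daily minimum response times (ms) for 90 days.
minima = 50 + rng.gamma(shape=2.0, scale=5.0, size=90)
thr = lower_threshold(minima)
```

    A production version would model trend and seasonality in the GEV location/scale parameters rather than assuming a stationary sample.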

  14. Theoretical analysis of sound transmission loss through graphene sheets

    SciTech Connect (OSTI)

    Natsuki, Toshiaki; Ni, Qing-Qing

    2014-11-17

    We examine the potential of using graphene sheets (GSs) as sound-insulating materials for nano-devices because of their small size and superior electronic and mechanical properties. In this study, a theoretical analysis is proposed to predict the sound transmission loss through multi-layered GSs, which are formed by stacks of GSs bound together by van der Waals (vdW) forces between individual layers. The results show that resonant frequencies of the sound transmission loss occur in the multi-layered GSs and that their values are very high. Based on the present analytical solution, we predict the acoustic insulation property for various numbers of layers under both a normal incident wave and an acoustic field from a random-incidence source. The scheme could be useful in vibration-absorption applications of nano-devices and materials.

  15. A theoretical analysis of rotating cavitation in inducers

    SciTech Connect (OSTI)

    Tsujimoto, Y.; Kamijo, K. (National Aerospace Lab., Miyagi, (Japan)); Yoshida, Y. (Osaka Univ., Toyonaka, (Japan). Engineering Science)

    1993-03-01

    Rotating cavitation was analyzed using an actuator disk method. Quasi-steady pressure performance of the impeller, mass flow gain factor, and cavitation compliance of the cavity were taken into account. Three types of destabilizing modes were predicted: rotating cavitation propagating faster than the rotational speed of the impeller, rotating cavitation propagating in the direction opposite that of the impeller, and rotating stall propagating slower than the rotational speed of the impeller. It was shown that both types of rotating cavitation were caused by the positive mass flow gain factor, while the rotating stall was caused by the positive slope of the pressure performance. Stability and propagation velocity maps are presented for the two types of rotating cavitation in the mass flow gain factor-cavitation compliance plane. The correlation between theoretical results and experimental observations is discussed.

  16. Theoretic base of Edge Local Mode triggering by vertical displacements

    SciTech Connect (OSTI)

    Wang, Z. T.; He, Z. X.; Wang, Z. H.; Wu, N.; Tang, C. J.

    2015-05-15

    Vertical instability is studied with R-dependent displacement. For Solovev's configuration, the stability boundary of the vertical instability is calculated. The pressure gradient is a destabilizing factor, which is contrary to Rebhan's result. The equilibrium parallel current density, j{sub //}, at the plasma boundary is a drive of the vertical instability similar to peeling-ballooning modes; however, the vertical instability cannot be stabilized by the magnetic shear, which tends towards infinity near the separatrix. The induced current observed in the Edge Local Mode (ELM) triggering experiment by vertical modulation is derived. The theory provides a theoretical explanation for the mitigation of type-I ELMs on ASDEX Upgrade. The principle could also be used for ITER.

  17. Theoretical Research in Cosmology, High-Energy Physics and String Theory

    SciTech Connect (OSTI)

    Ng, Y Jack; Dolan, Louise; Mersini-Houghton, Laura; Frampton, Paul

    2013-07-29

    The research was in the area of Theoretical Physics: Cosmology, High-Energy Physics, and String Theory.

  18. 2D/3D registration algorithm for lung brachytherapy

    SciTech Connect (OSTI)

    Zvonarev, P. S.; Farrell, T. J.; Hunter, R.; Wierzbicki, M.; Hayward, J. E.; Sur, R. K.

    2013-02-15

    Purpose: A 2D/3D registration algorithm is proposed for registering orthogonal x-ray images with a diagnostic CT volume for high dose rate (HDR) lung brachytherapy. Methods: The algorithm utilizes a rigid registration model based on a pixel/voxel intensity matching approach. To achieve accurate registration, a robust similarity measure combining normalized mutual information, image gradient, and intensity difference was developed. The algorithm was validated using a simple body phantom and an anthropomorphic phantom. Transfer catheters were placed inside the phantoms to simulate the unique image features observed during treatment. The algorithm's sensitivity to various degrees of initial misregistration and to the presence of foreign objects, such as ECG leads, was evaluated. Results: The mean registration error was 2.2 and 1.9 mm for the simple body and anthropomorphic phantoms, respectively. The error was comparable to the interoperator catheter digitization error of 1.6 mm. Preliminary analysis of data acquired from four patients indicated a mean registration error of 4.2 mm. Conclusions: Results obtained using the proposed algorithm are clinically acceptable, especially considering the complications normally encountered when imaging during lung HDR brachytherapy.
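    The three-term similarity measure named in the abstract can be sketched as a weighted sum of normalized mutual information, a gradient-agreement term, and a (negated) intensity difference. The weights, function names, and synthetic images below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    # Joint histogram -> entropies; NMI = (H(A) + H(B)) / H(A, B).
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    def entropy(q):
        q = q[q > 0]
        return -np.sum(q * np.log(q))
    return (entropy(px) + entropy(py)) / entropy(p.ravel())

def combined_similarity(a, b, w=(0.5, 0.3, 0.2)):
    # Hypothetical weighted combination of the three cues from the
    # abstract; the weights w are ours, not the paper's.
    a, b = a.astype(float), b.astype(float)
    nmi = normalized_mutual_information(a, b)
    ga, gb = np.gradient(a), np.gradient(b)
    num = sum(np.sum(x * y) for x, y in zip(ga, gb))
    den = np.sqrt(sum(np.sum(x * x) for x in ga) *
                  sum(np.sum(y * y) for y in gb)) + 1e-12
    grad = num / den                      # gradient agreement in [-1, 1]
    diff = -np.mean((a - b) ** 2)         # negated intensity difference
    return w[0] * nmi + w[1] * grad + w[2] * diff

rng = np.random.default_rng(1)
img = rng.random((64, 64))
noisy = img + 0.5 * rng.random((64, 64))
s_self = combined_similarity(img, img)    # should score highest
s_other = combined_similarity(img, noisy)
```

    In an actual 2D/3D registration loop, this score would be maximized over rigid-transform parameters applied to digitally reconstructed radiographs of the CT volume.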

  19. Mesh Algorithms for PDE with Sieve I: Mesh Distribution

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Knepley, Matthew G.; Karpeev, Dmitry A.

    2009-01-01

    We have developed a new programming framework, called Sieve, to support parallel numerical partial differential equation(s) (PDE) algorithms operating over distributed meshes. We have also developed a reference implementation of Sieve in C++ as a library of generic algorithms operating on distributed containers conforming to the Sieve interface. Sieve makes instances of the incidence relation, or arrows, the conceptual first-class objects represented in the containers. Further, generic algorithms acting on this arrow container are systematically used to provide natural geometric operations on the topology and also, through duality, on the data. Finally, coverings and duality are used to encode not only individual meshes, but all types of hierarchies underlying PDE data structures, including multigrid and mesh partitions. In order to demonstrate the usefulness of the framework, we show how the mesh partition data can be represented and manipulated using the same fundamental mechanisms used to represent meshes. We present the complete description of an algorithm to encode a mesh partition and then distribute a mesh, which is independent of the mesh dimension, element shape, or embedding. Moreover, data associated with the mesh can be similarly distributed with exactly the same algorithm. The use of a high level of abstraction within the Sieve leads to several benefits in terms of code reuse, simplicity, and extensibility. We discuss these benefits and compare our approach to other existing mesh libraries.
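    The "arrows as first-class objects" idea can be sketched with a toy incidence relation. The operation names (cone, support) follow the Sieve papers' vocabulary, but the data layout and the triangle example below are an illustrative sketch, not the C++ library's implementation.

```python
# A toy incidence relation in the spirit of Sieve: each arrow (p, q)
# records that point p is in the cone of q, e.g. an edge covered by a
# cell, or a vertex covered by an edge.
class ToySieve:
    def __init__(self):
        self.arrows = set()

    def add_arrow(self, p, q):
        self.arrows.add((p, q))

    def cone(self, q):
        # Points covering q from below (e.g. the edges of a cell).
        return {p for (p, r) in self.arrows if r == q}

    def support(self, p):
        # The dual operation: points whose cone contains p.
        return {q for (r, q) in self.arrows if r == p}

# A triangle "cell" with three edges, each incident to two vertices.
s = ToySieve()
for e in ("e0", "e1", "e2"):
    s.add_arrow(e, "cell")
s.add_arrow("v0", "e0"); s.add_arrow("v1", "e0")
s.add_arrow("v1", "e1"); s.add_arrow("v2", "e1")
s.add_arrow("v2", "e2"); s.add_arrow("v0", "e2")
```

    Because meshes and mesh partitions are both just arrow sets, the same cone/support traversals serve for topology queries and for distribution, which is the reuse the abstract emphasizes.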

  20. Safety basis academy summary of project implementation from 2007-2009

    SciTech Connect (OSTI)

    Johnston, Julie A

    2009-01-01

    During fiscal years 2007 through 2009, in accordance with Performance Based Incentives with the DOE/NNSA Los Alamos Site Office, Los Alamos National Security (LANS) implemented and operated a Safety Basis Academy (SBA) to facilitate uniformity in the technical qualifications of safety basis professionals across the nuclear weapons complex. The implementation phase of the Safety Basis Academy required the development, delivery, and finalization of a set of 23 courses. The courses developed are capable of supporting qualification efforts for both federal and contractor personnel throughout the DOE/NNSA Complex. The LANS Associate Director for Nuclear and High Hazard Operations (AD-NHHO) delegated project responsibility to the Safety Basis Division. The project was assigned to the Safety Basis Technical Services (SB-TS) Group at Los Alamos National Laboratory (LANL). The main tasks were project needs analysis, design, development, implementation of instructional delivery, and evaluation of SBA courses. DOE/NNSA responsibility for oversight of the SBA project was assigned to the Chief of Defense for Nuclear Safety, and delegated to the Authorization Basis Senior Advisor, Continuous Learning Chair (CDNS-ABSA/CLC). Through a memorandum of agreement initiated by NNSA with LANS AD-NHHO, the DOE National Training Center (NTC) will maintain the set of Safety Basis Academy courses and is able to facilitate course delivery throughout the DOE Complex.

  1. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    Energy Science and Technology Software Center (OSTI)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible, schema-free data interchange on the Semantic Web. Today, data scientists use the framework as a scalable graph representation for integrating, querying, exploring, and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools for conducting "graph mining" on RDF standard data sets. We address that need through implementations of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like 'MATLAB' for 'Linked Data.'
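    The "graph mining as SPARQL" approach can be illustrated with the degree-distribution example. The query string below is an illustrative SPARQL aggregation in the EAGLE style (not EAGLE's actual query), and the pure-Python equivalent runs over a tiny hypothetical triple set so the idea is executable without an RDF store.

```python
from collections import Counter

# Illustrative SPARQL: out-degree of each subject, grouped and counted.
DEGREE_QUERY = """
SELECT ?s (COUNT(?o) AS ?deg)
WHERE { ?s ?p ?o }
GROUP BY ?s
"""

# Hypothetical triples standing in for an RDF data set.
triples = [
    ("alice", "knows", "bob"),
    ("alice", "knows", "carol"),
    ("bob", "knows", "carol"),
]

def degree_distribution(triples):
    # Map each out-degree to the number of subjects having that degree,
    # mirroring what the GROUP BY query computes server-side.
    out_deg = Counter(s for s, _, _ in triples)
    return Counter(out_deg.values())

dist = degree_distribution(triples)  # alice has degree 2, bob degree 1
```

    In EAGLE's setting the query would be posted to a SPARQL endpoint and only the aggregation result returned, which is what makes the approach scale to large hosted data sets.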

  2. Analysis of the Multi-Phase Copying Garbage Collection Algorithm

    SciTech Connect (OSTI)

    Podhorszki, Norbert

    2009-01-01

    The multi-phase copying garbage collection was designed to avoid the need for the large amount of reserved memory usually required by copying garbage collection algorithms. The collection is performed in multiple phases using the available free memory. This paper proves that the number of phases depends on the size of the reserved memory and on the ratio of garbage to accessible objects. The performance of the implemented algorithm is tested in a fine-grained parallel Prolog system. We find that reserving only 10% of memory for garbage collection is sufficient for good performance in practice. Additionally, an improvement of the generic algorithm, specific to the tested parallel Prolog system, is described.
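    The dependence of the phase count on reservation size and live ratio can be sketched with a back-of-the-envelope model: a collector that can only copy into the reserved region evacuates live data in roughly ceil(live / reserved) passes. This formula and the numbers below are an illustrative simplification, not the paper's exact analysis.

```python
import math

def copying_phases(live_bytes, reserved_bytes):
    # Rough model: each phase can evacuate at most `reserved_bytes` of
    # live (accessible) data, so the collection needs about
    # ceil(live / reserved) phases.
    return max(1, math.ceil(live_bytes / reserved_bytes))

heap = 100 * 1024 * 1024        # hypothetical 100 MiB heap
reserved = heap // 10           # the 10% reservation the paper found sufficient
live = int(heap * 0.25)         # assume 25% of objects are still accessible
phases = copying_phases(live, reserved)
```

    With a classic semispace collector the reservation would be the full 50% of the heap and a single phase would suffice; the multi-phase scheme trades extra passes for a far smaller reservation.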

  3. Nonlinear Global Optimization Using Curdling Algorithm in Mathematica Environment

    Energy Science and Technology Software Center (OSTI)

    1997-08-05

    An algorithm for performing optimization using a derivative-free, grid-refinement approach to nonlinear optimization was developed and implemented in software as OPTIMIZE. This approach overcomes a number of deficiencies in existing approaches. Most notably, it finds extremal regions rather than only single extremal points. The program is interactive and collects information on control parameters and constraints using menus. For up to two (and potentially three) dimensions, function convergence is displayed graphically. Because the algorithm does not compute derivatives, gradients, or vectors, it is numerically stable. It can find all the roots of a polynomial in one pass. It is an inherently parallel algorithm. OPTIMIZE-M is a modification of OPTIMIZE designed for use within the Mathematica environment created by Wolfram Research.
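    A derivative-free grid-refinement minimizer in this spirit can be sketched in one dimension: sample a grid, keep the best cells, and refine around them. The function name, parameters, and refinement policy below are our illustrative assumptions, not the OPTIMIZE code.

```python
import numpy as np

def grid_refine_minimize(f, lo, hi, levels=6, pts=9, keep=3):
    # Derivative-free: evaluate f on a uniform grid, retain the cells
    # around the `keep` best samples, and refine them at the next level.
    best_x, best_f = None, np.inf
    intervals = [(lo, hi)]
    for _ in range(levels):
        nxt = []
        for a, b in intervals:
            xs = np.linspace(a, b, pts)
            fs = np.array([f(x) for x in xs])
            order = np.argsort(fs)[:keep]
            if fs[order[0]] < best_f:
                best_f, best_x = fs[order[0]], xs[order[0]]
            h = (b - a) / (pts - 1)
            nxt += [(xs[i] - h, xs[i] + h) for i in order]
        intervals = nxt[:keep]   # cap the number of retained regions
    return best_x, best_f

x, fx = grid_refine_minimize(lambda x: (x - 1.0) ** 2 + 2.0, -5.0, 5.0)
```

    Because every grid cell is evaluated independently, each level parallelizes trivially, which matches the abstract's claim that the algorithm is inherently parallel; retaining several cells (not just the single best point) is what lets it report extremal regions.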

  4. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    SciTech Connect (OSTI)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible, schema-free data interchange on the Semantic Web. Today, data scientists use the framework as a scalable graph representation for integrating, querying, exploring, and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools for conducting "graph mining" on RDF standard data sets. We address that need through implementations of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like 'MATLAB' for 'Linked Data.'

  5. ARM: 10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  6. ARM: 2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  7. ARM: 10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1998-03-01

    10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  8. ARM: 1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  9. ARM: 1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    2004-10-01

    1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  10. ARM: SIRS: derived, correction of downwelling shortwave diffuse hemispheric measurements using Dutton and full algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Laura Riihimaki

    1997-03-21

    SIRS: derived, correction of downwelling shortwave diffuse hemispheric measurements using Dutton and full algorithm

  11. ARM: 10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Newsom, Rob; Goldsmith, John

    1998-03-01

    10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  12. ARM: 10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-second Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  13. ARM: 10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  14. ARM: 10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    10-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  15. ARM: 2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  16. ARM: 1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1-minute Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  17. ARM: 1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Chitra Sivaraman; Connor Flynn

    1-minute Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  18. Extending van Leer's Algorithm to Multiple Dimensions

    SciTech Connect (OSTI)

    Mosso, Stewart John; Voth, Thomas Eugene; Drake, Richard R.

    2013-08-01

    Abstract not provided. OSTI Identifier: 1115085. Report Number(s): SAND2013-7261C. DOE Contract Number: AC04-94AL85000. Resource Type: Conference. Conference: MultiMat 2012, held September 2-6, 2013 in San

  19. COLLOQUIUM: Introduction to Quantum Algorithms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Princeton Plasma Physics Lab, December 9, 2015, 4:15pm to 5:30pm, MBG Auditorium. Dr. Nadya Shirokova, University of Santa Clara. Quantum computers are not an abstraction anymore: Google, NASA, and USRA recently announced the formation of the Quantum Artificial Intelligence Lab, equipped with a 1,000-qubit quantum computer. In this talk we will focus on quantum algorithms such as Deutsch's, Shor's, and Grover's, and will discuss why they are faster than the classical ones. We will also

  20. Incremental Clustering Algorithm For Earth Science Data Mining

    SciTech Connect (OSTI)

    Vatsavai, Raju

    2009-01-01

    Remote sensing data play a key role in understanding complex geographic phenomena. Clustering is a useful tool for discovering interesting patterns and structures within multivariate geospatial data. One of the key issues in clustering is the specification of an appropriate number of clusters, which is not obvious in many practical situations. In this paper we provide an extension of the G-means algorithm which automatically learns the number of clusters present in the data and avoids overestimation of the number of clusters. Experimental evaluation on simulated and remotely sensed image data shows the effectiveness of our algorithm.
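    The grow-k-until-clusters-look-Gaussian idea behind G-means can be sketched in one dimension. G-means proper uses an Anderson-Darling normality test on projected data; the crude kurtosis check, the tiny k-means, and the synthetic two-band data below are all illustrative stand-ins, not the paper's extension.

```python
import numpy as np

def kmeans_1d(x, k, iters=20):
    # Tiny 1-D k-means with deterministic initialization at spread quantiles.
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

def looks_gaussian(x, tol=1.0):
    # Crude stand-in for G-means' Anderson-Darling test: excess kurtosis
    # near 0 is treated as "Gaussian enough".
    z = (x - x.mean()) / (x.std() + 1e-12)
    return abs(np.mean(z ** 4) - 3.0) < tol

def gmeans_like(x, max_k=8):
    # Grow k until every cluster passes the normality check.
    k = 1
    while k < max_k:
        labels, _ = kmeans_1d(x, k)
        if all(looks_gaussian(x[labels == j]) for j in range(k)):
            return k
        k += 1
    return k

rng = np.random.default_rng(2)
# Two well-separated hypothetical spectral bands.
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(10, 1, 300)])
k = gmeans_like(data)  # the bimodal mixture fails at k=1, passes at k=2
```

    Growing k only while a test fails is what prevents the overestimation the abstract mentions: once every cluster is unimodal-Gaussian, splitting stops.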

  1. Theoretical and computer models of detonation in solid explosives

    SciTech Connect (OSTI)

    Tarver, C.M.; Urtiew, P.A.

    1997-10-01

    Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich-von Neumann-Doering (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two-, and three-dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.

  2. Theoretical and Experimental Studies of Elementary Particle Physics

    SciTech Connect (OSTI)

    Evans, Harold G; Kostelecky, V Alan; Musser, James A

    2013-07-29

    The elementary particle physics research program at Indiana University spans a broad range of the most interesting topics in this fundamental field, including important contributions to each of the frontiers identified in the recent report of HEPAP's Particle Physics Project Prioritization Panel: the Energy Frontier, the Intensity Frontier, and the Cosmic Frontier. Experimentally, we contribute to knowledge at the Energy Frontier through our work on the D0 and ATLAS collaborations. We work at the Intensity Frontier on the MINOS and NOvA experiments and participate in R&D for LBNE. We are also very active on the theoretical side of each of these areas with internationally recognized efforts in phenomenology both in and beyond the Standard Model and in lattice QCD. Finally, although not part of this grant, members of the Indiana University particle physics group have strong involvement in several astrophysics projects at the Cosmic Frontier. Our research efforts are divided into three task areas. The Task A group works on D0 and ATLAS; Task B is our theory group; and Task C contains our MINOS, NOvA, and LBNE (LArTPC) research. Each task includes contributions from faculty, senior scientists, postdocs, graduate and undergraduate students, engineers, technicians, and administrative personnel. This work was supported by DOE Grant DE-FG02-91ER40661. In the following, we describe progress made in the research of each task during the final period of the grant, from November 1, 2009 to April 30, 2013.

  3. Windmill wake turbulence decay: a preliminary theoretical model

    SciTech Connect (OSTI)

    Bossanyi, E.A.

    1983-02-01

    The results are given of initial theoretical attempts to predict dynamic wake characteristics, particularly turbulence decay, downstream of wind turbine generators in order to assess the potential for acoustic noise generation in clusters or arrays of turbines. These results must be considered preliminary, because the model described is at least partially based on the assumption of isotropy in the turbine wakes; however, anisotropic conditions may actually exist, particularly in the near-wake regions. The results indicate that some excess spectral energy may still exist. The turbine-generated turbulence from one machine can reach the next machine in the cluster and, depending on the turbulent wavelengths critical for acoustic noise production and perhaps structural excitation, this may be a cause for concern. Such a situation is most likely to occur in the evening or morning, during the transition from the daytime to the nocturnal boundary layer and vice versa, particularly at more elevated sites where the winds tend to increase after dark.

  4. Theoretical rate coefficients for allyl + HO2 and allyloxy decomposition

    SciTech Connect (OSTI)

    Goldsmith, C. F.; Klippenstein, S. J.; Green, W. H.

    2011-01-01

    The kinetics of the allyl + HO{sub 2} bimolecular reaction, the thermal decomposition of C{sub 3}H{sub 5}OOH, and the unimolecular reactions of C{sub 3}H{sub 5}O are studied theoretically. High-level ab initio calculations of the C{sub 3}H{sub 5}OOH and C{sub 3}H{sub 5}O potential energy surfaces are coupled with RRKM master equation methods to compute the temperature- and pressure-dependence of the rate coefficients. Variable reaction coordinate transition state theory is used to characterize the barrierless transition states for the allyl + HO{sub 2} and C{sub 3}H{sub 5}O + OH reactions. The predicted rate coefficients for allyl + HO{sub 2} → C{sub 3}H{sub 5}OOH → products are in good agreement with experimental values. The calculations for allyl + HO{sub 2} → C{sub 3}H{sub 6} + O{sub 2} underpredict the observed rate. The new rate coefficients suggest that the reaction of allyl + HO{sub 2} will promote chain-branching significantly more than previous models suggest.
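    Temperature-dependent rate coefficients like these are conventionally reported in the modified Arrhenius form k(T) = A·T^n·exp(−Ea/RT). The sketch below evaluates that form; the parameter values are hypothetical placeholders, not the paper's fitted coefficients.

```python
import math

R = 8.314462618e-3  # gas constant, kJ mol^-1 K^-1

def modified_arrhenius(T, A, n, Ea):
    # k(T) = A * T**n * exp(-Ea / (R*T)); Ea in kJ/mol, T in K.
    return A * T ** n * math.exp(-Ea / (R * T))

# Hypothetical parameters for illustration only (A in cm^3 mol^-1 s^-1).
k_1000 = modified_arrhenius(1000.0, A=1.0e13, n=0.5, Ea=100.0)
k_1500 = modified_arrhenius(1500.0, A=1.0e13, n=0.5, Ea=100.0)
```

    For a positive activation energy the coefficient grows with temperature, which is the qualitative behavior a combustion mechanism expects from these fits.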

  5. Protonated Forms of Monoclinic Zirconia: A Theoretical Study

    SciTech Connect (OSTI)

    Mantz, Yves A.; Gemmen, Randall S.

    2010-05-06

    In various materials applications of zirconia, protonated forms of monoclinic zirconia may be formed, motivating their study within the framework of density-functional theory. Using the HCTH/120 exchange-correlation functional, the equations of state of yttria and of the three low-pressure zirconia polymorphs are computed, to verify our approach. Next, the favored charge state of a hydrogen atom in monoclinic zirconia is shown to be positive for all Fermi-level energies in the band gap, by the computation of defect formation energies. This result is consistent with a single previous theoretical prediction at midgap as well as muonium spectroscopy experiments. For the formally positively (+1e) charged system of a proton in monoclinic zirconia (with a homogeneous neutralizing background charge density implicitly included), modeled using up to a 3 x 3 x 3 arrangement of unit cells, different stable and metastable structures are identified. They are similar to those structures previously proposed for the neutral system of hydrogen-doped monoclinic zirconia, at a similar level of theory. As predicted using the HCTH/120 functional, the lowest-energy structure of the proton bonded to one of the two available oxygen atom types, O1, is favored by 0.39 eV compared to that of the proton bonded to O2. The rate of proton transfer between O1 ions is slower than that for hydrogen-doped monoclinic zirconia, whose transition-state structures may be lowered in energy by the extra electron.

  6. Theoretical studies of potential energy surfaces and computational methods

    SciTech Connect (OSTI)

    Shepard, R.

    1993-12-01

    This project involves the development, implementation, and application of theoretical methods for the calculation and characterization of potential energy surfaces involving molecular species that occur in hydrocarbon combustion. These potential energy surfaces require an accurate and balanced treatment of reactants, intermediates, and products. This difficult challenge is met with general multiconfiguration self-consistent-field (MCSCF) and multireference single- and double-excitation configuration interaction (MRSDCI) methods. In contrast to the more common single-reference electronic structure methods, this approach is capable of describing accurately molecular systems that are highly distorted away from their equilibrium geometries, including reactant, fragment, and transition-state geometries, and of describing regions of the potential surface that are associated with electronic wave functions of widely varying nature. The MCSCF reference wave functions are designed to be sufficiently flexible to describe qualitatively the changes in the electronic structure over the broad range of geometries of interest. The necessary mixing of ionic, covalent, and Rydberg contributions, along with the appropriate treatment of the different electron-spin components (e.g. closed shell, high-spin open-shell, low-spin open shell, radical, diradical, etc.) of the wave functions, are treated correctly at this level. Further treatment of electron correlation effects is included using large scale multireference CI wave functions, particularly including the single and double excitations relative to the MCSCF reference space. This leads to the most flexible and accurate large-scale MRSDCI wave functions that have been used to date in global PES studies.

  7. Structural basis for Notch1 engagement of Delta-like 4

    Office of Scientific and Technical Information (OSTI)

    Luca, Vincent C.; Jude, Kevin M.; Pierce, Nathan W.; Nachury, Maxence V.; Fischer, Suzanne; Garcia, K. ...

  8. Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex

    Office of Scientific and Technical Information (OSTI)

    Ultraviolet (UV) ...

  9. Structure of P-Glycoprotein Reveals a Molecular Basis for Poly-Specific Drug Binding

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Figure 1. Structure of P-gp. Many forms of cancer fail to respond to chemotherapy by ...

  10. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect (OSTI)

    Brouns, Thomas M.; Rohay, Alan C.; Reidel, Steve; Gardner, Martin G.

    2007-02-27

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis.

  11. NSS 18.3 Verification of Authorization Basis Documentation 12/8/03

    Broader source: Energy.gov [DOE]

    The objective of this surveillance is for the Facility Representative to verify that the facility's configuration and operations remain consistent with the authorization basis.  As defined in DOE...

  12. Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex

    Office of Scientific and Technical Information (OSTI)

    (Journal Article) | SciTech Connect Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Citation Details In-Document Search Title: Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Ultraviolet (UV) light-induced pyrimidine photodimers are repaired by the nucleotide excision repair pathway. Photolesions have biophysical parameters closely resembling undamaged DNA, impeding discovery through damage surveillance proteins. The DDB1-DDB2 complex serves in

  13. Los Alamos National Laboratory fission basis (Conference) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    National Laboratory fission basis Citation Details In-Document Search Title: Los Alamos National Laboratory fission basis Authors: Keksis, August L [1] ; Chadwick, Mark B [1] + Show Author Affiliations Los Alamos National Laboratory Publication Date: 2011-05-06 OSTI Identifier: 1063939 Report Number(s): LA-UR-11-02744; LA-UR-11-2744 DOE Contract Number: AC52-06NA25396 Resource Type: Conference Resource Relation: Conference: 14th International Symposium on Reactor Dosimetry ; May 22, 2011 ;

  14. Technical Basis Spent Nuclear Fuel (SNF) Project Radiation and Contamination Trending Program

    SciTech Connect (OSTI)

    KURTZ, J.E.

    2000-05-10

    This report documents the technical basis for the Spent Nuclear Fuel (SNF) Program radiation and contamination trending program. The program consists of standardized radiation and contamination surveys of the KE Basin, radiation surveys of the KW Basin, and radiation surveys of the Cold Vacuum Drying Facility (CVD), with the associated tracking. This report also discusses the remainder of radiological areas within the SNFP that do not have standardized trending programs and the basis for not having this program in those areas.

  15. Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 (Journal Article) | DOE PAGES

    Office of Scientific and Technical Information (OSTI)

    Title: Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 Authors: Zeng, Zhixiong ; Wang, Wei ; Yang, Yuting ; Chen, Yong ; Yang, Xiaomei ; Diehl, J. Alan ; Liu, Xuedong ; Lei, Ming Publication Date: 2010-02-01 OSTI Identifier: 1198117 Grant/Contract Number: AC02-06CH11357 Type: Published Article Journal Name: Developmental Cell Additional Journal Information: Journal Volume: 18; Journal Issue:

  16. Structural basis for the prion-like MAVS filaments in antiviral innate immunity (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Structural basis for the prion-like MAVS filaments in antiviral innate immunity Authors: Xu, Hui ; He, Xiaojing ; Zheng, Hui ; Huang, Lily J ; Hou, Fajian ; Yu, Zhiheng ; de la Cruz, Michael Jason ; Borkowski, Brian ; Zhang, Xuewu ; Chen, Zhijian J ; Jiang, Qiu-Xing [1] ; HHMI) [2] + Show Author Affiliations (UTSMC) (

  17. Structural Basis for Specificity and Flexibility in a Plant 4-Coumarate:CoA Ligase (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Structural Basis for Specificity and Flexibility in a Plant 4-Coumarate:CoA Ligase Authors: Li, Zhi ; Nair, Satish K. [1] + Show Author Affiliations UIUC Publication Date: 2015-12-04 OSTI Identifier: 1227510 Resource Type: Journal Article Resource Relation: Journal Name: Structure; Journal Volume: 23; Journal Issue: (11) ;

  18. Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Structural Basis of Selective Ubiquitination of TRF1 by SCFFbx4 Authors: Zeng, Zhixiong ; Wang, Wei ; Yang, Yuting ; Chen, Yong ; Yang, Xiaomei ; Diehl, J. Alan ; Liu, Xuedong ; Lei, Ming Publication Date: 2010-02-01 OSTI Identifier: 1198117 Grant/Contract Number: AC02-06CH11357 Type: Published Article Journal Name: Developmental Cell Additional Journal

  19. Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex

    Office of Scientific and Technical Information (OSTI)

    (Journal Article) | SciTech Connect Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Citation Details In-Document Search Title: Structural Basis of UV DNA-Damage Recognition by the DDB1-DDB2 Complex Ultraviolet (UV) light-induced pyrimidine photodimers are repaired by the nucleotide excision repair pathway. Photolesions have biophysical parameters closely resembling undamaged DNA, impeding discovery through damage surveillance proteins. The DDB1-DDB2 complex serves in

  20. Crystal Structures of mPGES-1 Inhibitor Complexes Form a Basis for the Rational Design of Potent Analgesic and Anti-Inflammatory Therapeutics (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Crystal Structures of mPGES-1 Inhibitor Complexes Form a Basis for the Rational Design of Potent Analgesic and Anti-Inflammatory Therapeutics Authors: Luz, John Gately ; Antonysamy, Stephen ; Kuklish,

  1. Crystal structure of a BRAF kinase domain monomer explains basis for allosteric regulation (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Crystal structure of a BRAF kinase domain monomer explains basis for allosteric regulation Authors: Thevakumaran, Neroshan ; Lavoie, Hugo ; Critton, David A. ; Tebben, Andrew ; Marinier, Anne ; Sicheri, Frank ; Therrien, Marc [1] ; Montreal) [2] ; BMS) [2] + Show Author Affiliations

  2. The Three-Dimensional Structural Basis of Type II Hyperprolinemia (Journal Article) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: The Three-Dimensional Structural Basis of Type II Hyperprolinemia Type II hyperprolinemia is an autosomal recessive disorder caused by a deficiency in {Delta}{sup 1}-pyrroline-5-carboxylate dehydrogenase (P5CDH; also known as ALDH4A1), the aldehyde dehydrogenase that catalyzes the oxidation of glutamate semialdehyde to glutamate. Here, we report the first

  3. Non-homogeneous solutions of a Coulomb Schrödinger equation as basis set for scattering problems

    SciTech Connect (OSTI)

    Del Punta, J. A.; Ambrosio, M. J.; Gasaneo, G.; Zaytsev, S. A.; Ancarani, L. U.

    2014-05-15

    We introduce and study two-body Quasi Sturmian functions, which are proposed as basis functions for applications in three-body scattering problems. They are solutions of a two-body non-homogeneous Schrödinger equation. We present different analytic expressions, including asymptotic behaviors, for the pure Coulomb potential with a driven term involving either Slater-type or Laguerre-type orbitals. The efficiency of Quasi Sturmian functions as a basis set is numerically illustrated through a two-body scattering problem.
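    Schematically, such a driven (non-homogeneous) radial Coulomb equation has the form below, in atomic units, with a Slater-type driving term; the notation and the parameter β are illustrative assumptions, not taken from the paper:

```latex
\left[-\frac{1}{2}\frac{d^{2}}{dr^{2}}
      +\frac{\ell(\ell+1)}{2r^{2}}
      -\frac{Z}{r}-E\right] Q_{\ell}(r) = g_{\ell}(r),
\qquad g_{\ell}(r) \propto r^{\ell+1}\,e^{-\beta r}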

  4. Effect of cosolvent on protein stability: A theoretical investigation

    SciTech Connect (OSTI)

    Chalikian, Tigran V.

    2014-12-14

    We developed a statistical thermodynamic algorithm for analyzing solvent-induced folding/unfolding transitions of proteins. The energetics of protein transitions is governed by the interplay between the cavity formation contribution and the term reflecting direct solute-cosolvent interactions. The latter is viewed as an exchange reaction in which the binding of a cosolvent to a solute is accompanied by release of waters of hydration to the bulk. Our model clearly differentiates between the stoichiometric and non-stoichiometric interactions of solvent or co-solvent molecules with a solute. We analyzed the urea- and glycine betaine (GB)-induced conformational transitions of model proteins of varying size which are geometrically approximated by a sphere in their native state and a spherocylinder in their unfolded state. The free energy of cavity formation and its changes accompanying protein transitions were computed based on the concepts of scaled particle theory. The free energy of direct solute-cosolvent interactions was analyzed using empirical parameters previously determined for urea and GB interactions with low molecular weight model compounds. Our computations correctly capture the mode of action of urea and GB and yield realistic numbers for (∂ΔG/∂a{sub 3}){sub T,P}, which are related to the m-values of protein denaturation. Urea is characterized by negative values of (∂ΔG/∂a{sub 3}){sub T,P} within the entire range of urea concentrations analyzed. At concentrations below ~1 M, GB exhibits positive values of (∂ΔG/∂a{sub 3}){sub T,P}, which turn negative at higher GB concentrations. The balance between the thermodynamic contributions of cavity formation and direct solute-cosolvent interactions that, ultimately, defines the mode of cosolvent action is extremely subtle. A 20% increase or decrease in the equilibrium constant for solute-cosolvent binding may change the sign of (∂ΔG/∂a{sub 3}){sub T,P}, thereby altering the mode of cosolvent action (stabilizing to destabilizing or vice versa).

  5. A subzone reconstruction algorithm for efficient staggered compatible remapping

    SciTech Connect (OSTI)

    Starinshak, D.P.; Owen, J.M.

    2015-09-01

    Staggered-grid Lagrangian hydrodynamics algorithms frequently make use of subzonal discretization of state variables for the purposes of improved numerical accuracy, generality to unstructured meshes, and exact conservation of mass, momentum, and energy. For Arbitrary Lagrangian–Eulerian (ALE) methods using a geometric overlay, it is difficult to remap subzonal variables in an accurate and efficient manner due to the number of subzone–subzone intersections that must be computed. This becomes prohibitive in the case of 3D, unstructured, polyhedral meshes. A new procedure is outlined in this paper to avoid direct subzonal remapping. The new algorithm reconstructs the spatial profile of a subzonal variable using remapped zonal and nodal representations of the data. The reconstruction procedure is cast as an under-constrained optimization problem. Enforcing conservation at each zone and node on the remapped mesh provides the set of equality constraints; the objective function corresponds to a quadratic variation per subzone between the values to be reconstructed and a set of target reference values. Numerical results for various pure-remapping and hydrodynamics tests are provided. Ideas for extending the algorithm to staggered-grid radiation-hydrodynamics are discussed as well as ideas for generalizing the algorithm to include inequality constraints.
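    The reconstruction idea described above, minimizing a quadratic deviation from reference values subject to linear conservation constraints, can be sketched as a small equality-constrained least-squares problem solved through its KKT system. The toy "mesh," targets, and constraints below are illustrative stand-ins, not the paper's discretization:

```python
import numpy as np

def constrained_reconstruction(targets, A, b):
    """Minimize ||x - targets||^2 subject to A @ x = b via the KKT system
    [[2I, A^T], [A, 0]] [x, lam] = [2*targets, b]."""
    n, m = targets.size, b.size
    K = np.block([[2.0 * np.eye(n), A.T],
                  [A, np.zeros((m, m))]])
    rhs = np.concatenate([2.0 * targets, b])
    return np.linalg.solve(K, rhs)[:n]

# Four hypothetical subzone masses with reference (target) values, and two
# conservation constraints standing in for a "zone" total and a "node" total.
targets = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([[1.0, 1.0, 0.0, 0.0],   # x0 + x1 must equal 3.5
              [0.0, 0.0, 1.0, 1.0]])  # x2 + x3 must equal 6.5
b = np.array([3.5, 6.5])
x = constrained_reconstruction(targets, A, b)
```

Each constraint's deficit is spread evenly across its subzones, which is exactly the closest-to-target solution in the least-squares sense.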

  6. Finding the needle in the haystack: Algorithms for conformational optimization

    SciTech Connect (OSTI)

    Andricioaei, I.; Straub, J.E.

    1996-09-01

    Algorithms are given for conformational optimization of proteins. The protein folding problem is regarded as a problem of global energy minimization. Since proteins have hundreds of atoms, finding the lowest-energy conformation in a many-dimensional configuration space becomes a computationally demanding problem. © American Institute of Physics.
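    As a rough illustration of global energy minimization on a rugged landscape, here is a generic simulated-annealing sketch (a common choice for such problems, not necessarily one of the algorithms given in this report); the 1D "energy surface" is a toy stand-in for a protein potential:

```python
import math
import random

def simulated_annealing(energy, x0, step=0.1, t0=1.0, cooling=0.9995,
                        iters=30000, seed=0):
    """Generic minimizer: accept uphill moves with probability exp(-dE/T)
    and cool the temperature T geometrically."""
    rng = random.Random(seed)
    x, e = list(x0), energy(x0)
    best_x, best_e = list(x), e
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        de = energy(cand) - e
        if de < 0 or rng.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < best_e:
                best_x, best_e = list(x), e
        t *= cooling
    return best_x, best_e

def rugged(v):
    # Toy 1D surface with many local minima; global minimum 0 at x = 0.
    return v[0] ** 2 + 0.5 * math.sin(5.0 * v[0]) ** 2

sol, e_min = simulated_annealing(rugged, [3.0])
```

The high-temperature phase lets the walker hop between local basins before the cooling schedule freezes it into a deep one.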

  7. THE LEVENBERG-MARQUARDT ALGORITHM: IMPLEMENTATION AND THEORY

    Office of Scientific and Technical Information (OSTI)

    THE LEVENBERG-MARQUARDT ALGORITHM: IMPLEMENTATION AND THEORY Jorge J ... The University of Iowa ... Kansas State University ...

  8. Numerical Optimization Algorithms and Software for Systems Biology

    SciTech Connect (OSTI)

    Saunders, Michael

    2013-02-02

    The basic aims of this work are: to develop reliable algorithms for solving optimization problems involving large stoichiometric matrices; to investigate cyclic dependency between metabolic and macromolecular biosynthetic networks; and to quantify the significance of thermodynamic constraints on prokaryotic metabolism.

  9. A Parallel Ghosting Algorithm for The Flexible Distributed Mesh Database

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Mubarak, Misbah; Seol, Seegyoung; Lu, Qiukai; Shephard, Mark S.

    2013-01-01

    Critical to the scalability of parallel adaptive simulations are parallel control functions including load balancing, reduced inter-process communication and optimal data decomposition. In distributed meshes, many mesh-based applications frequently access neighborhood information for computational purposes which must be transmitted efficiently to avoid parallel performance degradation when the neighbors are on different processors. This article presents a parallel algorithm of creating and deleting data copies, referred to as ghost copies, which localize neighborhood data for computation purposes while minimizing inter-process communication. The key characteristics of the algorithm are: (1) It can create ghost copies of any permissible topological order in a 1D, 2D or 3D mesh based on selected adjacencies. (2) It exploits neighborhood communication patterns during the ghost creation process, thus eliminating all-to-all communication. (3) For applications that need neighbors of neighbors, the algorithm can create n ghost layers up to the point where the whole partitioned mesh is ghosted. Strong and weak scaling results are presented for the IBM BG/P and Cray XE6 architectures up to a core count of 32,768 processors. The algorithm also leads to scalable results when used in a parallel super-convergent patch recovery error estimator, an application that frequently accesses neighborhood data to carry out computation.
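    The first characteristic, ghosting off-process neighbors based on selected adjacencies, can be sketched on a toy partitioned mesh; the two-"process" layout, the dict-based representation, and the names below are illustrative assumptions, not the paper's distributed implementation:

```python
# Toy model of one ghost layer: each "process" owns some mesh vertices, and
# every off-process vertex adjacent to an owned vertex gets a ghost copy.
adjacency = {  # undirected mesh connectivity (a path of 5 vertices)
    0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3],
}
owner = {0: "p0", 1: "p0", 2: "p0", 3: "p1", 4: "p1"}

def build_ghost_layer(adjacency, owner):
    """Return, for each process, the set of remote vertices it must ghost:
    all off-process vertices adjacent to one of its owned vertices."""
    ghosts = {}
    for v, nbrs in adjacency.items():
        for n in nbrs:
            if owner[n] != owner[v]:
                # Process owner[v] needs a local ghost copy of n.
                ghosts.setdefault(owner[v], set()).add(n)
    return ghosts

ghosts = build_ghost_layer(adjacency, owner)
# p0 owns {0,1,2}; vertex 2 touches 3 (owned by p1), so p0 ghosts vertex 3.
# p1 owns {3,4}; vertex 3 touches 2 (owned by p0), so p1 ghosts vertex 2.
```

Note that only the two parts sharing a partition boundary exchange anything, which mirrors the neighborhood (rather than all-to-all) communication pattern the abstract describes.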

  10. Developing and Implementing the Data Mining Algorithms in RAVEN

    SciTech Connect (OSTI)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea; Rabiti, Cristian

    2015-09-01

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. Post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling a set of parameter values. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e., recognizing patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.

  11. Theoretical and experimental study on regenerative rotary displacer Stirling engine

    SciTech Connect (OSTI)

    Raggi, L.; Katsuta, Masafumi; Isshiki, Naotsugu; Isshiki, Seita

    1997-12-31

    Recently a quite new type of hot air engine, called the rotary displacer engine, in which the displacer is a rotating disk enclosed in a cylinder, has been conceived and developed. The working gas, contained in a notch excavated in the disk, is heated and cooled alternately by the heat transferred through the enclosing cylinder, which is heated at one side and cooled at the opposite one. The gas temperature oscillations cause pressure fluctuations that deliver mechanical power through a power piston. To increase the performance of this kind of engine, the authors propose three different regeneration methods. The first comprises two coaxial disks that, revolving in opposite directions, cause a temperature gradient on the cylinder wall and regenerative axial heat conduction through fins shaped on the cylinder inner wall. The other two methods are based on heat transferred by a closed circuit that in one case has a circulating liquid inside and in the other is formed by several heat pipes, each working at a different temperature. An engine based on the first principle, the Regenerative Tandem Contra-Rotary Displacer Stirling Engine, has been built and tested. In this paper experimental results with and without regeneration are reported, together with a detailed description of the unit. A basic explanation of the working principle of this engine is given, along with a theoretical analysis of the main parameters influencing the regenerative effect. These new rotary displacer Stirling engines, owing to their simplicity, are expected to attain high rotational speeds, especially in applications as demonstration and hobby units.

  12. THEORETICAL EVOLUTION OF OPTICAL STRONG LINES ACROSS COSMIC TIME

    SciTech Connect (OSTI)

    Kewley, Lisa J.; Dopita, Michael A.; Sutherland, Ralph; Leitherer, Claus; Dave, Romeel; Allen, Mark; Groves, Brent

    2013-09-10

    We use the chemical evolution predictions of cosmological hydrodynamic simulations with our latest theoretical stellar population synthesis, photoionization, and shock models to predict the strong line evolution of ensembles of galaxies from z = 3 to the present day. In this paper, we focus on the brightest optical emission-line ratios, [N II]/H{alpha} and [O III]/H{beta}. We use the optical diagnostic Baldwin-Phillips-Terlevich (BPT) diagram as a tool for investigating the spectral properties of ensembles of active galaxies. We use four redshift windows chosen to exploit new near-infrared multi-object spectrographs. We predict how the BPT diagram will appear in these four redshift windows given different sets of assumptions. We show that the position of star-forming galaxies on the BPT diagram traces the interstellar medium conditions and radiation field in galaxies at a given redshift. Galaxies containing an active galactic nucleus (AGN) form a mixing sequence with purely star-forming galaxies. This mixing sequence may change dramatically with cosmic time, due to the metallicity sensitivity of the optical emission lines. Furthermore, the position of the mixing sequence may probe metallicity gradients in galaxies as a function of redshift, depending on the size of the AGN narrow-line region. We apply our latest slow shock models for gas shocked by galactic-scale winds. We show that at high redshift, galactic wind shocks are clearly separated from AGN in line ratio space. Instead, shocks from galactic winds mimic high metallicity starburst galaxies. We discuss our models in the context of future large near-infrared spectroscopic surveys.
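    A common way to separate star-forming galaxies from AGN on the [N II]/Hα BPT diagram at z ~ 0 is the Kewley et al. (2001) "maximum starburst" line; the sketch below uses that published demarcation as a crude two-way classifier and is not the modeling machinery of this paper:

```python
def kewley01_max_starburst(log_nii_ha):
    """Kewley et al. (2001) maximum-starburst line on the [N II]/Halpha
    BPT diagram; valid only for log([N II]/Halpha) < 0.47."""
    return 0.61 / (log_nii_ha - 0.47) + 1.19

def classify_bpt(log_nii_ha, log_oiii_hb):
    """Crude split: above the line -> AGN-like, below -> star-forming.
    (Real analyses also use a composite region; omitted here.)"""
    if log_nii_ha >= 0.47:
        return "AGN-like"
    line = kewley01_max_starburst(log_nii_ha)
    return "AGN-like" if log_oiii_hb > line else "star-forming"

sf = classify_bpt(-0.5, 0.0)   # typical local star-forming line ratios
agn = classify_bpt(0.0, 1.0)   # Seyfert-like line ratios
```

The paper's point is that this demarcation is calibrated to local conditions, and the mixing sequence it tries to split can shift with redshift.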

  13. Genetic algorithms and their use in Geophysical Problems

    SciTech Connect (OSTI)

    Parker, Paul B.

    1999-04-01

    Genetic algorithms (GAs), global optimization methods that mimic Darwinian evolution, are well suited to the nonlinear inverse problems of geophysics. A standard genetic algorithm selects the best or ''fittest'' models from a ''population'' and then applies operators such as crossover and mutation in order to combine the most successful characteristics of each model and produce fitter models. More sophisticated operators have been developed, but the standard GA usually provides a robust and efficient search. Although the choice of parameter settings such as crossover and mutation rate may depend largely on the type of problem being solved, numerous results show that certain parameter settings produce optimal performance for a wide range of problems and difficulties. In particular, a low (about half of the inverse of the population size) mutation rate is crucial for optimal results, but the choice of crossover method and rate do not seem to affect performance appreciably. Optimal efficiency is usually achieved with smaller (< 50) populations. Lastly, tournament selection appears to be the best choice of selection methods due to its simplicity and its autoscaling properties. However, if a proportional selection method is used such as roulette wheel selection, fitness scaling is a necessity, and a high scaling factor (> 2.0) should be used for the best performance. Three case studies are presented in which genetic algorithms are used to invert for crustal parameters. The first is an inversion for basement depth at Yucca Mountain using gravity data, the second an inversion for velocity structure in the crust of the South Island of New Zealand using receiver functions derived from teleseismic events, and the third is a similar receiver function inversion for crustal velocities beneath the Mendocino Triple Junction region of Northern California.
The inversions demonstrate that genetic algorithms are effective in solving problems with reasonably large numbers of free parameters and with computationally expensive objective function calculations. More sophisticated techniques are presented for special problems. Niching and island model algorithms are introduced as methods to find multiple, distinct solutions to the nonunique problems that are typically seen in geophysics. Finally, hybrid algorithms are investigated as a way to improve the efficiency of the standard genetic algorithm.
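    The parameter guidance above (tournament selection, a per-bit mutation rate of about half the inverse of the population size) can be sketched in a minimal GA; the one-max fitness function is a toy stand-in for a geophysical objective:

```python
import random

def genetic_algorithm(fitness, n_bits=32, pop_size=50, generations=200, seed=1):
    """Minimal GA: tournament selection, one-point crossover, and a per-bit
    mutation rate of 1/(2*pop_size), as recommended in the abstract."""
    rng = random.Random(seed)
    p_mut = 1.0 / (2 * pop_size)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = [1 - g if rng.random() < p_mut else g
                     for g in p1[:cut] + p2[cut:]]   # mutate each bit
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm(fitness=sum)  # one-max: maximize the number of 1 bits
```

Tournament selection needs no fitness scaling because it only compares individuals, which is the "autoscaling" property mentioned above.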

  14. Compact Graph Representations and Parallel Connectivity Algorithms for Massive Dynamic Network Analysis

    SciTech Connect (OSTI)

    Madduri, Kamesh; Bader, David A.

    2009-02-15

    Graph-theoretic abstractions are extensively used to analyze massive data sets. Temporal data streams from socioeconomic interactions, social networking web sites, communication traffic, and scientific computing can be intuitively modeled as graphs. We present the first study of novel high-performance combinatorial techniques for analyzing large-scale information networks, encapsulating dynamic interaction data in the order of billions of entities. We present new data structures to represent dynamic interaction networks, and discuss algorithms for processing parallel insertions and deletions of edges in small-world networks. With these new approaches, we achieve an average performance rate of 25 million structural updates per second and a parallel speedup of nearly 28 on a 64-way Sun UltraSPARC T2 multicore processor, for insertions and deletions to a small-world network of 33.5 million vertices and 268 million edges. We also design parallel implementations of fundamental dynamic graph kernels related to connectivity and centrality queries. Our implementations are freely distributed as part of the open-source SNAP (Small-world Network Analysis and Partitioning) complex network analysis framework.
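    A minimal sketch of what a dynamic graph kernel must support, edge insertions, edge deletions, and a connectivity query, is below; it is serial and far simpler than the SNAP data structures described above:

```python
from collections import defaultdict, deque

class DynamicGraph:
    """Tiny dynamic undirected graph: adjacency sets give O(1) average
    edge insert/delete; connectivity is answered by BFS on demand."""
    def __init__(self):
        self.adj = defaultdict(set)

    def insert_edge(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)

    def delete_edge(self, u, v):
        self.adj[u].discard(v)
        self.adj[v].discard(u)

    def connected(self, s, t):
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                return True
            for w in self.adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        return False

g = DynamicGraph()
for u, v in [(0, 1), (1, 2), (2, 3)]:
    g.insert_edge(u, v)
# 0 and 3 are connected until the middle edge is removed:
g.delete_edge(1, 2)
```

High-performance versions maintain incremental connectivity structures instead of re-running BFS, which is where the real research effort lies.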

  15. An algorithm to estimate the object support in truncated images

    SciTech Connect (OSTI)

    Hsieh, Scott S.; Nett, Brian E.; Cao, Guangzhi; Pelc, Norbert J.

    2014-07-15

    Purpose: Truncation artifacts in CT occur if the object to be imaged extends past the scanner field of view (SFOV). These artifacts impede diagnosis and could possibly introduce errors in dose plans for radiation therapy. Several approaches exist for correcting truncation artifacts, but existing correction algorithms do not accurately recover the skin line (or support) of the patient, which is important in some dose planning methods. The purpose of this paper was to develop an iterative algorithm that recovers the support of the object. Methods: The authors assume that the truncated portion of the image is made up of soft tissue of uniform CT number and attempt to find a shape consistent with the measured data. Each known measurement in the sinogram is interpreted as an estimate of missing mass along a line. An initial estimate of the object support is generated by thresholding a reconstruction made using a previous truncation artifact correction algorithm (e.g., water cylinder extrapolation). This object support is iteratively deformed to reduce the inconsistency with the measured data. The missing data are estimated using this object support to complete the dataset. The method was tested on simulated and experimentally truncated CT data. Results: The proposed algorithm produces a better defined skin line than water cylinder extrapolation. On the experimental data, the RMS error of the skin line is reduced by about 60%. For moderately truncated images, some soft tissue contrast is retained near the SFOV. As the extent of truncation increases, the soft tissue contrast outside the SFOV becomes unusable although the skin line remains clearly defined, and in reformatted images it varies smoothly from slice to slice as expected. Conclusions: The support recovery algorithm provides a more accurate estimate of the patient outline than thresholded, basic water cylinder extrapolation, and may be preferred in some radiation therapy applications.
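    The core idea, interpreting each measurement as an estimate of missing mass along a line through uniform soft tissue, reduces in a one-ray toy to converting unexplained attenuation into a chord length outside the scan field of view (SFOV); the attenuation value and numbers below are illustrative assumptions only:

```python
MU_SOFT = 0.2  # assumed soft-tissue attenuation, 1/cm (illustrative value)

def chord_outside_sfov(measured, inside_sfov, mu=MU_SOFT):
    """Estimate how many cm of uniform soft tissue a ray crossed outside
    the SFOV: the attenuation not explained by the image inside the SFOV,
    divided by the tissue attenuation coefficient."""
    missing = max(measured - inside_sfov, 0.0)
    return missing / mu

# A ray measured 6.0 attenuation units, of which 5.0 are explained by the
# reconstruction inside the SFOV -> about 5 cm of tissue lies outside it.
length = chord_outside_sfov(6.0, 5.0)
```

The paper's algorithm uses such per-ray mass estimates as consistency conditions while iteratively deforming a whole 2D support boundary, rather than solving each ray independently.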

  16. Comparison of Theoretical Efficiencies of Multi-junction Concentrator Solar Cells

    SciTech Connect (OSTI)

    Kurtz, S.; Myers, D.; McMahon, W. E.; Geisz, J.; Steiner, M.

    2008-01-01

    Champion concentrator cell efficiencies have surpassed 40% and now many are asking whether the efficiencies will surpass 50%. Theoretical efficiencies of >60% are described for many approaches, but there is often confusion about the theoretical efficiency for a specific structure. The detailed balance approach to calculating theoretical efficiency gives an upper bound that can be independent of material parameters and device design. Other models predict efficiencies that are closer to those that have been achieved. Changing reference spectra and the choice of concentration further complicate comparison of theoretical efficiencies. This paper provides a side-by-side comparison of theoretical efficiencies of multi-junction solar cells calculated with the detailed balance approach and a common one-dimensional-transport model for different spectral and irradiance conditions. Also, historical experimental champion efficiencies are compared with the theoretical efficiencies.
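    The detailed balance bound mentioned above can be illustrated with its simplest variant, the Shockley-Queisser "ultimate efficiency" for a single junction under a blackbody sun: every photon above the gap yields one carrier extracted at exactly the gap energy. This is a sketch of the bounding idea, not the paper's multi-junction comparison:

```python
import numpy as np

# Dimensionless photon energy x = E / (k * T_sun); integrate a blackbody.
x = np.linspace(1e-6, 50.0, 100001)
dx = x[1] - x[0]
photon_flux = x**2 / np.expm1(x)                    # photon flux density
total_power = (x**3 / np.expm1(x)).sum() * dx       # -> pi**4 / 15

def ultimate_efficiency(xg):
    """Fraction of incident blackbody power delivered if each photon with
    x >= xg contributes exactly the gap energy xg."""
    absorbed_photons = photon_flux[x >= xg].sum() * dx
    return xg * absorbed_photons / total_power

xgs = np.linspace(0.5, 5.0, 451)
effs = np.array([ultimate_efficiency(g) for g in xgs])
best = xgs[effs.argmax()]   # peaks near xg ~ 2.2, i.e. Eg ~ 1.1 eV at 6000 K
```

The single-junction bound of roughly 44% is what multi-junction stacks push past by extracting high-energy photons at higher voltages.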

  17. ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcastin...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting Operations (doebestpractice052804.pdf) More Documents & Publications ITP Metal Casting: Energy Use ...

  18. Theoretical investigations of defects in a Si-based digital ferromagne...

    Office of Scientific and Technical Information (OSTI)

    digital ferromagnetic heterostructure - a spintronic material Citation Details In-Document Search Title: Theoretical investigations of defects in a Si-based digital ...

  19. Numerical Study of Velocity Shear Stabilization of 3D and Theoretical...

    Office of Scientific and Technical Information (OSTI)

    We studied the feasibility of resonantly driving GAMs in tokamaks. A numerical simulation ... Experimental implications of this were quantified. Theoretical support was provided for ...

  20. The power of simplification: Operator interface with the AP1000{sup R} during design-basis and beyond design-basis events

    SciTech Connect (OSTI)

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-07-01

    The AP1000{sup R} plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. 
If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been designed to be reliable in these conditions. The primary goal of any such actions is to maintain or refill the passive inventory available to cool the core, containment and spent fuel pool in the safety-related and seismically qualified Passive Containment Cooling Water Storage Tank (PCCWST). The seismically-qualified, ground-mounted Passive Containment Cooling Ancillary Water Storage Tank (PCCAWST) is also available for this function as appropriate. The primary effect of these actions would be to increase the coping time for the AP1000 during design basis events, as well as events such as those described above, from 72 hours without operator intervention to 7 days with minimal operator actions. These operator actions necessary to protect the health and safety of the public are addressed in the Post-72 Hour procedures, as well as some EOPs, AOPs, ARPs and the Severe Accident Management Guidelines (SAMGs). Should the event continue to become more severe and plant conditions degrade further with indications of inadequate core cooling, the SAMGs provide guidance for strategies to address these hypothetical severe accident conditions. The AP1000 SAMG diagnoses and actions are prioritized to first utilize the AP1000 features that are expected to retain a damaged core inside the reactor vessel. Only one strategy is undertaken at any time. This strategy will be followed and its effectiveness evaluated before other strategies are undertaken.
This is a key feature of both the symptom-oriented AP1000 EOPs and the AP1000 SAMGs, which maximizes the probability of retaining a damaged core inside the reactor vessel and containment while minimizing the chances for confusion and human errors during implementation. The AP1000 SAMGs are simple and straightforward and have been developed with considerable input from human factors and plant operations experts. Most importantly, and different from severe accident management strategies for other plants, the AP1000 SAMGs do not require diagnosis of the location of the core (i.e., whether reactor vessel failure has occurred). This is a fun

  1. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to the development of tools to enhance performance and reliability, and reduce emissions, of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, the metrics each algorithm will output, and the inputs each algorithm requires.

  2. Time-dependent density functional theory quantum transport simulation in non-orthogonal basis

    SciTech Connect (OSTI)

    Kwok, Yan Ho; Xie, Hang; Yam, Chi Yung; Chen, Guan Hua; Zheng, Xiao

    2013-12-14

    Building on earlier work on the hierarchical equations of motion for quantum transport, we present in this paper a first-principles scheme for time-dependent quantum transport by combining time-dependent density functional theory (TDDFT) with Keldysh's non-equilibrium Green's function formalism. This scheme goes beyond the wide-band-limit approximation and is directly applicable to the case of a non-orthogonal basis without the need for a basis transformation. The overlap between the basis functions in the lead and the device region is treated properly by including it in the self-energy, and it can be shown that this approach is equivalent to a lead-device orthogonalization. The scheme has been implemented at both the TDDFT and density functional tight-binding levels. Simulation results are presented to demonstrate the method, and a comparison with the wide-band-limit approximation is made. Finally, the sparsity of the matrices and the computational complexity of the method are analyzed.

  3. Hamiltonian Light-Front Field Theory in a Basis Function Approach

    SciTech Connect (OSTI)

    Vary, J.P.; Honkanen, H.; Li, Jun; Maris, P.; Brodsky, S.J.; Harindranath, A.; de Teramond, G.F.; Sternberg, P.; Ng, E.G.; Yang, C.

    2009-05-15

    Hamiltonian light-front quantum field theory constitutes a framework for the non-perturbative solution of invariant masses and correlated parton amplitudes of self-bound systems. By choosing the light-front gauge and adopting a basis function representation, we obtain a large, sparse, Hamiltonian matrix for mass eigenstates of gauge theories that is solvable by adapting the ab initio no-core methods of nuclear many-body theory. Full covariance is recovered in the continuum limit, the infinite matrix limit. There is considerable freedom in the choice of the orthonormal and complete set of basis functions with convenience and convergence rates providing key considerations. Here, we use a two-dimensional harmonic oscillator basis for transverse modes that corresponds with eigensolutions of the soft-wall AdS/QCD model obtained from light-front holography. We outline our approach, present illustrative features of some non-interacting systems in a cavity and discuss the computational challenges.

  4. JLab Will Begin Testing its Public Address System on a Monthly Basis |

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    On Wednesday, April 20, Jefferson Lab will begin conducting a monthly test of its Public Address (PA) System - the live audible announcement feature available through the lab's Cisco phones. Starting on April 20, these tests will occur at 5:30 p.m. on the third Wednesday of each month. The Public Address System may be used

  5. Basis for Section 3116 Determination for Salt Waste Disposal at the Savannah River Site

    Office of Environmental Management (EM)


  6. Structural Basis of Wnt Signaling Inhibition by Dickkopf Binding to LRP5/6

    Office of Scientific and Technical Information (OSTI)

    Journal Article. Authors: Ahn, Victoria E.; Chu, Matthew Ling-Hon; Choi, Hee-Jung; Tran, Denise; Abo, Arie; Weis, William I. Publication Date: 2011-11-01. OSTI Identifier: 1198118. Journal: Developmental Cell, Volume 21.

  7. Structural basis of GSK-3 inhibition by N-terminal phosphorylation and by the Wnt receptor LRP6

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Structural basis of GSK-3 inhibition by N-terminal phosphorylation and by the Wnt receptor LRP6. Authors: Stamos, Jennifer L.; Chu, Matthew Ling-Hon; Enos, Michael D.; Shah, Niket; Weis, William I. (Stanford). Publication Date: 2015-02-19. OSTI Identifier: 1168492.

  8. Basis for Identification of Disposal Options for R and D for Spent Nuclear Fuel and High-Level Waste

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Basis for Identification of Disposal Options for R and D for Spent Nuclear Fuel and High-Level Waste (Department of Energy). The Used Fuel Disposition campaign (UFD) is selecting a set of geologic media for further study that spans a suite of behavior characteristics that impose a broad range of potential conditions on the design of the repository, the engineered

  9. Structural Basis for Microcin C7 Inactivation by the MccE Acetyltransferase

    Office of Scientific and Technical Information (OSTI)

    Journal Article. The antibiotic microcin C7 (McC) acts as a bactericide by inhibiting aspartyl-tRNA synthetase and stalling the protein translation machinery. McC is synthesized as a heptapeptide-nucleotide conjugate, which is

  10. Structural Basis of Selective Ubiquitination of TRF1 by SCF(Fbx4)

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Structural Basis of Selective Ubiquitination of TRF1 by SCF(Fbx4). TRF1 is a critical regulator of telomere length. As such, TRF1 levels are regulated by ubiquitin-dependent proteolysis via an SCF E3 ligase where Fbx4 contributes to substrate specification. Here, we report

  11. The Structural Basis for Tight Control of PP2A Methylation and Function by LCMT-1

    Office of Scientific and Technical Information (OSTI)

    Journal Article. Proper formation of protein phosphatase 2A (PP2A) holoenzymes is essential for the fitness of all eukaryotic cells. Carboxyl methylation of the PP2A catalytic subunit plays a critical role in regulating holoenzyme assembly; methylation is

  12. Universal basis of two-center functions. Test computations of certain diatomic molecules and ions

    SciTech Connect (OSTI)

    Kirnos, V.F.; Samsonov, B.F.; Cheglokov, E.I.

    1987-05-01

    It is shown that the basis of two-center functions is universal. In the integrals used in analyzing diatomic molecules, the dependence on the charges of the nuclei of the atoms comprising the molecule and on the internuclear spacing is separated explicitly. Once constructed, the basis integrals permitted rapid and effective computation of the ground-state potential curves for a number of electronic systems: H2, He2(2+), HeH+, He2, LiH, Li2, HeB+, Be2.

  13. Technical Basis Spent Nuclear Fuel (SNF) Project Radiation and Contamination Trending Program

    SciTech Connect (OSTI)

    ELGIN, J.C.

    2000-10-02

    This report documents the technical basis for the Spent Nuclear Fuel (SNF) Program radiation and contamination trending program. The program consists of standardized radiation and contamination surveys of the KE Basin, radiation surveys of the KW basin, radiation surveys of the Cold Vacuum Drying Facility (CVD), and radiation surveys of the Canister Storage Building (CSB) with the associated tracking. This report also discusses the remainder of radiological areas within the SNFP that do not have standardized trending programs and the basis for not having this program in those areas.

  14. Electron Anomalous Magnetic Moment in Basis Light-Front Quantization Approach

    SciTech Connect (OSTI)

    Zhao, Xingbo; Honkanen, Heli; Maris, Pieter; Vary, James P.; Brodsky, Stanley J.; /SLAC

    2012-02-17

    We apply the Basis Light-Front Quantization (BLFQ) approach to the Hamiltonian field theory of Quantum Electrodynamics (QED) in free space. We solve for the mass eigenstates corresponding to an electron interacting with a single photon in light-front gauge. Based on the resulting non-perturbative ground state light-front amplitude we evaluate the electron anomalous magnetic moment. The numerical results from extrapolating to the infinite basis limit reproduce the perturbative Schwinger result with relative deviation less than 1.2%. We report significant improvements over previous works including the development of analytic methods for evaluating the vertex matrix elements of QED.

  15. Analytic matrix elements for the two-electron atomic basis with logarithmic terms

    SciTech Connect (OSTI)

    Liverts, Evgeny Z.; Barnea, Nir

    2014-08-01

    The two-electron problem for helium-like atoms in the S-state is considered. The basis containing integer powers of ln r, where r is a radial variable of the Fock expansion, is studied. In this basis, analytic expressions for the matrix elements of the corresponding Hamiltonian are presented. These expressions include only elementary and special functions, which enables very fast and accurate computation of the matrix elements. The decisive contribution of the correct logarithmic terms to the behavior of the two-electron wave function in the vicinity of the triple-coalescence point is reaffirmed.

  16. Spatio-spectral image analysis using classical and neural algorithms

    SciTech Connect (OSTI)

    Roberts, S.; Gisler, G.R.; Theiler, J.

    1996-12-31

    Remote imaging at high spatial resolution has a number of environmental, industrial, and military applications. Analysis of high-resolution multi-spectral images usually involves either spectral analysis of single pixels in a multi- or hyper-spectral image or spatial analysis of multi-pixels in a panchromatic or monochromatic image. Although insufficient for some pattern recognition applications individually, the combination of spatial and spectral analytical techniques may allow the identification of more complex signatures that might not otherwise be manifested in the individual spatial or spectral domains. We report on some preliminary investigation of unsupervised classification methodologies (using both "classical" and "neural" algorithms) to identify potentially revealing features in these images. We apply dimension-reduction preprocessing to the images, cluster, and compare the clusterings obtained by different algorithms. Our classification results are analyzed both visually and with a suite of objective, quantitative measures.
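The abstract does not give the exact pipeline; as a rough illustration of the dimension-reduction-plus-clustering idea (synthetic data, with plain PCA and k-means standing in for the paper's "classical" methods), a minimal Python sketch:

```python
import numpy as np

def pca_reduce(pixels, n_components):
    """Project pixel spectra onto their top principal components."""
    centered = pixels - pixels.mean(axis=0)
    cov = np.cov(centered, rowvar=False)          # spectral covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return centered @ top

def kmeans(data, k, n_iter=50, seed=0):
    """Plain k-means clustering; returns one cluster label per pixel."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):               # guard against empty clusters
                centers[j] = data[labels == j].mean(axis=0)
    return labels

# Two synthetic "materials" with well-separated 8-band spectra, 100 pixels each
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.1, (100, 8))
b = rng.normal(1.0, 0.1, (100, 8))
pixels = np.vstack([a, b])
labels = kmeans(pca_reduce(pixels, 2), k=2)
```

Different clusterings (e.g., from a "neural" self-organizing map) could then be compared on the same reduced data, as the abstract describes.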

  17. Parallel Algorithms for Graph Optimization using Tree Decompositions

    SciTech Connect (OSTI)

    Sullivan, Blair D; Weerapurage, Dinesh P; Groer, Christopher S

    2012-06-01

    Although many NP-hard graph optimization problems can be solved in polynomial time on graphs of bounded tree-width, the adoption of these techniques into mainstream scientific computation has been limited due to the high memory requirements of the necessary dynamic programming tables and excessive runtimes of sequential implementations. This work addresses both challenges by proposing a set of new parallel algorithms for all steps of a tree decomposition-based approach to solve the maximum weighted independent set problem. A hybrid OpenMP/MPI implementation includes a highly scalable parallel dynamic programming algorithm leveraging the MADNESS task-based runtime, and computational results demonstrate scaling. This work enables a significant expansion of the scale of graphs on which exact solutions to maximum weighted independent set can be obtained, and forms a framework for solving additional graph optimization problems with similar techniques.
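The dynamic-programming tables are easiest to see in the width-1 special case, where the input graph is itself a tree. A minimal sketch of the two-entry recurrence for maximum weighted independent set (illustrative only; the paper's tree-decomposition generality and parallel OpenMP/MPI machinery are not reproduced):

```python
def mwis_tree(adj, weights, root=0):
    """Maximum weighted independent set on a tree, by bottom-up dynamic programming.

    For each vertex v the table keeps two entries:
      take[v]: best IS weight in v's subtree if v is in the set
      skip[v]: best IS weight in v's subtree if v is not in the set
    """
    # Iterative pre-order walk to fix a processing order (avoids recursion limits)
    order, stack, parent = [], [root], {root: None}
    while stack:
        v = stack.pop()
        order.append(v)
        for u in adj[v]:
            if u != parent[v]:
                parent[u] = v
                stack.append(u)
    take, skip = {}, {}
    for v in reversed(order):            # children are processed before their parent
        take[v] = weights[v]
        skip[v] = 0
        for u in adj[v]:
            if parent.get(u) == v:
                take[v] += skip[u]                   # neighbors of a chosen vertex are excluded
                skip[v] += max(take[u], skip[u])
    return max(take[root], skip[root])

# Path 0-1-2-3: the optimum picks vertices 1 and 3 (weight 4 + 5 = 9)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
weights = {0: 1, 1: 4, 2: 2, 3: 5}
best = mwis_tree(adj, weights)
```

On a general tree decomposition the same take/skip idea becomes a table indexed by subsets of each bag, which is where the memory pressure the abstract mentions comes from.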

  18. A spectral unaveraged algorithm for free electron laser simulations

    SciTech Connect (OSTI)

    Andriyash, I.A.; Lehe, R.; Malka, V.

    2015-02-01

    We propose and discuss a numerical method to model electromagnetic emission from oscillating relativistic charged particles and its coherent amplification. The developed technique is well suited for free electron laser simulations, but it may also be useful for a wider range of physical problems involving resonant field-particle interactions. The algorithm integrates the unaveraged coupled equations for the particles and the electromagnetic fields in a discrete spectral domain. Using this algorithm, it is possible to perform full three-dimensional or axisymmetric simulations of short-wavelength amplification. In this paper we describe the method and its implementation, and we present examples of free electron laser simulations, comparing the results with those from widely used free electron laser codes.

  19. Optimized Algorithm for Collision Probability Calculations in Cubic Geometry

    SciTech Connect (OSTI)

    Garcia, R.D.M.

    2004-06-15

    An optimized algorithm for implementing a recently developed method of computing collision probabilities (CPs) in three dimensions is reported in this work for the case of a homogeneous cube. Use is made of the geometrical regularity of the domain to rewrite, in a very compact way, the approximate formulas for calculating CPs in general three-dimensional geometry that were derived in a previous work by the author. The ensuing gain in computation time is found to be substantial: While the computation time associated with the general formulas increases as K{sup 2}, where K is the number of elements used in the calculation, that of the specific formulas increases only linearly with K. Accurate numerical results are given for several test cases, and an extension of the algorithm for computing the self-collision probability for a hexahedron is reported at the end of the work.
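The K² → K reduction rests on the geometric regularity of the cube: for a uniform mesh, a translation-invariant pair quantity depends only on the offset between elements, so only O(K) distinct values need be computed rather than one per ordered pair. A toy Python check of that counting argument (not the paper's actual collision-probability formulas):

```python
from itertools import product

import numpy as np

m = 4                                    # a 4 x 4 x 4 mesh of the cube: K = 64 elements
cells = np.array(list(product(range(m), repeat=3)))
K = len(cells)

# A translation-invariant pair quantity p(i, j) = f(r_i - r_j) takes one value
# per distinct offset vector, not one per ordered pair of elements.
offsets = {tuple(a - b) for a in cells for b in cells}
```

Here there are (2m-1)³ = 343 distinct offsets against K² = 4096 ordered pairs, and the gap widens linearly vs. quadratically as the mesh is refined.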

  20. APPLICATION OF NEURAL NETWORK ALGORITHMS FOR BPM LINEARIZATION

    SciTech Connect (OSTI)

    Musson, John C.; Seaton, Chad; Spata, Mike F.; Yan, Jianxun

    2012-11-01

    Stripline BPM sensors contain inherent non-linearities, as a result of field distortions from the pickup elements. Many methods have been devised to facilitate corrections, often employing polynomial fitting. The cost of computation makes real-time correction difficult, particularly when integer math is utilized. The application of neural-network technology, particularly the multi-layer perceptron algorithm, is proposed as an efficient alternative for electrode linearization. A process of supervised learning is initially used to determine the weighting coefficients, which are subsequently applied to the incoming electrode data. A non-linear layer, known as an "activation layer," is responsible for the removal of saturation effects. Implementation of a perceptron in an FPGA-based software-defined radio (SDR) is presented, along with performance comparisons. In addition, efficient calculation of the sigmoidal activation function via the CORDIC algorithm is presented.
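As a hedged illustration of the supervised-learning step (a toy one-dimensional "electrode" nonlinearity and a small NumPy perceptron; the actual BPM geometry, FPGA implementation, and CORDIC details are not modeled here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "electrode" nonlinearity: the true position x is distorted to u = x + 0.3 x^3.
# Supervised learning determines weights that recover the inverse map u -> x.
x_true = rng.uniform(-1, 1, (256, 1))
u_meas = x_true + 0.3 * x_true**3

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoidal units, linear output
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

losses, lr = [], 0.3
for _ in range(3000):
    h = sigmoid(u_meas @ W1 + b1)            # hidden "activation layer"
    y = h @ W2 + b2                          # predicted position
    err = y - x_true
    losses.append(float(np.mean(err**2)))
    # Backpropagation of the mean-squared error
    gy = 2 * err / len(u_meas)
    gW2, gb2 = h.T @ gy, gy.sum(axis=0)
    gh = (gy @ W2.T) * h * (1 - h)
    gW1, gb1 = u_meas.T @ gh, gh.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Once trained, only the fixed matrix-vector products and sigmoids run online, which is the property that makes an FPGA implementation attractive.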

  1. Invariant patterns in crystal lattices: Implications for protein folding algorithms

    SciTech Connect (OSTI)

    Hart, W.E.; Istrail, S.

    1995-12-11

    Crystal lattices are infinite periodic graphs that occur naturally in a variety of geometries and which are of fundamental importance in polymer science. Discrete models of protein folding use crystal lattices to define the space of protein conformations. Because various crystal lattices provide discretizations of the same physical phenomenon, it is reasonable to expect that there will exist "invariants" across lattices that define fundamental properties of the protein folding process; an invariant defines a property that transcends particular lattice formulations. This paper identifies two classes of invariants, defined in terms of sublattices, that are related to the design of algorithms for the structure prediction problem. The first class of invariants is used to define a master approximation algorithm for which provable performance guarantees exist. This algorithm can be applied to generalizations of the hydrophobic-hydrophilic model that have lattices other than the cubic lattice, including most of the crystal lattices commonly used in protein folding lattice models. The second class of invariants applies to a related lattice model. Using these invariants, we show that for this model the structure prediction problem is intractable across a variety of three-dimensional lattices. It turns out that these two classes of invariants are respectively sublattices of the two- and three-dimensional square lattice. As the square lattices are the standard lattices used in empirical protein folding studies, our results provide a rigorous confirmation of the ability of these lattices to provide insight into biological phenomena. Our results are the first in the literature that identify algorithmic paradigms for the protein structure prediction problem which transcend particular lattice formulations.

  2. High-Resolution Computational Algorithms for Simulating Offshore Wind Farms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sandia Energy page on high-resolution computational algorithms for simulating offshore wind farms.

  3. ANL CT Reconstruction Algorithm for Utilizing Digital X-ray

    Energy Science and Technology Software Center (OSTI)

    2004-05-01

    Reconstructs X-ray computed tomographic images from large data sets, known as 16-bit binary sinograms, on a massively parallel computer architecture such as a Beowulf cluster by parallelizing the X-ray CT reconstruction routine. The algorithm uses the concept of generating an image from carefully obtained multiple 1-D or 2-D X-ray projections. The individual projections are filtered using a digital Fast Fourier Transform. The literature refers to this as filtered back projection.
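Filtered back projection itself can be sketched compactly: ramp-filter each 1-D projection in the Fourier domain, then smear the filtered projections back across the image grid and sum. A minimal serial Python illustration on a point-like synthetic sinogram (the package's parallel 16-bit pipeline is not reproduced):

```python
import numpy as np

def ramp_filter(sinogram):
    """Apply the ramp filter to each 1-D projection via FFT (the 'filtered' step)."""
    n = sinogram.shape[1]
    freqs = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * freqs, axis=1))

def back_project(filtered, angles):
    """Smear each filtered projection back across the image plane and sum."""
    n = filtered.shape[1]
    xs = np.arange(n) - n / 2
    X, Y = np.meshgrid(xs, xs)
    image = np.zeros((n, n))
    for proj, theta in zip(filtered, angles):
        # Detector coordinate of every pixel for this viewing angle
        t = X * np.cos(theta) + Y * np.sin(theta) + n / 2
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        image += proj[idx]
    return image * np.pi / (2 * len(angles))

# Sinogram of a centered point-like object: a spike at the detector center at every angle
n, n_views = 64, 90
angles = np.linspace(0, np.pi, n_views, endpoint=False)
sinogram = np.zeros((n_views, n))
sinogram[:, n // 2] = 1.0
recon = back_project(ramp_filter(sinogram), angles)
```

The per-view loop is embarrassingly parallel, which is what makes the cluster parallelization the abstract describes natural.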

  4. Physics-based signal processing algorithms for micromachined cantilever arrays

    DOE Patents [OSTI]

    Candy, James V; Clague, David S; Lee, Christopher L; Rudd, Robert E; Burnham, Alan K; Tringe, Joseph W

    2013-11-19

    A method of using physics-based signal processing algorithms for micromachined cantilever arrays. The methods utilize deflection of a micromachined cantilever that represents the chemical, biological, or physical element being detected. One embodiment of the method comprises the steps of modeling the deflection of the micromachined cantilever producing a deflection model, sensing the deflection of the micromachined cantilever and producing a signal representing the deflection, and comparing the signal representing the deflection with the deflection model.
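A hedged sketch of the compare-signal-to-model step (a hypothetical exponential step-response deflection model and a simple normalized-correlation comparison; the patent's actual deflection physics and detection statistic are not specified in this summary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical deflection model: exponential relaxation toward a new equilibrium
t = np.linspace(0, 1, 200)
model = 1 - np.exp(-t / 0.1)

def detect(signal, template, threshold=0.8):
    """Compare a measured deflection against the model via normalized correlation."""
    s = (signal - signal.mean()) / signal.std()
    m = (template - template.mean()) / template.std()
    score = float(np.mean(s * m))
    return score, score > threshold

present = model + 0.05 * rng.normal(size=t.size)    # deflection consistent with the model
absent = 0.05 * rng.normal(size=t.size)             # noise only, no analyte
```

A measured trace that matches the modeled deflection scores near 1, while pure noise scores near 0 and falls below the threshold.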

  5. A fast contour descriptor algorithm for supernova image classification

    SciTech Connect (OSTI)

    Aragon, Cecilia R.; Aragon, David Bradburn

    2006-07-16

    We describe a fast contour descriptor algorithm and its application to a distributed supernova detection system (the Nearby Supernova Factory) that processes 600,000 candidate objects in 80 GB of image data per night. Our shape-detection algorithm reduced the number of false positives generated by the supernova search pipeline by 41% while producing no measurable impact on running time. Fourier descriptors are an established method of numerically describing the shapes of object contours, but transform-based techniques are ordinarily avoided in this type of application due to their computational cost. We devised a fast contour descriptor implementation for supernova candidates that meets the tight processing budget of the application. Using the lowest-order descriptors (F1 and F-1) and the total variance in the contour, we obtain one feature representing the eccentricity of the object and another denoting its irregularity. Because the number of Fourier terms to be calculated is fixed and small, the algorithm runs in linear time, rather than the O(n log n) time of an FFT. Constraints on object size allow further optimizations so that the total cost of producing the required contour descriptors is about 4n addition/subtraction operations, where n is the length of the contour.
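The two lowest-order descriptors can be computed directly from the contour in O(n), without an FFT, since only two Fourier terms are needed. A minimal Python sketch (the paper's exact normalization, fixed-point optimizations, and total-variance feature may differ):

```python
import numpy as np

def low_order_descriptors(contour):
    """Return the lowest-order Fourier descriptors F1 and F-1 of a closed contour.

    The contour is an (n, 2) array of boundary points, treated as complex
    numbers z = x + iy; only two DFT terms are evaluated, so the cost is O(n).
    """
    z = contour[:, 0] + 1j * contour[:, 1]
    z = z - z.mean()                       # translation invariance
    n = len(z)
    k = np.arange(n)
    f_pos = np.sum(z * np.exp(-2j * np.pi * k / n)) / n   # F_1
    f_neg = np.sum(z * np.exp(+2j * np.pi * k / n)) / n   # F_-1
    return f_pos, f_neg

def eccentricity(contour):
    """|F-1| / |F1| ratio: 0 for a circle traversed once, larger for ellipses."""
    f_pos, f_neg = low_order_descriptors(contour)
    return abs(f_neg) / abs(f_pos)

t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])
```

For a 2:1 ellipse, z = 1.5 e^{it} + 0.5 e^{-it}, so the ratio comes out to exactly 1/3, while a circle gives 0.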

  6. GMG: A Guaranteed, Efficient Global Optimization Algorithm for Remote Sensing.

    SciTech Connect (OSTI)

    D'Helon, CD

    2004-08-18

    The monocular passive ranging (MPR) problem in remote sensing consists of identifying the precise range of an airborne target (missile, plane, etc.) from its observed radiance. This inverse problem may be set as a global optimization problem (GOP) whereby the difference between the observed and model predicted radiances is minimized over the possible ranges and atmospheric conditions. Using additional information about the error function between the predicted and observed radiances of the target, we developed GMG, a new algorithm to find the Global Minimum with a Guarantee. The new algorithm transforms the original continuous GOP into a discrete search problem, thereby guaranteeing to find the position of the global minimum in a reasonably short time. The algorithm is first applied to the golf course problem, which serves as a litmus test for its performance in the presence of both complete and degraded additional information. GMG is further assessed on a set of standard benchmark functions and then applied to various realizations of the MPR problem.
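The continuous-to-discrete transformation can be illustrated with the classic Lipschitz argument: if the error function has Lipschitz constant L, sampling on a grid of spacing h guarantees that the best sample is within Lh/2 of the global minimum. A toy one-dimensional Python sketch (not the GMG algorithm itself, whose details the abstract does not give):

```python
import numpy as np

def guaranteed_minimize(f, a, b, lipschitz, tol):
    """Grid search with a Lipschitz guarantee on [a, b].

    With spacing h = 2*tol/lipschitz, some grid point lies within h/2 of the
    true minimizer, so the best sampled value is within lipschitz*h/2 = tol
    of the global minimum.
    """
    h = 2 * tol / lipschitz
    xs = np.arange(a, b + h, h)
    vals = np.array([f(x) for x in xs])
    i = int(np.argmin(vals))
    return xs[i], vals[i]

# f(x) = (x - 0.3)^2 has |f'| <= 2 on [0, 1]
x_best, f_best = guaranteed_minimize(lambda x: (x - 0.3) ** 2, 0.0, 1.0, 2.0, 0.01)
```

The guarantee comes from the bound on how fast f can vary between samples, which is the kind of "additional information about the error function" the abstract refers to.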

  7. Design-Basis Flood Estimation for Site Characterization at Nuclear Power Plants in the United States of America

    SciTech Connect (OSTI)

    Prasad, Rajiv; Hibler, Lyle F.; Coleman, Andre M.; Ward, Duane L.

    2011-11-01

    The purpose of this document is to describe approaches and methods for estimation of the design-basis flood at nuclear power plant sites. Chapter 1 defines the design-basis flood and lists the U.S. Nuclear Regulatory Commission's (NRC) regulations that require estimation of the design-basis flood. For comparison, the design-basis flood estimation methods used by other Federal agencies are also described. A brief discussion of the recommendations of the International Atomic Energy Agency for estimation of the design-basis floods in its member States is also included.

  8. An efficient algorithm for incompressible N-phase flows

    SciTech Connect (OSTI)

    Dong, S.

    2014-11-01

    We present an efficient algorithm within the phase field framework for simulating the motion of a mixture of N (N ≥ 2) immiscible incompressible fluids, with possibly very different physical properties such as densities, viscosities, and pairwise surface tensions. The algorithm employs a physical formulation for the N-phase system that honors the conservations of mass and momentum and the second law of thermodynamics. We present a method for uniquely determining the mixing energy density coefficients involved in the N-phase model based on the pairwise surface tensions among the N fluids. Our numerical algorithm has several attractive properties that make it computationally very efficient: (i) it has completely de-coupled the computations for different flow variables, and has also completely de-coupled the computations for the (N - 1) phase field functions; (ii) the algorithm only requires the solution of linear algebraic systems after discretization, and no nonlinear algebraic solve is needed; (iii) for each flow variable the linear algebraic system involves only constant and time-independent coefficient matrices, which can be pre-computed during pre-processing, despite the variable density and variable viscosity of the N-phase mixture; (iv) within a time step the semi-discretized system involves only individual de-coupled Helmholtz-type (including Poisson) equations, despite the strongly coupled phase field system of fourth spatial order at the continuum level; (v) the algorithm is suitable for large density contrasts and large viscosity contrasts among the N fluids. Extensive numerical experiments have been presented for several problems involving multiple fluid phases, large density contrasts and large viscosity contrasts. In particular, we compare our simulations with the de Gennes theory, and demonstrate that our method produces physically accurate results for multiple fluid phases. 
We also demonstrate the significant and sometimes dramatic effects of the gravity, density ratios, pairwise surface tensions, and drop sizes on the N-phase configurations and dynamics. The numerical results show that the method developed herein is capable of dealing with N-phase systems with large density ratios, large viscosity ratios, and pairwise surface tensions, and that it can be a powerful tool for studying the interactions among multiple types of fluid interfaces.
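The claim that each time step reduces to constant-coefficient Helmholtz-type solves with pre-computable operators can be illustrated in one dimension: a constant-coefficient Helmholtz operator is diagonal in Fourier space, so its inverse is computed once and applied with FFTs at every step. A minimal periodic-domain Python sketch (illustrative only, not the paper's discretization):

```python
import numpy as np

def helmholtz_periodic(f, lam, L=2 * np.pi):
    """Solve u'' - lam*u = f on a periodic 1-D grid with the FFT.

    A constant-coefficient Helmholtz operator is diagonal in Fourier space,
    so its symbol (-k^2 - lam) can be precomputed once and reused each step.
    """
    n = len(f)
    k = np.fft.fftfreq(n, d=L / (2 * np.pi * n))   # integer wavenumbers for period L
    symbol = -(k**2) - lam
    return np.real(np.fft.ifft(np.fft.fft(f) / symbol))

# Manufactured solution: u = cos(x)  =>  u'' - 2u = -3 cos(x)
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = helmholtz_periodic(-3 * np.cos(x), lam=2.0)
```

Because the symbol does not change between time steps, the cost per step is just a handful of FFTs per decoupled variable, which is the source of the efficiency the abstract emphasizes.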

  9. Enhanced Algorithm for Traceability Measurements in UF6 Flow Pipe

    SciTech Connect (OSTI)

    Copinger, Thomas E; March-Leuba, Jose A; Upadhyaya, Belle R

    2007-01-01

    The Blend Down Monitoring System (BDMS) is used to continually assess the mixing and downblending of highly enriched uranium (HEU) with low-enriched uranium (LEU). This is accomplished by measuring the enrichment and the fissile mass flow rate of the UF{sub 6} gas located in each process pipe of the system by inducing the fission of the {sup 235}U contained in the gas. Measurements are taken along this process route to trace the HEU content all the way to the product stream, ensuring that HEU was down blended. A problem associated with the current traceability measuring algorithm is that it does not account for the time-varying background that is introduced to the system by the movement of the shutter located at the HEU leg of the process. The current way of dealing with that problem is to discard the data for periods when the HEU shutter is open (50% of overall data) because it correlates with the same timeframe in which the direct contribution to background from the HEU shutter was seen. The advanced algorithm presented in this paper allows for continuous measurement of traceability (100%) by accurately accounting for the varying background during the shutter-movement cycle. This algorithm utilizes advanced processing techniques that identify and discriminate the different sources of background radiation, instead of grouping them into one background group for the whole measurement cycle. Because the traceability statistics are then built from the full data set rather than half of it, the overall usefulness of these measurements in the BDMS improves. The effectiveness of the new algorithm was determined by modeling it in a simulation and ensuring that it retained its integrity through a large number of runs, including various shutter-failure conditions. Each run was performed with varying amounts of background radiation from each individual source and with varying traceability counts. 
The simulations documented in this paper prove that the algorithm can stand up to various transients introduced into the system, such as failure of shutter movement.

  10. Margin of Safety Definition and Examples Used in Safety Basis Documents and the USQ Process

    SciTech Connect (OSTI)

    Beaulieu, R. A.

    2013-10-03

    The Nuclear Safety Management final rule, 10 CFR 830, provides an undefined term, margin of safety (MOS). Safe harbors listed in 10 CFR 830, Table 2, such as DOE-STD-3009, use but do not define the term. This lack of definition has created the need for one. This paper provides a definition of MOS and documents examples of MOS as applied in a U.S. Department of Energy (DOE) approved safety basis for an existing nuclear facility. Understanding what MOS looks like for Technical Safety Requirements (TSR) parameters helps in comparing against parameters that do not involve an MOS. This paper also documents parameters that are not MOS. These criteria could be used to determine whether an MOS exists in safety basis documents. This paper helps DOE, including the National Nuclear Security Administration (NNSA), and its contractors responsible for the safety basis improve safety basis documents and the unreviewed safety question (USQ) process with respect to MOS.

  11. CRAD, Safety Basis- Oak Ridge National Laboratory High Flux Isotope Reactor

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2007 assessment of the Safety Basis in preparation for restart of the Oak Ridge National Laboratory High Flux Isotope Reactor.

  12. CRAD, Safety Basis- Oak Ridge National Laboratory TRU ALPHA LLWT Project

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a November 2003 assessment of the Safety Basis portion of an Operational Readiness Review of the Oak Ridge National Laboratory TRU ALPHA LLWT Project.

  13. CRAD, Safety Basis- Los Alamos National Laboratory TA 55 SST Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for an assessment of the Safety Basis at the Los Alamos National Laboratory TA 55 SST Facility.

  14. CRAD, Safety Basis- Los Alamos National Laboratory Waste Characterization, Reduction, and Repackaging Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for an assessment of the Safety Basis portion of an Operational Readiness Review at the Los Alamos National Laboratory Waste Characterization, Reduction, and Repackaging Facility.

  15. CRAD, Safety Basis- Oak Ridge National Laboratory High Flux Isotope Reactor Contractor ORR

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2007 assessment of the Safety Basis portion of an Operational Readiness Review of the Oak Ridge National Laboratory High Flux Isotope Reactor.

  16. CRAD, Safety Basis- Y-12 Enriched Uranium Operations Oxide Conversion Facility

    Broader source: Energy.gov [DOE]

    A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a January 2005 assessment of the Safety Basis at the Y-12 Enriched Uranium Operations Oxide Conversion Facility.

  17. Tank Waste Remediation System (TWRS) Retrieval Authorization Basis Amendment Task Plan

    SciTech Connect (OSTI)

    HARRIS, J.P.

    1999-08-31

    This task plan is a documented agreement between Nuclear Safety and Licensing and Retrieval Engineering. The purpose of this task plan is to identify the scope of work, tasks and deliverables, responsibilities, manpower, and schedules associated with an authorization basis amendment as a result of the Waste Feed Delivery Program, Project W-211, Project W-521, and Project W-522.

  18. Tank Waste Remediation System (TWRS) Retrieval Authorization Basis Amendment Task Plan

    SciTech Connect (OSTI)

    HARRIS, J.P.

    2000-03-27

    This task plan is a documented agreement between Nuclear Safety and Licensing and Retrieval Engineering. The purpose of this task plan is to identify the scope of work, tasks and deliverables, responsibilities, manpower, and schedules associated with an authorization basis amendment as a result of the Waste Feed Delivery Program, Project W-211, Project W-521, and Project W-522.

  19. Sensitivity of the Properties of Ruthenium Blue Dimer to Method, Basis Set, and Continuum Model

    SciTech Connect (OSTI)

    Ozkanlar, Abdullah; Clark, Aurora E.

    2012-05-23

    The ruthenium blue dimer [(bpy){sub 2}Ru{sup III}OH{sub 2}]{sub 2}O{sup 4+} is best known as the first well-defined molecular catalyst for water oxidation. It has been the subject of numerous computational studies, primarily employing density functional theory. However, those studies have been limited in the functionals, basis sets, and continuum models employed. The controversy in the calculated electronic structure and reaction energetics of this catalyst highlights the necessity of benchmark calculations that explore the role of density functionals, basis sets, and continuum models upon the essential features of blue-dimer reactivity. In this paper, we report Kohn-Sham complete basis set (KS-CBS) limit extrapolations of the electronic structure of the blue dimer using GGA (BPW91 and BP86), hybrid-GGA (B3LYP), and meta-GGA (M06-L) density functionals. The dependence of solvation free energy corrections on the different cavity types (UFF, UA0, UAHF, UAKS, Bondi, and Pauling) within the polarizable and conductor-like polarizable continuum models has also been investigated. The most common basis sets of double-zeta quality are shown to yield results close to the KS-CBS limit; however, large variations are observed in the reaction energetics as a function of the density functional and continuum cavity model employed.

  20. MODEL AND ALGORITHM EVALUATION FOR THE HYBRID UF6 CONTAINER INSPECTION SYSTEM

    SciTech Connect (OSTI)

    McDonald, Benjamin S.; Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; Smith, Leon E.; Wittman, Richard S.

    2011-06-14

    Pacific Northwest National Laboratory (PNNL) is developing an automated UF6 cylinder verification station concept based on the combined collection of traditional enrichment-meter (186 keV photons from U-235) data and non-traditional, neutron-induced, high-energy gamma-signatures (3-8.5 MeV) with an array of collimated, medium-resolution scintillators. Previous (2010) work at PNNL demonstrated proof-of-principle that this hybrid method yields accurate, full-volume assay of the cylinder enrichment, reduces systematic errors when compared to several other enrichment assay methods, and provides simplified instrumentation and algorithms suitable for long-term unattended operations. We used Monte Carlo modeling with MCNP5 to support system design (e.g., number and configuration of detector arrays, and design of iron/poly collimators for enhanced (n,γ) conversion) and enrichment algorithm development. We developed a first-generation modeling framework in 2010. These tools have since been expanded, refined and benchmarked against field measurements with a prototype system of a 30B cylinder population (0.2 to 4.95 weight % U-235). The MCNP5 model decomposes the radiation transport problem into a linear superposition of “basis spectra” representing contributions from the different uranium isotopes and gamma-ray generation mechanisms (e.g. neutron capture). This scheme accommodates fast generation of “virtual assay signatures” for arbitrary enrichment, material age, and fill variations. Ongoing (FY-2011) refinements to the physics model include accounting for generation of bremsstrahlung photons, arising primarily from the beta decay of Pa-234m, a U-238 daughter. We are using the refined model to optimize collimator design for the hybrid method.
The traditional assay method benefits from a high degree of collimation (to isolate each detector’s field-of-view) and relatively small detector area, while the non-traditional method benefits from a wide field-of-view, i.e. less collimation and larger detectors. We implement the enrichment-meter method by applying a square-wave digital filter to a raw spectrum and extracting the 186-keV peak area directly from the convolute spectrum. Ongoing enhancements to this approach include mitigating a systematic peak-area measurement deficit arising from curvature in the spectrum continuum shape. An optimized system prototype based on model results is utilized in a new set of 2011 field measurements, and model and measurement enrichment assay uncertainties are compared.
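    The square-wave digital filter step can be illustrated with a toy spectrum; the kernel below is a generic zero-area square wave (a positive central lobe flanked by negative side lobes), not necessarily the exact filter parameters used at PNNL:

```python
import numpy as np

def square_wave_filter(spectrum, width):
    """Convolve a spectrum with a zero-area square-wave kernel: a positive
    central lobe flanked by negative side lobes, so a flat continuum cancels
    while an isolated peak produces a positive response."""
    kernel = np.concatenate([-np.ones(width), 2 * np.ones(width), -np.ones(width)])
    return np.convolve(spectrum, kernel, mode="same")

# Toy spectrum: flat continuum of 100 counts with a peak at channel 50.
spec = np.full(101, 100.0)
spec[48:53] += [10, 40, 80, 40, 10]
filtered = square_wave_filter(spec, width=5)
# The filter output is ~0 on the flat continuum and positive at the peak,
# which is what lets the peak area be extracted from the convolved spectrum.
```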

  1. Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.

    SciTech Connect (OSTI)

    Thompson, Aidan P.; Schultz, Peter A.; Crozier, Paul; Moore, Stan Gerald; Swiler, Laura Painton; Stephens, John Adam; Trott, Christian Robert; Foiles, Stephen M.; Tucker, Garritt J.

    2014-09-01

    This report summarizes the results of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called the Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO{sub 2}).
We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel computers and advanced processor architectures. Finally, we briefly describe the MSM method for efficient calculation of electrostatic interactions on massively parallel computers.
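    The weighted least-squares regression used to determine the SNAP coefficients can be sketched in a few lines; the descriptor matrix here is random stand-in data rather than actual bispectrum components, and all names are hypothetical:

```python
import numpy as np

# Minimal sketch of the weighted least-squares step used to fit linear
# potential coefficients to quantum training data (illustrative only).
rng = np.random.default_rng(0)
n_configs, n_coeffs = 50, 4
A = rng.normal(size=(n_configs, n_coeffs))      # descriptors per configuration
beta_true = np.array([1.0, -2.0, 0.5, 3.0])     # "true" coefficients
y = A @ beta_true                               # training energies (noise-free here)
w = rng.uniform(0.5, 2.0, size=n_configs)       # per-configuration weights

# Solve min_beta || diag(sqrt(w)) (A beta - y) ||^2 by scaling both sides.
sw = np.sqrt(w)[:, None]
beta_fit, *_ = np.linalg.lstsq(sw * A, np.sqrt(w) * y, rcond=None)
```

    With noise-free training data the weighted fit recovers the generating coefficients exactly; with real QM data the weights control which configurations (e.g., near-equilibrium vs. highly strained) dominate the fit.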

  2. Correlation consistent basis sets for actinides. I. The Th and U atoms

    SciTech Connect (OSTI)

    Peterson, Kirk A.

    2015-02-21

    New correlation consistent basis sets based on both pseudopotential (PP) and all-electron Douglas-Kroll-Hess (DKH) Hamiltonians have been developed from double- to quadruple-zeta quality for the actinide atoms thorium and uranium. Sets for valence electron correlation (5f6s6p6d), cc-pVnZ-PP and cc-pVnZ-DK3, as well as outer-core correlation (valence + 5s5p5d), cc-pwCVnZ-PP and cc-pwCVnZ-DK3, are reported (n = D, T, Q). The -PP sets are constructed in conjunction with small-core, 60-electron PPs, while the -DK3 sets utilize the 3rd-order Douglas-Kroll-Hess scalar relativistic Hamiltonian. Both series of basis sets show systematic convergence towards the complete basis set limit, both at the Hartree-Fock and correlated levels of theory, making them amenable to standard basis set extrapolation techniques. To assess the utility of the new basis sets, extensive coupled cluster composite thermochemistry calculations of ThF{sub n} (n = 2-4), ThO{sub 2}, and UF{sub n} (n = 4-6) have been carried out. After accurately accounting for valence and outer-core correlation, spin-orbit coupling, and even Lamb shift effects, the final 298 K atomization enthalpies of ThF{sub 4}, ThF{sub 3}, ThF{sub 2}, and ThO{sub 2} are all within their experimental uncertainties. Bond dissociation energies of ThF{sub 4} and ThF{sub 3}, as well as UF{sub 6} and UF{sub 5}, were similarly accurate. The derived enthalpies of formation for these species also showed a very satisfactory agreement with experiment, demonstrating that the new basis sets allow for the use of accurate composite schemes just as in molecular systems composed only of lighter atoms. The differences between the PP and DK3 approaches were found to increase with the change in formal oxidation state on the actinide atom, approaching 5-6 kcal/mol for the atomization enthalpies of ThF{sub 4} and ThO{sub 2}.
The DKH3 atomization energy of ThO{sub 2} was calculated to be smaller than the DKH2 value by ∼1 kcal/mol.
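    Systematic convergence with the cardinal number n is what makes standard extrapolation usable; a common two-point scheme (assuming correlation energies behave as E(n) = E_CBS + A/n^3, which may differ from the exact composite scheme used in the paper) looks like:

```python
def cbs_two_point(e_small, n_small, e_large, n_large):
    """Two-point complete-basis-set extrapolation assuming
    E(n) = E_CBS + A * n**-3, where n is the cardinal number of the
    basis (D=2, T=3, Q=4). A standard Helgaker-style formula."""
    a = (e_small - e_large) / (n_small**-3 - n_large**-3)
    return e_large - a * n_large**-3

# Synthetic energies that follow the model exactly (hartree-scale toy values):
e_cbs_true, a_true = -1.0, 0.1
e3 = e_cbs_true + a_true * 3**-3   # "triple-zeta" energy
e4 = e_cbs_true + a_true * 4**-3   # "quadruple-zeta" energy
e_cbs = cbs_two_point(e3, 3, e4, 4)  # recovers e_cbs_true
```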

  3. Modeling and Algorithmic Approaches to Constitutively-Complex, Microstructured Fluids

    SciTech Connect (OSTI)

    Miller, Gregory H.; Forest, Gregory

    2011-12-22

    We present a new multiscale model for complex fluids based on three scales: microscopic, kinetic, and continuum. We choose the microscopic level as Kramers' bead-rod model for polymers, which we describe as a system of stochastic differential equations with an implicit constraint formulation. The associated Fokker-Planck equation is then derived, and adiabatic elimination removes the fast momentum coordinates. Approached in this way, the kinetic level reduces to a dispersive drift equation. The continuum level is modeled with a finite volume Godunov-projection algorithm. We demonstrate computation of viscoelastic stress divergence using this multiscale approach.

  4. Faith in the algorithm, part 1: beyond the turing test

    SciTech Connect (OSTI)

    Rodriguez, Marko A; Pepe, Alberto

    2009-01-01

    Since the Turing test was first proposed by Alan Turing in 1950, the goal of artificial intelligence has been predicated on the ability of computers to imitate human intelligence. However, the majority of uses for the computer fall outside the domain of human abilities, and it is exactly outside of this domain that computers have demonstrated their greatest contribution. Another definition of artificial intelligence is one predicated not on human mimicry but on human amplification, where the algorithms that are best at accomplishing this are deemed the most intelligent. This article surveys various systems that augment human and social intelligence.

  5. A fast and memory-sparing probabilistic selection algorithm for the GPU

    SciTech Connect (OSTI)

    Monroe, Laura M; Wendelberger, Joanne; Michalak, Sarah

    2010-09-29

    A fast and memory-sparing probabilistic top-N selection algorithm is implemented on the GPU. This probabilistic algorithm gives a deterministic result and always terminates. The use of randomization reduces the amount of data that needs heavy processing, and so reduces both the memory requirements and the average time required for the algorithm. This algorithm is well-suited to more general parallel processors with multiple layers of memory hierarchy. Probabilistic Las Vegas algorithms of this kind are a form of stochastic optimization and can be especially useful for processors having a limited amount of fast memory available.
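    The flavor of such a Las Vegas selection algorithm, using a random sample to estimate a cutoff so that most of the data never needs heavy processing while still returning a deterministic, exact result, can be sketched as follows (a serial illustration, not the GPU implementation):

```python
import random

def probabilistic_top_n(data, n, sample_size=256, seed=None):
    """Las Vegas top-n selection sketch: estimate a cutoff from a random
    sample so most elements are pruned cheaply; fall back to exact
    selection if the estimate prunes too aggressively. The returned
    result is always the exact top-n, regardless of the randomness."""
    rng = random.Random(seed)
    if len(data) <= sample_size:
        return sorted(data, reverse=True)[:n]
    sample = rng.sample(data, sample_size)
    # Estimated rank, within the sample, of the n-th largest element.
    k = max(1, min(sample_size, (n * sample_size) // len(data) + 1))
    cutoff = sorted(sample, reverse=True)[k - 1]
    survivors = [x for x in data if x >= cutoff]
    if len(survivors) < n:          # estimate was too tight: retry exactly
        survivors = data
    return sorted(survivors, reverse=True)[:n]
```

    If at least n elements survive the cutoff, the survivors necessarily contain the true top-n, so only the (much smaller) survivor list needs full sorting; the randomness affects running time, never the answer.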

  6. Licensing topical report: application of probabilistic risk assessment in the selection of design basis accidents. [HTGR]

    SciTech Connect (OSTI)

    Houghton, W.J.

    1980-06-01

    A probabilistic risk assessment (PRA) approach is proposed to be used to scrutinize selection of accident sequences. A technique is described in this Licensing Topical Report to identify candidates for Design Basis Accidents (DBAs) utilizing the risk assessment results. As a part of this technique, it is proposed that events with frequencies below a specified limit would not be candidates. The use of the methodology described is supplementary to the traditional, deterministic approach and may result, in some cases, in the selection of multiple failure sequences as DBAs; it may also provide a basis for not considering some traditionally postulated events as being DBAs. A process is then described for selecting a list of DBAs based on the candidates from PRA as supplementary to knowledge and judgments from past licensing practice. These DBAs would be the events considered in Chapter 15 of Safety Analysis Reports of high-temperature gas-cooled reactors (HTGRs).

  7. Comparison of CRBR design-basis events with those of foreign LMFBR plants

    SciTech Connect (OSTI)

    Agrawal, A.K.

    1983-04-01

    As part of the Construction Permit (CP) review of the Clinch River Breeder Reactor Plant (CRBR), the Brookhaven National Laboratory was asked to compare the Design Basis Accidents that are considered in CRBR Preliminary Safety Analysis Report with those of the foreign contemporary plants (PHENIX, SUPER-PHENIX, SNR-300, PFR, and MONJU). A brief introductory review of any special or unusual characteristics of these plants is given. This is followed by discussions of the design basis accidents and their acceptance criteria. In spite of some discrepancies due either to semantics or to licensing decisions, there appears to be a considerable degree of unanimity in the selection (definition) of DBAs in all of these plants.

  8. 105-K Basin material design basis feed description for spent nuclear fuel project facilities

    SciTech Connect (OSTI)

    Praga, A.N.

    1998-01-08

    Revisions 0 and 0A of this document provided estimated chemical and radionuclide inventories of spent nuclear fuel and sludge currently stored within the Hanford Site's 105-K Basins. This revision (Rev. 1) incorporates the following changes into Revision 0A: (1) updates the tables to reflect improved cross section data, a decision to use accountability data as the basis for total Pu, a corrected methodology for selection of the heat generation basis feed, and a revised decay date; (2) adds Section 3.3.3.1 to expand the description of the approach used to calculate the inventory values and explain why that approach yields conservative results; and (3) changes the pre-irradiation braze beryllium value.

  9. Technical Basis for Work Place Air Monitoring for the Plutonium Finishing Plan (PFP)

    SciTech Connect (OSTI)

    JONES, R.A.

    1999-10-06

    This document establishes the basis for the Plutonium Finishing Plant's (PFP) work place air monitoring program in accordance with the following requirements: Title 10, Code of Federal Regulations (CFR), Part 835, "Occupational Radiation Protection"; Hanford Site Radiological Control Manual (HSRCM-1); HNF-PRO-331, Work Place Air Monitoring; WHC-SD-CP-SAR-021, Plutonium Finishing Plant Final Safety Analysis Report; and applicable recognized national standards invoked by DOE Orders and Policies.

  10. Hydro-Kansas (HK) Research Project: Tests of a Physical Basis of Statistical Self-Similarity in Peak Flows in the Whitewater Basin, Kansas

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Investigators: Gupta, Vijay (University of Colorado); Furey, Peter (Colorado Research Associates); Mantila, Ricardo (University of Colorado); Krajewski, Witold (University of Iowa); Kruger, Anton (The University of Iowa); Clayton, Jordan (US Geological Survey and University of Iowa). Category: Atmospheric State and

  11. Integrated Safety Management System as the Basis for Work Planning and Control for Research and Development

    Broader source: Energy.gov [DOE]

    Slide presentation by Rich Davies, Kami Lowry, Mike Schlender, Pacific Northwest National Laboratory (PNNL), and Ted Pietrok, Pacific Northwest Site Office (PNSO): Integrated Safety Management System as the Basis for Work Planning and Control for Research and Development. Work Planning and Control (WP&C) is essential to assuring the safety of workers and the public regardless of the scope of work. Research and Development (R&D) activities are no exception.

  12. Tank waste remediation system retrieval and disposal mission authorization basis amendment task plan

    SciTech Connect (OSTI)

    Goetz, T.G.

    1998-01-08

    This task plan is a documented agreement between Nuclear Safety and Licensing and the Process Development group within the Waste Feed Delivery organization. The purpose of this task plan is to identify the scope of work, tasks and deliverables, responsibilities, manpower, and schedules associated with an authorization basis amendment as a result of the Waste Feed Delivery Program, Project W-211, and Project W-TBD.

  13. Technical Basis for U. S. Department of Energy Nuclear Safety Policy, DOE Policy 420.1

    Broader source: Energy.gov [DOE]

    This document provides the technical basis for the Department of Energy (DOE) Policy (P) 420.1, Nuclear Safety Policy, dated 2-8-2011. It includes an analysis of the revised Policy to determine whether it provides the necessary and sufficient high-level expectations that will lead DOE to establish and implement appropriate requirements to assure protection of the public, workers, and the environment from the hazards of DOE’s operation of nuclear facilities.

  14. A probabilistic risk assessment of the LLNL Plutonium facility`s evaluation basis fire operational accident

    SciTech Connect (OSTI)

    Brumburgh, G.

    1994-08-31

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous operations involving plutonium, including device fabrication, development of fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed to demonstrate rational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility.

  15. A Numerical Algorithm for the Solution of a Phase-Field Model of Polycrystalline Materials

    SciTech Connect (OSTI)

    Dorr, M R; Fattebert, J; Wickett, M E; Belak, J F; Turchi, P A

    2008-12-04

    We describe an algorithm for the numerical solution of a phase-field model (PFM) of microstructure evolution in polycrystalline materials. The PFM system of equations includes a local order parameter, a quaternion representation of local orientation, and a species composition parameter. The algorithm is based on the implicit integration of a semidiscretization of the PFM system using a backward difference formula (BDF) temporal discretization combined with a Newton-Krylov algorithm to solve the nonlinear system at each time step. The BDF algorithm is combined with a coordinate projection method to maintain quaternion unit length, which is related to an important solution invariant. A key element of the Newton-Krylov algorithm is the selection of a preconditioner to accelerate the convergence of the Generalized Minimum Residual algorithm used to solve the Jacobian linear system in each Newton step. Results are presented for the application of the algorithm to 2D and 3D examples.
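    The coordinate projection used to maintain quaternion unit length can be sketched as follows (illustrative only; in the actual PFM a quaternion field over the whole grid is projected after each implicit step):

```python
import math

def project_quaternion(q):
    """Coordinate projection onto the unit-quaternion constraint |q| = 1,
    applied after an implicit (e.g., BDF) time step to restore the
    solution invariant that the integrator alone does not preserve."""
    norm = math.sqrt(sum(c * c for c in q))
    if norm == 0.0:
        raise ValueError("cannot project the zero quaternion")
    return tuple(c / norm for c in q)

# A time step leaves the quaternion slightly off the unit sphere;
# projection pulls it back before the next step.
q = project_quaternion((0.9, 0.1, 0.2, 0.4))
```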

  16. Demonstrating Structural Adequacy of Nuclear Power Plant Containment Structures for Beyond Design-Basis Pressure Loadings

    SciTech Connect (OSTI)

    Braverman, J.I.; Morante, R.

    2010-07-18

    Demonstrating the structural integrity of U.S. nuclear power plant (NPP) containment structures, for beyond design-basis internal pressure loadings, is necessary to satisfy Nuclear Regulatory Commission (NRC) requirements and performance goals. This paper discusses methods for demonstrating the structural adequacy of the containment for beyond design-basis pressure loadings. Three distinct evaluations are addressed: (1) estimating the ultimate pressure capacity of the containment structure (10 CFR 50 and US NRC Standard Review Plan, Section 3.8); (2) demonstrating the structural adequacy of the containment subjected to pressure loadings associated with combustible gas generation (10 CFR 52 and 10 CFR 50); and (3) demonstrating the containment structural integrity for severe accidents (10 CFR 52 as well as SECY 90-016, SECY 93-087, and related NRC staff requirements memoranda (SRMs)). The paper describes the technical basis for specific aspects of the methods presented. It also presents examples of past issues identified in licensing activities related to these evaluations.

  17. Safety basis for the 241-AN-107 mixer pump installation and caustic addition

    SciTech Connect (OSTI)

    Van Vleet, R.J.

    1994-10-05

    This safety basis was prepared to determine whether the proposed activities of installing a 76 HP jet mixer pump and adding approximately 50,000 gallons of 19 M (50:50 wt%) aqueous caustic are within the safety envelope described by Tank Farms (chapter six of WHC-SD-WM-ISB-001, Rev. 0). The safety basis covers the components, structures, and systems for the caustic addition and mixer pump installation. These include installation of the mixer pump and monitoring equipment; operation of the mixer pump, process monitoring equipment, and caustic addition; and the pump stand, caustic addition skid, electrical skid, video camera system, and two densitometers. Also covered is the removal and decontamination of the mixer pump and process monitoring system. Authority for this safety basis is WHC-IP-0842 (Waste Tank Administration); Section 15.9, Rev. 2 (Unreviewed Safety Questions), of WHC-IP-0842 requires that an evaluation be performed for all physical modifications.

  18. Incorrect support and missing center tolerances of phasing algorithms

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Huang, Xiaojing; Nelson, Johanna; Steinbrener, Jan; Kirz, Janos; Turner, Joshua J.; Jacobsen, Chris

    2010-01-01

    In x-ray diffraction microscopy, iterative algorithms retrieve reciprocal space phase information, and a real space image, from an object's coherent diffraction intensities through the use of a priori information such as a finite support constraint. In many experiments, the object's shape or support is not well known, and the diffraction pattern is incompletely measured. We describe here computer simulations to look at the effects of both of these possible errors when using several common reconstruction algorithms. Overly tight object supports prevent successful convergence; however, we show that this can often be recognized through pathological behavior of the phase retrieval transfer function. Dynamic range limitations often make it difficult to record the central speckles of the diffraction pattern. We show that this leads to increasing artifacts in the image when the number of missing central speckles exceeds about 10, and that the removal of unconstrained modes from the reconstructed image is helpful only when the number of missing central speckles is less than about 50. In conclusion, this simulation study helps in judging the reconstructability of experimentally recorded coherent diffraction patterns.
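    A minimal error-reduction iteration, one common reconstruction algorithm of the kind simulated in such studies, alternates between enforcing the measured Fourier magnitudes and the real-space finite-support (plus positivity) constraint. The object, support, and parameters below are synthetic stand-ins:

```python
import numpy as np

def error_reduction(magnitudes, support, n_iter=200, seed=0):
    """Error-reduction phase retrieval sketch: alternate projections
    between the Fourier-magnitude constraint and a real-space support
    (with positivity) constraint, starting from a random guess."""
    rng = np.random.default_rng(seed)
    image = rng.random(magnitudes.shape) * support
    for _ in range(n_iter):
        F = np.fft.fft2(image)
        F = magnitudes * np.exp(1j * np.angle(F))    # keep measured magnitudes
        image = np.real(np.fft.ifft2(F))
        image = np.where(support & (image > 0), image, 0.0)  # support + positivity
    return image

# Toy object: a bright block inside a known (loose) support.
obj = np.zeros((32, 32))
obj[12:20, 12:20] = 1.0
mags = np.abs(np.fft.fft2(obj))
supp = np.zeros((32, 32), dtype=bool)
supp[10:22, 10:22] = True
rec = error_reduction(mags, supp)
```

    An overly tight support, in the language of the abstract, corresponds to `supp` cutting into the true object, which stalls these alternating projections rather than converging.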

  19. TRACC: Algorithm for Predicting and Tracking Barges on Inland Waterways

    Energy Science and Technology Software Center (OSTI)

    2010-04-23

    The algorithm developed in this work is used to predict the location and estimate the traveling speed of a barge moving in an inland waterway network. Measurements obtained from GPS or other systems are corrupted with measurement noise and reported at large, irregular time intervals, creating uncertainty about the current location of the barge and reducing the effectiveness of emergency response activities in case of an accident or act of terrorism. Developing a prediction algorithm is a non-trivial problem because estimating speed is challenging, owing to the complex interactions between the multiple systems involved in the process. This software uses a systems approach to model the motion dynamics of the barge and estimates the location and speed of the barge at the next, user-defined, time interval. First, to estimate the speed, a non-linear stochastic modeling technique was developed that takes into account local variations and interactions existing in the system. The output speed is then used as an observation in a statistically optimal filtering technique, a Kalman filter, formulated in state-space to minimize the numerous errors observed in the system. The combined system synergistically fuses the local information available with the measurements obtained to accurately predict the location and traveling speed of the barge.
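    The statistically optimal filtering step can be illustrated with a one-dimensional constant-velocity Kalman filter fusing noisy, irregularly timed position reports; all names, values, and noise parameters below are hypothetical, not the TRACC implementation:

```python
import numpy as np

def kalman_track(times, positions, meas_var=25.0, accel_var=1e-3):
    """Constant-velocity Kalman filter over irregular time intervals.
    State is [position, speed]; only noisy positions are observed."""
    x = np.array([positions[0], 0.0])          # initial state guess
    P = np.diag([meas_var, meas_var])          # uncertain initial state
    H = np.array([[1.0, 0.0]])                 # observation: position only
    estimates = [x.copy()]
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
        Q = accel_var * np.array([[dt**3 / 3, dt**2 / 2],
                                  [dt**2 / 2, dt]])
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        y = positions[i] - H @ x               # innovation
        S = H @ P @ H.T + meas_var
        K = P @ H.T / S                        # Kalman gain
        x = x + (K * y).ravel()                # update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Barge moving at 5 units/hour, reported at irregular times with noise.
t = np.array([0.0, 1.0, 2.5, 4.0, 6.0, 9.0])
z = 5.0 * t + np.array([0.0, 3.0, -4.0, 2.0, -1.0, 2.0])
est = kalman_track(t, z)
```

    Because the transition matrix is rebuilt from each actual `dt`, the same filter handles the large, irregular reporting intervals the abstract describes.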

  20. A cooperative control algorithm for camera based observational systems.

    SciTech Connect (OSTI)

    Young, Joseph G.

    2012-01-01

    Over the last several years, there has been considerable growth in camera based observation systems for a variety of safety, scientific, and recreational applications. In order to improve the effectiveness of these systems, we frequently desire the ability to increase the number of observed objects, but solving this problem is not as simple as adding more cameras. Quite often, there are economic or physical restrictions that prevent us from adding additional cameras to the system. As a result, we require methods that coordinate the tracking of objects between multiple cameras in an optimal way. In order to accomplish this goal, we present a new cooperative control algorithm for a camera based observational system. Specifically, we present a receding horizon control where we model the underlying optimal control problem as a mixed integer linear program. The benefit of this design is that we can coordinate the actions between each camera while simultaneously respecting its kinematics. In addition, we further improve the quality of our solution by coupling our algorithm with a Kalman filter. Through this integration, we not only add a predictive component to our control, but we use the uncertainty estimates provided by the filter to encourage the system to periodically observe any outliers in the observed area. This combined approach allows us to intelligently observe the entire region of interest in an effective and thorough manner.
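    A heavily simplified stand-in for the camera-tasking optimization: each camera picks one target so that the total priority (e.g., Kalman uncertainty) of distinct observed targets is maximized. The actual system solves a mixed integer linear program over a receding horizon with camera kinematics; the brute-force enumeration below is purely illustrative, with hypothetical names:

```python
from itertools import product

def assign_cameras(visible, priority):
    """Pick one target per camera to maximize total priority of the
    distinct targets covered (each target counts once, however many
    cameras point at it)."""
    cameras = sorted(visible)
    best_score, best_choice = -1.0, None
    for choice in product(*(sorted(visible[c]) for c in cameras)):
        covered = set(choice)                  # duplicates count once
        score = sum(priority[t] for t in covered)
        if score > best_score:
            best_score, best_choice = score, dict(zip(cameras, choice))
    return best_choice, best_score

# Two cameras with overlapping fields of view; target "b" is high priority.
visible = {0: {"a", "b"}, 1: {"b", "c"}}
priority = {"a": 1.0, "b": 5.0, "c": 2.0}
choice, score = assign_cameras(visible, priority)
```

    Counting each target once is what pushes the cameras to spread out rather than all chase the single highest-priority target, the same coordination effect the MILP formulation enforces at scale.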

  1. Central Facilities Area Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Lisa Harvego; Brion Bennett

    2011-11-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Central Facilities Area facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  2. Materials and Security Consolidation Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Not Listed

    2011-09-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Security Consolidation Center facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  3. Research and Education Campus Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    L. Harvego; Brion Bennett

    2011-11-01

    U.S. Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory Research and Education Campus facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool to develop the radioactive waste management basis.

  4. Materials and Fuels Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    SciTech Connect (OSTI)

    Lisa Harvego; Brion Bennett

    2011-09-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Fuels Complex facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  5. Theoretical and Computational Physics | U.S. DOE Office of Science (SC)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Theoretical and Computational Physics High Energy Physics (HEP) HEP Home About Research Science Drivers of Particle Physics Energy Frontier Intensity Frontier Cosmic Frontier Theoretical and Computational Physics Advanced Technology R&D Accelerator Stewardship Facilities Science Highlights Benefits of HEP Funding Opportunities Advisory Committees Community Resources Contact Information High Energy Physics U.S. Department of Energy SC-25/Germantown Building 1000 Independence Ave., SW

  6. ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Operations | Department of Energy Theoretical/Best Practice Energy Use in Metalcasting Operations ITP Metal Casting: Theoretical/Best Practice Energy Use in Metalcasting Operations PDF icon doebestpractice_052804.pdf More Documents & Publications ITP Metal Casting: Energy Use in Selected Metalcasting Facilities - 2003 ITP Metal Casting: Energy and Environmental Profile of the U.S. Metal casting Industry ITP Metal Casting: Advanced Melting Technologies: Energy Saving Concepts and

  7. Numerical Study of Velocity Shear Stabilization of 3D and Theoretical

    Office of Scientific and Technical Information (OSTI)

    Considerations for Centrifugally Confined Plasmas and Other Interchange-Limited Fusion Concepts (Technical Report) | SciTech Connect Numerical Study of Velocity Shear Stabilization of 3D and Theoretical Considerations for Centrifugally Confined Plasmas and Other Interchange-Limited Fusion Concepts Citation Details In-Document Search Title: Numerical Study of Velocity Shear Stabilization of 3D and Theoretical Considerations for Centrifugally Confined Plasmas and Other Interchange-Limited

  8. THEORETICAL TRANSIT SPECTRA FOR GJ 1214b AND OTHER 'SUPER-EARTHS' (Journal

    Office of Scientific and Technical Information (OSTI)

    Article) | SciTech Connect THEORETICAL TRANSIT SPECTRA FOR GJ 1214b AND OTHER 'SUPER-EARTHS' Citation Details In-Document Search Title: THEORETICAL TRANSIT SPECTRA FOR GJ 1214b AND OTHER 'SUPER-EARTHS' We present new calculations of transit spectra of super-Earths that allow for atmospheres with arbitrary proportions of common molecular species and haze. We test this method with generic spectra, reproducing the expected systematics and absorption features, then apply it to the nearby

  9. Theoretical investigations of defects in a Si-based digital ferromagnetic

    Office of Scientific and Technical Information (OSTI)

    heterostructure - a spintronic material (Journal Article) | SciTech Connect Journal Article: Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material Citation Details In-Document Search Title: Theoretical investigations of defects in a Si-based digital ferromagnetic heterostructure - a spintronic material Authors: Fong, C Y ; Shauhgnessy, M ; Snow, R ; Yang, L H Publication Date: 2010-09-17 OSTI Identifier: 1124958 Report Number(s):

  10. Two-electron reduction of ethylene carbonate: theoretical review of SEI

    Office of Scientific and Technical Information (OSTI)

    formation mechanisms. (Conference) | SciTech Connect Conference: Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms. Citation Details In-Document Search Title: Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms. Authors: Leung, Kevin Publication Date: 2012-08-01 OSTI Identifier: 1061142 Report Number(s): SAND2012-6720C DOE Contract Number: AC04-94AL85000 Resource Type: Conference Resource Relation:

  11. Two-electron reduction of ethylene carbonate: theoretical review of SEI

    Office of Scientific and Technical Information (OSTI)

    formation mechanisms. (Conference) | SciTech Connect Conference: Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms. Citation Details In-Document Search Title: Two-electron reduction of ethylene carbonate: theoretical review of SEI formation mechanisms. Abstract not provided. Authors: Leung, Kevin Publication Date: 2013-04-01 OSTI Identifier: 1078871 Report Number(s): SAND2013-3422C 452174 DOE Contract Number: AC04-94AL85000 Resource Type: Conference

  12. Evaluating cloud retrieval algorithms with the ARM BBHRP framework

    SciTech Connect (OSTI)

    Mlawer, E.; Dunn, M.; Shippert, T.; Troyan, D.; Johnson, K. L.; Miller, M. A.; Delamere, J.; Turner, D. D.; Jensen, M. P.; Flynn, C.; Shupe, M.; Comstock, J.; Long, C. N.; Clough, S. T.; Sivaraman, C.; Khaiyer, M.; Xie, S.; Rutan, D.; Minnis, P.

    2008-03-10

    Climate and weather prediction models require accurate calculations of vertical profiles of radiative heating. Although heating rate calculations cannot be directly validated due to the lack of corresponding observations, surface and top-of-atmosphere measurements can indirectly establish the quality of computed heating rates through validation of the calculated irradiances at the atmospheric boundaries. The ARM Broadband Heating Rate Profile (BBHRP) project, a collaboration of all the working groups in the program, was designed with these heating rate validations as a key objective. Given the large dependence of radiative heating rates on cloud properties, a critical component of BBHRP radiative closure analyses has been the evaluation of cloud microphysical retrieval algorithms. This evaluation is an important step in establishing the necessary confidence in the continuous profiles of computed radiative heating rates produced by BBHRP at the ARM Climate Research Facility (ACRF) sites that are needed for modeling studies. This poster details the continued effort to evaluate cloud property retrieval algorithms within the BBHRP framework, a key focus of the project this year. A requirement for the computation of accurate heating rate profiles is a robust cloud microphysical product that captures the occurrence, height, and phase of clouds above each ACRF site. Various approaches to retrieve the microphysical properties of liquid, ice, and mixed-phase clouds have been processed in BBHRP for the ACRF Southern Great Plains (SGP) and the North Slope of Alaska (NSA) sites. These retrieval methods span a range of assumptions concerning the parameterization of cloud location, particle density, size, shape, and involve different measurement sources. We will present the radiative closure results from several different retrieval approaches for the SGP site, including those from Microbase, the current 'reference' retrieval approach in BBHRP. 
At the NSA, mixed-phase clouds and clouds with low optical depth are prevalent; the radiative closure studies using Microbase demonstrated significant residuals. As an alternative to Microbase at the NSA, the Shupe-Turner cloud property retrieval algorithm, which aims to improve the partitioning of cloud phase and to incorporate more constrained, conditional microphysics retrievals, has also been evaluated using the BBHRP data set.

  13. Resistive Network Optimal Power Flow: Uniqueness and Algorithms

    SciTech Connect (OSTI)

    Tan, CW; Cai, DWH; Lou, X

    2015-01-01

    The optimal power flow (OPF) problem minimizes the power loss in an electrical network by optimizing the voltage and power delivered at the network buses, and is a nonconvex problem that is generally hard to solve. By leveraging a recent development on the zero duality gap of OPF, we propose a second-order cone programming convex relaxation of the resistive network OPF, and study the uniqueness of the optimal solution using differential topology, especially the Poincare-Hopf Index Theorem. We characterize the global uniqueness for different network topologies, e.g., line, radial, and mesh networks. This serves as a starting point to design distributed local algorithms with global behaviors that have low complexity, are computationally fast, and can run under synchronous and asynchronous settings in practical power grids.

  14. Invariant patterns in crystal lattices: Implications for protein folding algorithms

    SciTech Connect (OSTI)

    HART,WILLIAM E.; ISTRAIL,SORIN

    2000-06-01

    Crystal lattices are infinite periodic graphs that occur naturally in a variety of geometries and which are of fundamental importance in polymer science. Discrete models of protein folding use crystal lattices to define the space of protein conformations. Because various crystal lattices provide discretizations of the same physical phenomenon, it is reasonable to expect that there will exist invariants across lattices related to fundamental properties of the protein folding process. This paper considers whether performance-guaranteed approximability is such an invariant for HP lattice models. The authors define a master approximation algorithm that has provable performance guarantees provided that a specific sublattice exists within a given lattice. They describe a broad class of crystal lattices that are approximable, which further suggests that approximability is a general property of HP lattice models.
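
    As a concrete illustration of the HP lattice models discussed above, the sketch below scores a conformation on the 2D square lattice by counting hydrophobic (H-H) contacts between residues that are lattice neighbors but not adjacent along the chain; the function name and the coordinate-list representation are illustrative choices, not taken from the paper.

```python
def hp_energy(sequence, coords):
    """Energy of an HP-model conformation on the 2D square lattice:
    -1 for every pair of H residues that occupy neighboring lattice
    sites but are not consecutive along the chain."""
    occupied = {c: i for i, c in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            j = occupied.get((x + dx, y + dy))
            # j > i + 1 counts each pair once and skips chain neighbors
            if j is not None and j > i + 1 and sequence[j] == 'H':
                energy -= 1
    return energy
```

    For example, folding the chain HPPH into a unit square brings its two H residues into contact, giving energy -1, while the fully extended chain scores 0.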

  15. Tightly Coupled Multiphysics Algorithm for Pebble Bed Reactors

    SciTech Connect (OSTI)

    HyeongKae Park; Dana Knoll; Derek Gaston; Richard Martineau

    2010-10-01

    We have developed a tightly coupled multiphysics simulation tool for the pebble-bed reactor (PBR) concept, a type of Very High-Temperature gas-cooled Reactor (VHTR). The simulation tool, PRONGHORN, takes advantages of the Multiphysics Object-Oriented Simulation Environment library, and is capable of solving multidimensional thermal-fluid and neutronics problems implicitly with a Newton-based approach. Expensive Jacobian matrix formation is alleviated via the Jacobian-free Newton-Krylov method, and physics-based preconditioning is applied to minimize Krylov iterations. Motivation for the work is provided via analysis and numerical experiments on simpler multiphysics reactor models. We then provide detail of the physical models and numerical methods in PRONGHORN. Finally, PRONGHORN's algorithmic capability is demonstrated on a number of PBR test cases.

  16. Just in Time DSA-The Hanford Nuclear Safety Basis Strategy

    SciTech Connect (OSTI)

    Olinger, S. J.; Buhl, A. R.

    2002-02-26

    The U.S. Department of Energy, Richland Operations Office (RL) is responsible for 30 hazard category 2 and 3 nuclear facilities that are operated by its prime contractors, Fluor Hanford Incorporated (FHI), Bechtel Hanford, Incorporated (BHI) and Pacific Northwest National Laboratory (PNNL). The publication of Title 10, Code of Federal Regulations, Part 830, Subpart B, Safety Basis Requirements (the Rule) in January 2001 imposed the requirement that the Documented Safety Analyses (DSA) for these facilities be reviewed against the requirements of the Rule. Those DSA that do not meet the requirements must either be upgraded to satisfy the Rule, or an exemption must be obtained. RL and its prime contractors have developed a Nuclear Safety Strategy that provides a comprehensive approach for supporting RL's efforts to meet its long term objectives for hazard category 2 and 3 facilities while also meeting the requirements of the Rule. This approach will result in a reduction of the total number of safety basis documents that must be developed and maintained to support the remaining mission and closure of the Hanford Site and ensure that the documentation that must be developed will support: compliance with the Rule; a ''Just-In-Time'' approach to development of Rule-compliant safety bases supported by temporary exemptions; and consolidation of safety basis documents that support multiple facilities with a common mission (e.g. decontamination, decommissioning and demolition [DD&D], waste management, surveillance and maintenance). This strategy provides a clear path to transition the safety bases for the various Hanford facilities from support of operation and stabilization missions through DD&D to accelerate closure. This ''Just-In-Time'' Strategy can also be tailored for other DOE Sites, creating the potential for large cost savings and schedule reductions throughout the DOE complex.

  17. Optimized Uncertainty Quantification Algorithm Within a Dynamic Event Tree Framework

    SciTech Connect (OSTI)

    J. W. Nielsen; Akira Tokuhiro; Robert Hiromoto

    2014-06-01

    Methods for developing Phenomenological Identification and Ranking Tables (PIRT) for nuclear power plants have been a useful tool for providing insight into the modeling aspects that are important to safety. These methods combine expert knowledge of reactor plant transients and thermal-hydraulic codes to identify the areas of highest importance. Quantified PIRT (QPIRT) provides a rigorous method for quantifying the phenomena that can have the greatest impact. The transients that are evaluated, and the timing of those events, are typically developed in collaboration with the Probabilistic Risk Analysis (PRA). Though quite effective in evaluating risk, traditional PRA methods lack the capability to evaluate complex dynamic systems where end states may vary as a function of the transition time from physical state to physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of such systems. A limitation of DPRA is its potential for state, or combinatorial, explosion, which grows with the number of components and with the sampling of transition times from state to state of the entire system. This paper presents a method for performing QPIRT within a dynamic event tree framework, such that the timing events that result in the highest probabilities of failure are captured and a QPIRT is performed simultaneously with the discrete dynamic event tree evaluation, yielding a formal QPIRT for each end state. Because the use of dynamic event trees leads to state explosion as the number of possible component states increases, this paper applies a branch-and-bound algorithm to optimize the solution of the dynamic event trees and summarizes the methods used to implement it.
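
    The pruning idea behind a branch-and-bound search of a discrete event tree can be sketched as follows. Because transition probabilities along a path only multiply, the probability accumulated so far is an upper bound on every end state beneath it, so a subtree that cannot beat the incumbent is never expanded. The tree encoding and function below are hypothetical illustrations, not the paper's implementation.

```python
def best_end_state(tree, node="root", p=1.0, best=(0.0, None)):
    """Branch-and-bound search for the most probable end state of a
    discrete event tree. `tree` maps a node to its list of
    (child, transition probability) branches; leaves are absent keys.
    The accumulated path probability `p` bounds all leaves below, so
    any subtree with p <= incumbent probability is pruned."""
    if p <= best[0]:            # bound: no leaf below can do better
        return best
    branches = tree.get(node)
    if not branches:            # leaf = end state
        return (p, node)
    for child, q in branches:
        best = best_end_state(tree, child, p * q, best)
    return best
```

    On a toy tree with branches root-(0.6)-a, root-(0.4)-b, a-(0.5)-a1/a2, b-(0.9)-b1 and b-(0.1)-b2, the search returns b1 with probability 0.36 and prunes the remaining branches once they cannot exceed the incumbent.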

  18. Technical basis for classification of low-activity waste fraction from Hanford site tanks

    SciTech Connect (OSTI)

    Petersen, C.A., Westinghouse Hanford

    1996-07-17

    The overall objective of this report is to provide a technical basis to support a U.S. Nuclear Regulatory Commission determination to classify the low-activity waste from the Hanford Site single-shell and double-shell tanks as 'incidental' wastes after removal of additional radionuclides and immobilization. The proposed processing method, in addition to the previous radionuclide removal efforts, will remove the largest practical amount of total site radioactivity, attributable to high-level wastes, for disposal in a deep geologic repository. The remainder of the waste would be considered 'incidental' waste and could be disposed of onsite.

  19. Technical basis for classification of low-activity waste fraction from Hanford site tanks

    SciTech Connect (OSTI)

    Petersen, C.A.

    1996-09-20

    The overall objective of this report is to provide a technical basis to support a U.S. Nuclear Regulatory Commission determination to classify the low-activity waste from the Hanford Site single-shell and double-shell tanks as 'incidental' wastes after removal of additional radionuclides and immobilization. The proposed processing method, in addition to the previous radionuclide removal efforts, will remove the largest practical amount of total site radioactivity, attributable to high-level waste, for disposal in a deep geologic repository. The remainder of the waste would be considered 'incidental' waste and could be disposed of onsite.

  20. Technical basis for cases N-629 and N-631 as an alternative for RTNDT reference temperature

    SciTech Connect (OSTI)

    Merkle, John Graham; Server, W. L.

    2007-01-01

    ASME Code Cases N-629/N-631, published in 1999, provided an important new approach that allows material-specific, measured fracture toughness curves for ferritic steels to be used in Code applications. This has enabled some nuclear power plants whose reactor pressure vessel materials reached a certain threshold level based on overly conservative rules to use an alternative RTNDT to justify continued operation of their plants. These code cases have been approved by the U.S. Nuclear Regulatory Commission and have been proposed for codification in Appendix A and Appendix G of the ASME Boiler and Pressure Vessel Code. This paper summarizes the basis of this approach for the record.

  1. Structural Basis of Pre-existing Immunity to the 2009 H1N1 Pandemic

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Influenza Virus Structural Basis of Pre-existing Immunity to the 2009 H1N1 Pandemic Influenza Virus The emergence of the 2009 H1N1 influenza pandemic, also known as the "swine flu", marks the first human flu pandemic in 40 years and has caused significant human infection and mortality globally (1). The emergence of the 2009 H1N1 flu marks the first time that an influenza pandemic was triggered by a virus carrying the same hemagglutinin (HA) subtype as circulating seasonal strains.

  2. ARM: ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    SciTech Connect (OSTI)

    Karen Johnson; Michael Jensen

    1996-11-08

    ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  3. ARM: ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    SciTech Connect (OSTI)

    Karen Johnson; Michael Jensen

    1996-11-08

    ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  4. ARM: ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Karen Johnson; Michael Jensen

    ARSCL: multiple outputs from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  5. ARM: ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Karen Johnson; Michael Jensen

    ARSCL: cloud boundaries from first Clothiaux algorithms on Vaisala or Belfort ceilometers, Micropulse lidar, and MMCR

  6. ARM: 10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2010-12-15

    10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  7. ARM: 10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    2010-12-15

    10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  8. ARM: 10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    10-minute TEMPORARY Raman Lidar: aerosol extinction profiles and aerosol optical thickness, from first Ferrare algorithm

  9. ARM: 10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sivaraman, Chitra; Flynn, Connor

    10-minute TEMPORARY Raman Lidar: aerosol scattering ratio and backscattering coefficient profiles, from first Ferrare algorithm

  10. Stride search: A general algorithm for storm detection in high resolution climate data

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; Mundt, Miranda

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.

  11. Update on Development of Mesh Generation Algorithms in MeshKit

    SciTech Connect (OSTI)

    Jain, Rajeev; Vanderzee, Evan; Mahadevan, Vijay

    2015-09-30

    MeshKit uses a graph-based design for coding all its meshing algorithms, which includes the Reactor Geometry (and mesh) Generation (RGG) algorithms. This report highlights the developmental updates of all the algorithms, results and future work. Parallel versions of algorithms, documentation and performance results are reported. RGG GUI design was updated to incorporate new features requested by the users; boundary layer generation and parallel RGG support were added to the GUI. Key contributions to the release, upgrade and maintenance of other SIGMA libraries (CGM and MOAB) were made. Several fundamental meshing algorithms for creating a robust parallel meshing pipeline in MeshKit are under development. Results and current status of automated, open-source and high quality nuclear reactor assembly mesh generation algorithms such as trimesher, quadmesher, interval matching and multi-sweeper are reported.

  12. Stride search: A general algorithm for storm detection in high resolution climate data

    SciTech Connect (OSTI)

    Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; Mundt, Miranda

    2015-09-08

    This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
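
    A minimal sketch of the spatial discretization behind Stride Search, under the assumption that search sectors are circles of fixed great-circle radius: the latitudinal stride is constant, while the longitudinal stride widens as 1/cos(latitude), so sectors do not pile up near the poles the way fixed lat-lon grid points do. The function name and the 500 km radius in the usage note are illustrative, not taken from the paper.

```python
import math

def stride_centers(radius_km, earth_radius_km=6371.0):
    """Place circular search-sector centers a fixed great-circle
    distance apart. The latitudinal stride is constant; the
    longitudinal stride scales as 1/cos(latitude), keeping sector
    density roughly uniform from equator to pole."""
    dlat = math.degrees(radius_km / earth_radius_km)  # latitudinal stride
    centers = []
    lat = -90.0 + dlat / 2.0
    while lat < 90.0:
        cos_lat = math.cos(math.radians(lat))
        # near the poles a single sector spans the whole latitude band
        dlon = 360.0 if cos_lat < 1e-6 else min(360.0, dlat / cos_lat)
        lon = -180.0
        while lon < 180.0:
            centers.append((lat, lon))
            lon += dlon
        lat += dlat
    return centers
```

    With a 500 km sector radius, an equatorial latitude band holds on the order of 80 centers while the band nearest the pole holds only a handful, which is the uniformity that lets the search run at extreme latitudes.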

  13. Digital revenue metering algorithm: development, analysis, implementation, testing, and evaluation. Final report

    SciTech Connect (OSTI)

    Schweitzer III, E.O.; To, H.W.; Ando, M.

    1980-11-01

    A digital revenue metering algorithm is described. The algorithm has been tested in a microcomputer system using two 8-bit MC6800 microprocessors and 12-bit analog-to-digital converters. The tests show that the system meets the accuracy requirements of ANSI C12-1975. The algorithm demands modest computing requirements and low data sampling rates. The algorithm uses Walsh-functions and will operate with as few as 4 samples per 60-Hz cycle. For proper response to odd harmonic frequencies, higher sampling rates must be used. Third harmonic power can be handled with an 8-sample per cycle Walsh function. However, even harmonics are effectively suppressed by the algorithm. The developed algorithm is intended for use in digital data acquisition systems for substations where interchange metering is required.
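
    The Walsh-function idea can be sketched for the 4-sample-per-cycle case. The sal and cal sequences below are +/-1 square waves, so the correlations reduce to additions and subtractions; the scale factor makes the result equal the average power for fundamental-frequency voltage and current, and, as the abstract notes, even harmonics cancel in the correlations. The function is an illustrative reconstruction, not the report's algorithm.

```python
def walsh_power(v, i):
    """Estimate average power from 4 synchronized samples per cycle of
    voltage v and current i using Walsh correlations. sal/cal are the
    square-wave analogues of sine and cosine; for pure fundamentals the
    scaled product of correlations equals V*I/2 * cos(phase angle)."""
    sal = (1, 1, -1, -1)   # square-wave analogue of sine
    cal = (1, -1, -1, 1)   # square-wave analogue of cosine
    vs = sum(s * x for s, x in zip(sal, v))
    vc = sum(c * x for c, x in zip(cal, v))
    is_ = sum(s * x for s, x in zip(sal, i))
    ic = sum(c * x for c, x in zip(cal, i))
    return (vs * is_ + vc * ic) / 16.0
```

    For a 10 V, 2 A fundamental at a 60-degree phase angle the estimate recovers 5 W, and adding a second-harmonic component to the voltage leaves the result unchanged, since even harmonics correlate to zero with both sal and cal.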

  14. Architecture-Aware Algorithms for Scalable Performance and Resilience on Heterogeneous Architectures

    SciTech Connect (OSTI)

    Dongarra, Jack

    2013-10-15

    The goal of the Extreme-scale Algorithms & Software Institute (EASI) is to close the "application-architecture performance gap" by exploring algorithms and runtime improvements that will enable key science applications to better exploit the architectural features of DOE extreme-scale systems. For the past year of the project, our efforts at the University of Tennessee have concentrated on, and made significant progress related to, the following high-level EASI goals: develop multi-precision and architecture-aware implementations of Krylov, Poisson, and Helmholtz solvers and dense factorizations for heterogeneous multi-core systems; explore new methods of algorithm resilience, and develop new algorithms with these capabilities; develop runtime support for adaptable algorithms that deal with resilience and scalability; distribute the new algorithms and runtime support through widely used software packages; and establish a strong outreach program to disseminate results, interact with colleagues, and train students and junior members of our community.

  15. GX-Means: A model-based divide and merge algorithm for geospatial image clustering

    SciTech Connect (OSTI)

    Vatsavai, Raju; Symons, Christopher T; Chandola, Varun; Jun, Goo

    2011-01-01

    One of the practical issues in clustering is the specification of the appropriate number of clusters, which is not obvious when analyzing geospatial datasets, partly because they are huge (both in size and spatial extent) and high dimensional. In this paper we present a computationally efficient model-based split and merge clustering algorithm that incrementally finds model parameters and the number of clusters. Additionally, we attempt to provide insights into this problem and other data mining challenges that are encountered when clustering geospatial data. The basic algorithm we present is similar to the G-means and X-means algorithms; however, our proposed approach avoids certain limitations of these well-known clustering algorithms that are pertinent when dealing with geospatial data. We compare the performance of our approach with the G-means and X-means algorithms. Experimental evaluation on simulated data and on multispectral and hyperspectral remotely sensed image data demonstrates the effectiveness of our algorithm.
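
    The divide step shared by G-means, X-means, and the proposed approach can be illustrated in one dimension: split a cluster with 2-means, then keep the split only if the Bayesian Information Criterion (BIC = k ln n - 2 ln L) improves. The sketch below uses hard assignments and is an illustrative simplification, not the paper's model-based algorithm.

```python
import math

def _gauss_ll(xs):
    """Log-likelihood of xs under its own maximum-likelihood 1-D Gaussian."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    var = max(var, 1e-12)                 # guard degenerate clusters
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def should_split(xs):
    """Divide test: split with 2-means (seeded at the extremes) and
    accept only if the two-component model's BIC beats the single
    Gaussian's. One Gaussian has 2 free parameters; the hard-assigned
    two-component model has 5 (two means, two variances, one weight)."""
    n = len(xs)
    c1, c2 = min(xs), max(xs)
    for _ in range(20):
        a = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        b = [x for x in xs if abs(x - c1) > abs(x - c2)]
        if not a or not b:
            return False
        c1, c2 = sum(a) / len(a), sum(b) / len(b)
    bic_one = 2 * math.log(n) - 2 * _gauss_ll(xs)
    ll_two = (_gauss_ll(a) + len(a) * math.log(len(a) / n)
              + _gauss_ll(b) + len(b) * math.log(len(b) / n))
    bic_two = 5 * math.log(n) - 2 * ll_two
    return bic_two < bic_one
```

    A clearly bimodal sample (two tight groups around 0 and 10) passes the test, while an evenly spread unimodal sample does not, which is how the divide-and-merge loop decides the number of clusters incrementally.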

  16. Theoretical Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Their primary areas of interest are in physics beyond the Standard Model, cosmology, dark matter, lattice quantum chromodynamics, neutrinos, the fundamentals of quantum field ...

  17. Efficient algorithms for mixed aleatory-epistemic uncertainty quantification with application to radiation-hardened electronics. Part I, algorithms and benchmark results.

    SciTech Connect (OSTI)

    Swiler, Laura Painton; Eldred, Michael Scott

    2009-09-01

    This report documents the results of an FY09 ASC V&V Methods level 2 milestone demonstrating new algorithmic capabilities for mixed aleatory-epistemic uncertainty quantification. Through the combination of stochastic expansions for computing aleatory statistics and interval optimization for computing epistemic bounds, mixed uncertainty analysis studies are shown to be more accurate and efficient than previously achievable. Part I of the report describes the algorithms and presents benchmark performance results. Part II applies these new algorithms to UQ analysis of radiation effects in electronic devices and circuits for the QASPR program.

  18. Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems

    DOE Patents [OSTI]

    Van Benthem, Mark H.; Keenan, Michael R.

    2008-11-11

    A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.

  19. Current plans to characterize the design basis ground motion at the Yucca Mountain, Nevada Site

    SciTech Connect (OSTI)

    Simecka, W.B.; Grant, T.A.; Voegele, M.D.; Cline, K.M.

    1992-12-31

    A site at Yucca Mountain, Nevada is currently being studied to assess its suitability as a potential host site for the nation's first commercial high-level waste repository. The DOE has proposed a new methodology for determining design-basis ground motions that uses both deterministic and probabilistic methods. The role of the deterministic approach is primary: it provides the level of detail needed by design engineers in the characterization of ground motions. The probabilistic approach provides a logical, structured procedure for integrating the range of possible earthquakes that contribute to the ground motion hazard at the site. In addition, probabilistic methods will be used as needed to provide input for the assessment of long-term repository performance. This paper discusses the local tectonic environment, potential seismic sources, and their associated displacements and ground motions. It also discusses the approach to assessing the design basis earthquake for the surface and underground facilities, as well as selected examples of the use of this type of information in design activities.

  20. Engineering Basis Document Review Supporting the Double Shell Tank (DST) System Specification Development

    SciTech Connect (OSTI)

    LEONARD, M.W.

    2000-03-14

    The Double-Shell Tank (DST) System is required to transition from its current storage mission to a storage and retrieval mission supporting the River Protection Project Phase 1 privatization, defined in HNF-SD-WM-MAR-008, Tank Waste Remediation System Mission Analysis Report. Requirements for the DST subsystems are being developed using the top-down systems engineering process outlined in HNF-SD-WM-SEMP-002, Tank Waste Remediation System Systems Engineering Management Plan. This top-down process considers existing designs to the extent that these designs impose unavoidable constraints on the Phase 1 mission. Existing engineering-basis documents were screened, and the unavoidable constraints were identified. The constraints identified herein will be added to the DST System specification (HNF-SD-WM-TRD-007, System Specification for the Double-Shell Tank System). While the letter revisions of the DST System specification were constructed with a less rigorous review of the existing engineering-basis documents, the Revision 0 release of the specification must incorporate the results of the review documented herein. The purpose of this document is to describe the screening process and criteria used to determine which constraints are unavoidable and to document the screening results.