Theoretical Basis for the Design of a DWPF Evacuated Canister
Routt, K.R.
2001-09-17
This report provides the theoretical bases for use of an evacuated canister for draining a glass melter. Design recommendations are also presented to ensure satisfactory performance in future tests of the concept.
A Decision Theoretic Approach to Evaluate Radiation Detection Algorithms
Nobles, Mallory A.; Sego, Landon H.; Cooley, Scott K.; Gosink, Luke J.; Anderson, Richard M.; Hays, Spencer E.; Tardiff, Mark F.
2013-07-01
There are a variety of sensor systems deployed at U.S. border crossings and ports of entry that scan for illicit nuclear material. In this work, we develop a framework for comparing the performance of detection algorithms that interpret the output of these scans and determine when secondary screening is needed. We optimize each algorithm to minimize its risk, or expected loss. We measure an algorithm's risk by considering its performance over a sample, the probability distribution of threat sources, and the consequences of detection errors. While it is common to optimize algorithms by fixing one error rate and minimizing another, our framework allows one to simultaneously consider multiple types of detection errors. Our framework is flexible and easily adapted to many different assumptions regarding the probability of a vehicle containing illicit material and the relative consequences of false positive and false negative errors. Our methods can therefore inform decision makers of the algorithm family and parameter values that best reduce the threat from illicit nuclear material, given their understanding of the environment at any point in time. To illustrate the applicability of our methods, we compare the risk from two families of detection algorithms and discuss the policy implications of our results.
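The expected-loss optimization described above can be sketched numerically: choose the decision threshold that minimizes risk over a sample. All names, score distributions, costs, and the threat prior below are hypothetical stand-ins, not values from the paper.

```python
import numpy as np

def risk(threshold, scores_benign, scores_threat, p_threat,
         cost_fp=1.0, cost_fn=100.0):
    """Expected loss of an 'alarm when score > threshold' rule.
    Costs and prior probability are invented for illustration."""
    fp_rate = np.mean(scores_benign > threshold)   # false-positive rate
    fn_rate = np.mean(scores_threat <= threshold)  # false-negative rate
    return (1 - p_threat) * cost_fp * fp_rate + p_threat * cost_fn * fn_rate

rng = np.random.default_rng(0)
benign = rng.normal(0.0, 1.0, 10_000)  # stand-in score distributions
threat = rng.normal(3.0, 1.0, 10_000)

# Minimize risk over a grid of candidate thresholds.
grid = np.linspace(-2.0, 5.0, 200)
best = min(grid, key=lambda t: risk(t, benign, threat, p_threat=1e-3))
```

Because both error rates enter the loss simultaneously, changing the assumed prior or the cost ratio moves the optimal threshold without re-deriving anything.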
Li, Z; Leng, S; Yu, L; McCollough, C
2014-06-15
Purpose: Published methods for image-based material decomposition with multi-energy CT images have required the assumption of volume conservation or accurate knowledge of the x-ray spectra and detector response. The purpose of this work was to develop an image-based material-decomposition algorithm that can overcome these limitations. Methods: An image-based material decomposition algorithm was developed that requires only mass conservation (rather than volume conservation). With this method, using multi-energy CT measurements made with n=4 energy bins, the mass density of each basis material and of the mixture can be determined without knowledge of the tube spectra and detector response. A digital phantom containing 12 samples of mixtures of water, calcium, iron, and iodine was used in the simulation (Siemens DRASIM). The calibration was performed by using pure materials at each energy bin. The accuracy of the technique was evaluated in noise-free and noisy data under the assumption of an ideal photon-counting detector. Results: Basis material densities can be estimated accurately by either theoretic calculation or calibration with known pure materials. The calibration approach requires no prior information about the spectra and detector response. Regression analysis of theoretical values versus estimated values results in excellent agreement for both noise-free and noisy data. For the calibration approach, the R-square values are 0.9960 ± 0.0025 and 0.9476 ± 0.0363 for noise-free and noisy data, respectively. Conclusion: From multi-energy CT images with n=4 energy bins, the developed image-based material decomposition method accurately estimated 4 basis material densities (3 without a k-edge and 1 with a k-edge in the range of the simulated energy bins) even without any prior information about spectra and detector response. This method is applicable to mixtures of solutions and dissolvable materials, where volume conservation assumptions do not apply.
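A minimal sketch of decomposition by calibration, assuming an idealized linear model: pure-material scans give a calibration matrix mapping the four basis mass densities to the four energy-bin signals, so decomposition reduces to solving a 4×4 linear system. The matrix entries and densities below are invented for illustration.

```python
import numpy as np

# Hypothetical calibration matrix M[b, m]: signal in energy bin b per
# unit mass density of pure basis material m (water, Ca, Fe, I),
# measured once from pure-material scans; no spectrum model needed.
M = np.array([[0.20, 0.50, 0.90, 1.40],
              [0.18, 0.35, 0.60, 1.10],
              [0.17, 0.28, 0.45, 0.70],
              [0.16, 0.24, 0.35, 0.50]])

rho_true = np.array([1.0, 0.10, 0.05, 0.02])  # mass densities, g/cm^3
signal = M @ rho_true                          # simulated 4-bin measurement

rho_est = np.linalg.solve(M, signal)           # invert the calibration
```

In the noise-free linear case the densities are recovered exactly; with noisy bins a least-squares solve would replace the direct inversion.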
Belief network algorithms: A study of performance
Jitnah, N.
1996-12-31
This abstract gives an overview of the work. We present a survey of Belief Network algorithms and propose a domain characterization system to be used as a basis for algorithm comparison and for predicting algorithm performance.
Improved multiprocessor garbage collection algorithms
Newman, I.A.; Stallard, R.P.; Woodward, M.C.
1983-01-01
Outlines the results of an investigation of existing multiprocessor garbage collection algorithms and introduces two new algorithms which significantly improve some aspects of the performance of their predecessors. The two algorithms arise from different starting assumptions. One considers the case where the algorithm will terminate successfully whatever list structure is being processed and assumes that the extra data space should be minimised. The other seeks a very fast garbage collection time for list structures that do not contain loops. Results of both theoretical and experimental investigations are given to demonstrate the efficacy of the algorithms. 7 references.
Sharkey, Keeper L.; Adamowicz, Ludwik; Department of Physics, University of Arizona, Tucson, Arizona 85721
2014-05-07
An algorithm for quantum-mechanical nonrelativistic variational calculations of L = 0 and M = 0 states of atoms with an arbitrary number of s electrons and with three p electrons has been implemented and tested in calculations of the ground {sup 4}S state of the nitrogen atom. The spatial part of the wave function is expanded in terms of all-electron explicitly correlated Gaussian functions with the appropriate pre-exponential Cartesian angular factors for states with the L = 0 and M = 0 symmetry. The algorithm includes formulas for calculating the Hamiltonian and overlap matrix elements, as well as formulas for calculating the analytic energy gradient determined with respect to the Gaussian exponential parameters. The gradient is used in the variational optimization of these parameters. The Hamiltonian used in the approach is obtained by rigorously separating the center-of-mass motion from the laboratory-frame all-particle Hamiltonian, and thus it explicitly depends on the finite mass of the nucleus. With that, the mass effect on the total ground-state energy is determined.
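Basis functions of the kind described above have the general form (the symbols and index conventions here are illustrative, not taken from the paper):

```latex
\phi_k(\mathbf{r}) \;=\; x_{i_k}\, y_{j_k}\, z_{l_k}\,
  \exp\!\left[-\,\mathbf{r}^{\mathsf{T}}\!\left(A_k \otimes I_3\right)\mathbf{r}\right]
```

where $\mathbf{r}$ collects the Cartesian coordinates of all electrons, the prefactor places one p-type angular factor on each of three electrons to produce L = 0, M = 0 symmetry, and $A_k$ is a symmetric positive-definite matrix of nonlinear parameters optimized variationally with the analytic gradient.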
R.J. Garrett
2002-01-14
As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.
Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]
2007-07-11
The Guide assists DOE/NNSA field elements and operating contractors in identifying and analyzing hazards at facilities and sites to provide the technical planning basis for emergency management programs. Supersedes DOE G 151.1-1, Volume 2.
Algorithmic advances in stochastic programming
Morton, D.P.
1993-07-01
Practical planning problems with deterministic forecasts of inherently uncertain parameters often yield unsatisfactory solutions. Stochastic programming formulations allow uncertain parameters to be modeled as random variables with known distributions, but the size of the resulting mathematical programs can be formidable. Decomposition-based algorithms take advantage of special structure and provide an attractive approach to such problems. We consider two classes of decomposition-based stochastic programming algorithms. The first type of algorithm addresses problems with a "manageable" number of scenarios. The second class incorporates Monte Carlo sampling within a decomposition algorithm. We develop and empirically study an enhanced Benders decomposition algorithm for solving multistage stochastic linear programs within a prespecified tolerance. The enhancements include warm start basis selection, preliminary cut generation, the multicut procedure, and decision tree traversing strategies. Computational results are presented for a collection of "real-world" multistage stochastic hydroelectric scheduling problems. Recently, there has been an increased focus on decomposition-based algorithms that use sampling within the optimization framework. These approaches hold much promise for solving stochastic programs with many scenarios. A critical component of such algorithms is a stopping criterion to ensure the quality of the solution. With this as motivation, we develop a stopping rule theory for algorithms in which bounds on the optimal objective function value are estimated by sampling. Rules are provided for selecting sample sizes and terminating the algorithm under which asymptotic validity of confidence interval statements for the quality of the proposed solution can be verified. Issues associated with the application of this theory to two sampling-based algorithms are considered, and preliminary empirical coverage results are presented.
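The sampling-based stopping idea can be illustrated simply: estimate the optimality gap from sampled bound differences and stop once an upper confidence limit on the gap falls below a tolerance. The rule and numbers below are a simplified stand-in for the theory developed in the report, not its actual procedure.

```python
import numpy as np

def stop(gap_samples, tol, z=1.96):
    """Simplified sampling-based stopping rule: halt when an upper
    confidence limit on the estimated optimality gap is below tol."""
    g = np.asarray(gap_samples, dtype=float)
    upper = g.mean() + z * g.std(ddof=1) / np.sqrt(g.size)
    return upper <= tol

# Simulated gap estimates from repeated bound sampling (invented numbers).
rng = np.random.default_rng(1)
gaps = rng.normal(0.01, 0.02, 400)
print(stop(gaps, tol=0.05))
```

The sample size enters through the width of the confidence limit, which is why the theory ties sample-size selection to the validity of the terminal confidence statement.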
Algorithms for builder guidelines
Balcomb, J.D.; Lekov, A.B.
1989-06-01
The Builder Guidelines are designed to make simple, appropriate guidelines available to builders for their specific localities. Builders may select from passive solar and conservation strategies with different performance potentials. They can then compare the calculated results for their particular house design with a typical house in the same location. Algorithms used to develop the Builder Guidelines are described. The main algorithms used are the monthly solar load ratio (SLR) method for winter heating, the diurnal heat capacity (DHC) method for temperature swing, and a new simplified calculation method (McCool) for summer cooling. This paper applies the algorithms to estimate the performance potential of passive solar strategies, and the annual heating and cooling loads of various combinations of conservation and passive solar strategies. The basis of the McCool method is described. All three methods are implemented in a microcomputer program used to generate the guideline numbers. Guidelines for Denver, Colorado, are used to illustrate the results. The structure of the guidelines and worksheet booklets is also presented. 5 refs., 3 tabs.
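As a minimal illustration of the diurnal heat capacity idea, the predicted indoor temperature swing is the diurnal solar gain stored in the building mass divided by its diurnal heat capacity. The numbers below are hypothetical, not guideline values.

```python
def temperature_swing(diurnal_gain_btu, dhc_btu_per_degf):
    """Diurnal heat capacity (DHC) estimate of indoor temperature swing:
    swing = diurnal solar gain stored in mass / diurnal heat capacity."""
    return diurnal_gain_btu / dhc_btu_per_degf

# Hypothetical sunspace: 60,000 Btu/day absorbed, DHC of 6,000 Btu/degF.
print(temperature_swing(60_000, 6_000))  # → 10.0 degF swing
```

A larger DHC (more exposed thermal mass) shrinks the swing for the same solar gain, which is the quantity the guideline worksheets trade off against heating performance.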
Hallquist, J.O.
1983-03-01
This report provides a theoretical manual for DYNA3D, a vectorized explicit three-dimensional finite element code for analyzing the large deformation dynamic response of inelastic solids. A contact-impact algorithm that permits gaps and sliding along material interfaces is described. By a specialization of this algorithm, such interfaces can be rigidly tied to admit variable zoning without the need of transition regions. Spatial discretization is achieved by the use of 8-node solid elements, and the equations-of-motion are integrated by the central difference method. DYNA3D is operational on the CRAY-1 and CDC7600 computers.
Theoretical Biology and Biophysics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Theoretical Biology and Biophysics Modeling biological systems and analysis and informatics of molecular and cellular biological data Mathematical BiologyImmunology Fundamental ...
Exploratory Development of Theoretical Methods | The Ames Laboratory
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Exploratory Development of Theoretical Methods Research Personnel Updates Publications Calculating Plutonium and Praseodymium Structural Transformations Read More Genetic Algorithm for Grain Boundary and Crystal Structure Predictions Read More Universal Dynamical Decoupling of a Single Solid-state Spin from a Spin Bath Read More Previous Pause Next Modeling The purpose of this FWP is to generate new theories, models, and algorithms that will be beneficial to the research programs at the Ames
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
This has a number of advantages, such as reduced dataset requirements, ability to ... then solve for these coefficients using statistical correlations in the dataset. ...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
that this equation is a difference- equation representation in the temporal domain of a first- order-in-time nonlinear partial differential equation. The co- efficient L k...
Library of Continuation Algorithms
Energy Science and Technology Software Center (OSTI)
2005-03-01
LOCA (Library of Continuation Algorithms) is scientific software written in C++ that provides advanced analysis tools for nonlinear systems. In particular, it provides parameter continuation algorithms, bifurcation tracking algorithms, and drivers for linear stability analysis. The algorithms are aimed at large-scale applications that use Newton's method for their nonlinear solve.
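Parameter continuation of the kind LOCA provides can be illustrated on a scalar problem: sweep the parameter and reuse each converged solution as the Newton starting guess for the next step. This is a toy sketch, not LOCA's API.

```python
import numpy as np

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration for a scalar equation f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Natural-parameter continuation on f(x; lam) = x**2 - lam:
# each converged solution seeds Newton's method at the next lam.
branch = []
x = 1.0
for lam in np.linspace(1.0, 4.0, 31):
    x = newton(lambda t: t * t - lam, lambda t: 2.0 * t, x)
    branch.append((lam, x))
```

Near folds or bifurcations this naive sweep fails, which is where LOCA's pseudo-arclength continuation and bifurcation tracking algorithms take over.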
Broader source: Energy.gov [DOE]
CRAD for Safety Basis (SB). Criteria Review and Approach Documents (CRADs) that can be used to conduct a well-organized and thorough assessment of elements of safety and health programs.
The Basis Code Development System
Energy Science and Technology Software Center (OSTI)
1994-03-15
BASIS9.4 is a system for developing interactive computer programs in Fortran, with some support for C and C++ as well. Using BASIS9.4 you can create a program that has a sophisticated programming language as its user interface so that the user can set, calculate with, and plot, all the major variables in the program. The program author writes only the scientific part of the program; BASIS9.4 supplies an environment in which to exercise that scientific programming which includes an interactive language, an interpreter, graphics, terminal logs, error recovery, macros, saving and retrieving variables, formatted I/O, and online documentation.
Basis functions for electronic structure calculations on spheres
Gill, Peter M. W.; Loos, Pierre-François; Agboola, Davids
2014-12-28
We introduce a new basis function (the spherical Gaussian) for electronic structure calculations on spheres of any dimension D. We find general expressions for the one- and two-electron integrals and propose an efficient computational algorithm incorporating the Cauchy-Schwarz bound. Using numerical calculations for the D = 2 case, we show that spherical Gaussians are more efficient than spherical harmonics when the electrons are strongly localized.
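The Cauchy-Schwarz bound mentioned above, |(ij|kl)| <= sqrt((ij|ij)(kl|kl)), lets an integral code skip negligible quartets using only precomputed diagonal integrals. A schematic version with invented values:

```python
import math

def schwarz_screen(pairs, eri_diag, threshold=1e-10):
    """Cauchy-Schwarz screening: keep an integral quartet (ij|kl) only
    when sqrt((ij|ij)*(kl|kl)) >= threshold, since that product bounds
    the quartet's magnitude."""
    kept = []
    for ij in pairs:
        for kl in pairs:
            if math.sqrt(eri_diag[ij] * eri_diag[kl]) >= threshold:
                kept.append((ij, kl))
    return kept

# Hypothetical diagonal two-electron integrals (ij|ij) for three pairs.
diag = {"ss": 1.0, "sp": 1e-8, "pp": 1e-16}
survivors = schwarz_screen(list(diag), diag, threshold=1e-10)
print(len(survivors))
```

Because the bound needs only the diagonal elements, the quadratic prescreen avoids computing the full quartic set of integrals.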
Energy Science and Technology Software Center (OSTI)
002651IBMPC00: Algorithm for Accounting for the Interactions of Multiple Renewable Energy Technologies in Estimation of Annual Performance
Easy and hard testbeds for real-time search algorithms
Koenig, S.; Simmons, R.G.
1996-12-31
Although researchers have studied which factors influence the behavior of traditional search algorithms, currently not much is known about how domain properties influence the performance of real-time search algorithms. In this paper we demonstrate, both theoretically and experimentally, that Eulerian state spaces (a superset of undirected state spaces) are very easy for some existing real-time search algorithms to solve: even real-time search algorithms that can be intractable, in general, are efficient for Eulerian state spaces. Because traditional real-time search testbeds (such as the eight puzzle and gridworlds) are Eulerian, they cannot be used to distinguish between efficient and inefficient real-time search algorithms. It follows that one has to use non-Eulerian domains to demonstrate the general superiority of a given algorithm. To this end, we present two classes of hard-to-search state spaces and demonstrate the performance of various real-time search algorithms on them.
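A representative real-time search algorithm of the kind studied here is LRTA*-style search: move greedily on the heuristic, updating the current state's heuristic value as you go. The small undirected (hence Eulerian) graph and zero-initialized heuristic below are illustrative.

```python
def lrta_star(start, goal, neighbors, h, max_steps=10_000):
    """LRTA*-style real-time search on a unit-cost graph: repeatedly
    move to the neighbor minimizing 1 + h, after raising h at the
    current state to the best neighbor's one-step lookahead value."""
    s, steps = start, 0
    while s != goal and steps < max_steps:
        best = min(neighbors[s], key=lambda n: 1 + h.get(n, 0))
        h[s] = max(h.get(s, 0), 1 + h.get(best, 0))  # learn a tighter estimate
        s, steps = best, steps + 1
    return steps

# Undirected 4-cycle 0-1-2-3-0; every vertex has even degree.
nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(lrta_star(0, 2, nbrs, h={}))
```

On Eulerian spaces such updates cannot trap the agent in long detours, which is one intuition behind the paper's tractability result.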
Authorization basis requirements comparison report
Brantley, W.M.
1997-08-18
The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.
Hanford Generic Interim Safety Basis
Lavender, J.C.
1994-09-09
The purpose of this document is to identify WHC programs and requirements that are an integral part of the authorization basis for nuclear facilities that are generic to all WHC-managed facilities. The purpose of these programs is to implement the DOE Orders, as WHC becomes contractually obligated to implement them. The Hanford Generic ISB focuses on the institutional controls and safety requirements identified in DOE Order 5480.23, Nuclear Safety Analysis Reports.
OSR encapsulation basis -- 100-KW
Meichle, R.H.
1995-01-27
The purpose of this report is to provide the basis for a change in the Operations Safety Requirement (OSR) encapsulated fuel storage requirements in the 105 KW fuel storage basin which will permit the handling and storing of encapsulated fuel in canisters which no longer have a water-free space in the top of the canister. The scope of this report is limited to providing the change from the perspective of the safety envelope (bases) of the Safety Analysis Report (SAR) and Operations Safety Requirements (OSR). It does not change the encapsulation process itself.
Internal dosimetry technical basis manual
Not Available
1990-12-20
The internal dosimetry program at the Savannah River Site (SRS) consists of radiation protection programs and activities used to detect and evaluate intakes of radioactive material by radiation workers. Examples of such programs are: air monitoring; surface contamination monitoring; personal contamination surveys; radiobioassay; and dose assessment. The objectives of the internal dosimetry program are to demonstrate that the workplace is under control and that workers are not being exposed to radioactive material, and to detect and assess inadvertent intakes in the workplace. The Savannah River Site Internal Dosimetry Technical Basis Manual (TBM) is intended to provide a technical and philosophical discussion of the radiobioassay and dose assessment aspects of the internal dosimetry program. Detailed information on air, surface, and personal contamination surveillance programs is not given in this manual except for how these programs interface with routine and special bioassay programs.
Beyond Design Basis Events | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Following the March 2011 Fukushima Daiichi nuclear plant accident in Japan, DOE embarked upon several initiatives to investigate the safety posture of its nuclear facilities relative to beyond design basis events (BDBEs). These initiatives included issuing Safety Bulletin 2011-01, Events Beyond Design Safety Basis Analysis, and conducting two DOE nuclear safety workshops. DOE also issued two reports documenting the
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Feller, D; Schuchardt, Karen L.; Didier, Brett T.; Elsethagen, Todd; Sun, Lisong; Gurumoorthi, Vidhya; Chase, Jared; Li, Jun
The Basis Set Exchange (BSE) provides a web-based user interface for downloading and uploading Gaussian-type (GTO) basis sets, including effective core potentials (ECPs), from the EMSL Basis Set Library. It provides an improved user interface and capabilities over its predecessor, the EMSL Basis Set Order Form, for exploring the contents of the EMSL Basis Set Library. The popular Basis Set Order Form and underlying Basis Set Library were originally developed by Dr. David Feller and have been available from the EMSL webpages since 1994. BSE not only allows downloading of the more than 200 Basis sets in various formats; it allows users to annotate existing sets and to upload new sets. (Specialized Interface)
James R. Chelikowsky
2009-03-31
The work reported here took place at the University of Minnesota from September 15, 2003 to November 14, 2005. This funding resulted in 10 invited articles or book chapters, 37 articles in refereed journals and 13 invited talks. The funding helped train 5 PhD students. The research supported by this grant focused on developing theoretical methods for predicting and understanding the properties of matter at the nanoscale. Within this regime, new phenomena occur that are characteristic of neither the atomic limit, nor the crystalline limit. Moreover, this regime is crucial for understanding the emergence of macroscopic properties such as ferromagnetism. For example, elemental Fe clusters possess magnetic moments that reside between the atomic and crystalline limits, but the transition from the atomic to the crystalline limit is not a simple interpolation between the two size regimes. To capitalize properly on predicting such phenomena in this transition regime, a deeper understanding of the electronic, magnetic and structural properties of matter is required, e.g., electron correlation effects are enhanced within this size regime and the surface of a confined system must be explicitly included. A key element of our research involved the construction of new algorithms to address problems peculiar to the nanoscale. Typically, one would like to consider systems with thousands of atoms or more, e.g., a silicon nanocrystal that is 7 nm in diameter would contain over 10,000 atoms. Previous ab initio methods could address systems with hundreds of atoms whereas empirical methods can routinely handle hundreds of thousands of atoms (or more). However, these empirical methods often rely on ad hoc assumptions and lack incorporation of structural and electronic degrees of freedom. The key theoretical ingredients in our work involved the use of ab initio pseudopotentials and density functional approaches. The key numerical ingredients involved the implementation of algorithms for
Tank characterization technical sampling basis
Brown, T.M.
1998-04-28
Tank Characterization Technical Sampling Basis (this document) is the first step of an in-place working process to plan characterization activities in an optimal manner. This document will be used to develop the revision of the Waste Information Requirements Document (WIRD) (Winkelman et al. 1997) and ultimately, to create sampling schedules. The revised WIRD will define all Characterization Project activities over the course of subsequent fiscal years 1999 through 2002. This document establishes priorities for sampling and characterization activities conducted under the Tank Waste Remediation System (TWRS) Tank Waste Characterization Project. The Tank Waste Characterization Project is designed to provide all TWRS programs with information describing the physical, chemical, and radiological properties of the contents of waste storage tanks at the Hanford Site. These tanks contain radioactive waste generated from the production of nuclear weapons materials at the Hanford Site. The waste composition varies from tank to tank because of the large number of chemical processes that were used when producing nuclear weapons materials over the years and because the wastes were mixed during efforts to better use tank storage space. The Tank Waste Characterization Project mission is to provide information and waste sample material necessary for TWRS to define and maintain safe interim storage and to process waste fractions into stable forms for ultimate disposal. This document integrates the information needed to address safety issues, regulatory requirements, and retrieval, treatment, and immobilization requirements. Characterization sampling to support tank farm operational needs is also discussed.
Advanced Fuel Cycle Cost Basis
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider
2008-03-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules: 23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.
Advanced Fuel Cycle Cost Basis
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert; E. Schneider
2009-12-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 25 cost modules: 23 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, transuranic, and high-level waste.
Advanced Fuel Cycle Cost Basis
D. E. Shropshire; K. A. Williams; W. B. Boore; J. D. Smith; B. W. Dixon; M. Dunzik-Gougar; R. D. Adams; D. Gombert
2007-04-01
This report, commissioned by the U.S. Department of Energy (DOE), provides a comprehensive set of cost data supporting a cost analysis for the relative economic comparison of options for use in the Advanced Fuel Cycle Initiative (AFCI) Program. The report describes the AFCI cost basis development process, reference information on AFCI cost modules, a procedure for estimating fuel cycle costs, economic evaluation guidelines, and a discussion on the integration of cost data into economic computer models. This report contains reference cost data for 26 cost modules: 24 fuel cycle cost modules and 2 reactor modules. The cost modules were developed in the areas of natural uranium mining and milling, conversion, enrichment, depleted uranium disposition, fuel fabrication, interim spent fuel storage, reprocessing, waste conditioning, spent nuclear fuel (SNF) packaging, long-term monitored retrievable storage, near surface disposal of low-level waste (LLW), geologic repository and other disposal concepts, and transportation processes for nuclear fuel, LLW, SNF, and high-level waste.
Energy Science and Technology Software Center (OSTI)
2013-07-29
The OpenEIS Algorithm package seeks to provide a low-risk path for building owners, service providers and managers to explore analytical methods for improving building control and operational efficiency. Users of this software can analyze building data, and learn how commercial implementations would provide long-term value. The code also serves as a reference implementation for developers who wish to adapt the algorithms for use in commercial tools or service offerings.
Safety Basis Information System | Department of Energy
Click on the above link to log in to the Safety Basis web interface (access is restricted). A separate link provides the form to request access to the Safety Basis web interface.
Theoretical Division Current Job Openings
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Theoretical Division Job Openings. Explore the multiple dimensions of a career at Los Alamos Lab: work with the best minds on the planet in an inclusive environment that is rich in intellectual vitality and opportunities for growth. Click on the Job Number to be directed to the description/application page. Postdoc Positions IRC50385 Staff Scientist: Material Informatics IRC50253 Staff Scientist: Quantum Information and Quantum Physics IRC49276 Theoretical and
ON THE VERIFICATION AND VALIDATION OF GEOSPATIAL IMAGE ANALYSIS ALGORITHMS
Roberts, Randy S.; Trucano, Timothy G.; Pope, Paul A.; Aragon, Cecilia R.; Jiang , Ming; Wei, Thomas; Chilton, Lawrence; Bakel, A. J.
2010-07-25
Verification and validation (V&V) of geospatial image analysis algorithms is a difficult task and is becoming increasingly important. While there are many types of image analysis algorithms, we focus on developing V&V methodologies for algorithms designed to provide textual descriptions of geospatial imagery. In this paper, we present a novel methodological basis for V&V that employs a domain-specific ontology, which provides a naming convention for a domain-bounded set of objects and a set of named relationships between these objects. We describe a validation process that proceeds through objectively comparing benchmark imagery, produced using the ontology, with algorithm results. As an example, we describe how the proposed V&V methodology would be applied to algorithms designed to provide textual descriptions of facilities.
New Effective Multithreaded Matching Algorithms
Manne, Fredrik; Halappanavar, Mahantesh
2014-05-19
Matching is an important combinatorial problem with a number of applications in areas such as community detection, sparse linear algebra, and network alignment. Since computing optimal matchings can be very time consuming, several fast approximation algorithms, both sequential and parallel, have been suggested. Common to the algorithms giving the best solutions is that they tend to be sequential by nature, while algorithms more suitable for parallel computation give lower-quality solutions. We present a new simple 1/2-approximation algorithm for the weighted matching problem. This algorithm is both faster than any other suggested sequential 1/2-approximation algorithm on almost all inputs and also scales better than previous multithreaded algorithms. We further extend this to a general scalable multithreaded algorithm that computes matchings of weight comparable with the best sequential algorithms. The performance of the suggested algorithms is documented through extensive experiments on different multithreaded architectures.
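For context, the classic sequential 1/2-approximation baseline that such work improves upon is the greedy heaviest-edge-first matching. The sketch below is illustrative only (it is not the authors' suitor-style algorithm), and the function name `greedy_matching` is our own:

```python
# Classic greedy 1/2-approximation for weighted matching:
# scan edges in order of decreasing weight and keep an edge
# whenever both endpoints are still unmatched.
def greedy_matching(edges):
    """edges: list of (weight, u, v). Returns the kept edges as (u, v) pairs."""
    matched = set()   # vertices already covered by the matching
    matching = []
    for w, u, v in sorted(edges, reverse=True):  # heaviest edge first
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

On the path a-b (3), b-c (4), c-d (3) the greedy choice keeps only (b, c) with weight 4, while the optimum is 6; this is exactly the factor-2 gap the approximation guarantee allows.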
Lightning Talks 2015: Theoretical Division
Shlachter, Jack S.
2015-11-25
This document is a compilation of slides from a number of student presentations given to LANL Theoretical Division members. The subjects cover the range of activities of the Division, including plasma physics, environmental issues, materials research, bacterial resistance to antibiotics, and computational methods.
Energy Science and Technology Software Center (OSTI)
2005-03-30
The Robotic Follow Algorithm enables any robotic vehicle to follow a moving target while reactively choosing a route around nearby obstacles. The robotic follow behavior can be used with different camera systems and can be used with thermal or visual tracking as well as other tracking methods such as radio frequency tags.
Property:ExplorationBasis | Open Energy Information
Text Description Exploration Basis Why was exploration work conducted in this area (e.g., USGS report of a geothermal resource, hot springs with geothermometry indicating...
Structural basis for Tetrahymena telomerase processivity factor...
Office of Scientific and Technical Information (OSTI)
factor Teb1 binding to single-stranded telomeric-repeat DNA Citation Details In-Document Search Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 ...
Beyond Design Basis Event Pilot Evaluations
Office of Energy Efficiency and Renewable Energy (EERE)
This document provides Results and Recommendations for Improvements to Enhance Nuclear Safety at Department of Energy Nuclear Facilities based upon Beyond Design Basis Event Pilot Evaluations
On constructing optimistic simulation algorithms for the discrete event system specification
Nutaro, James J
2008-01-01
This article describes a Time Warp simulation algorithm for discrete event models that are described in terms of the Discrete Event System Specification (DEVS). The article shows how the total state transition and total output function of a DEVS atomic model can be transformed into an event processing procedure for a logical process. A specific Time Warp algorithm is constructed around this logical process, and it is shown that the algorithm correctly simulates a DEVS coupled model that consists entirely of interacting atomic models. The simulation algorithm is presented abstractly; it is intended to provide a basis for implementing efficient and scalable parallel algorithms that correctly simulate DEVS models.
Theoretical Plasma Physicist | Princeton Plasma Physics Lab
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Theoretical Plasma Physicist Department: Theory Supervisor(s): Amitava Bhattacharjee ... Department has an opening at the rank of Research Physicist in theoretical plasma physics. ...
Theoretical energy release of thermites, intermetallics, and...
Office of Scientific and Technical Information (OSTI)
Theoretical energy release of thermites, intermetallics, and combustible metals Citation Details In-Document Search Title: Theoretical energy release of thermites, intermetallics, and ...
Dynamical properties of non-ideal plasma on the basis of effective potentials
Ramazanov, T. S.; Kodanova, S. K.; Moldabekov, Zh. A.; Issanova, M. K.
2013-11-15
In this work, stopping power has been calculated on the basis of the Coulomb logarithm using the effective potentials. Calculations of the Coulomb logarithm and stopping power for different interaction potentials and degrees of ionization are compared. The comparison with the data of other theoretical and experimental works was carried out.
Large scale tracking algorithms.
Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry
2015-01-01
Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
Williams, P.T.
1993-09-01
As the field of computational fluid dynamics (CFD) continues to mature, algorithms are required to exploit the most recent advances in approximation theory, numerical mathematics, computing architectures, and hardware. Meeting this requirement is particularly challenging in incompressible fluid mechanics, where primitive-variable CFD formulations that are robust, while also accurate and efficient in three dimensions, remain an elusive goal. This dissertation asserts that one key to accomplishing this goal is recognition of the dual role assumed by the pressure, i.e., a mechanism for instantaneously enforcing conservation of mass and a force in the mechanical balance law for conservation of momentum. Proving this assertion has motivated the development of a new, primitive-variable, incompressible, CFD algorithm called the Continuity Constraint Method (CCM). The theoretical basis for the CCM consists of a finite-element spatial semi-discretization of a Galerkin weak statement, equal-order interpolation for all state-variables, a θ-implicit time-integration scheme, and a quasi-Newton iterative procedure extended by a Taylor Weak Statement (TWS) formulation for dispersion error control. Original contributions to algorithmic theory include: (a) formulation of the unsteady evolution of the divergence error, (b) investigation of the role of non-smoothness in the discretized continuity-constraint function, (c) development of a uniformly H{sup 1} Galerkin weak statement for the Reynolds-averaged Navier-Stokes pressure Poisson equation, (d) derivation of physically and numerically well-posed boundary conditions, and (e) investigation of sparse data structures and iterative methods for solving the matrix algebra statements generated by the algorithm.
A new augmentation based algorithm for extracting maximal chordal subgraphs
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Bhowmick, Sanjukta; Chen, Tzu-Yi; Halappanavar, Mahantesh
2014-10-18
A graph is chordal if every cycle of length greater than three contains an edge between non-adjacent vertices. Chordal graphs are of interest both theoretically, since they admit polynomial time solutions to a range of NP-hard graph problems, and practically, since they arise in many applications including sparse linear algebra, computer vision, and computational biology. A maximal chordal subgraph is a chordal subgraph that is not a proper subgraph of any other chordal subgraph. Existing algorithms for computing maximal chordal subgraphs depend on dynamically ordering the vertices, which is an inherently sequential process and therefore limits the algorithms' parallelizability. In our paper we explore techniques to develop a scalable parallel algorithm for extracting a maximal chordal subgraph. We demonstrate that an earlier attempt at developing a parallel algorithm may induce a non-optimal vertex ordering and is therefore not guaranteed to terminate with a maximal chordal subgraph. We then give a new algorithm that first computes and then repeatedly augments a spanning chordal subgraph. After proving that the algorithm terminates with a maximal chordal subgraph, we then demonstrate that this algorithm is more amenable to parallelization and that the parallel version also terminates with a maximal chordal subgraph. That said, the complexity of the new algorithm is higher than that of the previous parallel algorithm, although the earlier algorithm computes a chordal subgraph which is not guaranteed to be maximal. Finally, we experimented with our augmentation-based algorithm on both synthetic and real-world graphs. We provide scalability results and also explore the effect of different choices for the initial spanning chordal subgraph on both the running time and on the number of edges in the maximal chordal subgraph.
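The augment-while-chordal idea can be sketched on a small scale. The chordality test below uses the standard perfect-elimination characterization (repeatedly delete a simplicial vertex, i.e., one whose neighbors form a clique); the sequential, brute-force augmentation loop is our own illustrative stand-in for the paper's parallel algorithm, and both function names are hypothetical:

```python
def is_chordal(adj):
    """adj: dict vertex -> set of neighbors. A graph is chordal iff we can
    repeatedly delete a simplicial vertex until none remain."""
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    while adj:
        simplicial = next((v for v, ns in adj.items()
                           if all(b in adj[a] for a in ns for b in ns if a != b)),
                          None)
        if simplicial is None:
            return False           # no simplicial vertex left: not chordal
        for n in adj[simplicial]:  # delete it from its neighbors' lists
            adj[n].discard(simplicial)
        del adj[simplicial]
    return True

def maximal_chordal_subgraph(vertices, edges):
    """Greedy augmentation: keep adding edges that preserve chordality
    until no remaining edge can be added (the empty graph is chordal)."""
    adj = {v: set() for v in vertices}
    kept, remaining = [], list(edges)
    changed = True
    while changed:                 # repeat passes until a fixpoint is reached
        changed = False
        for u, v in list(remaining):
            adj[u].add(v); adj[v].add(u)
            if is_chordal(adj):
                kept.append((u, v)); remaining.remove((u, v)); changed = True
            else:
                adj[u].discard(v); adj[v].discard(u)  # undo the trial edge
    return kept
```

On a 4-cycle, the loop keeps three edges and rejects the closing edge, since a chordless 4-cycle is the smallest non-chordal graph.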
Theoretical Estimate of Maximum Possible Nuclear Explosion
DOE R&D Accomplishments [OSTI]
Bethe, H. A.
1950-01-31
The maximum nuclear accident which could occur in a Na-cooled, Be moderated, Pu and power producing reactor is estimated theoretically. (T.R.H.) Results of nuclear calculations for a variety of compositions of fast, heterogeneous, sodium-cooled, U-235-fueled, plutonium- and power-producing reactors are reported. Core compositions typical of plate-, pin-, or wire-type fuel elements and with uranium as metal, alloy, and oxide were considered. These compositions included atom ratios in the following range: U-238 to U-235 from 2 to 8; sodium to U-235 from 1.5 to 12; iron to U-235 from 5 to 18; and vanadium to U-235 from 11 to 33. Calculations were performed to determine the effect of lead and iron reflectors between the core and blanket. Both natural and depleted uranium were evaluated as the blanket fertile material. Reactors were compared on a basis of conversion ratio, specific power, and the product of both. The calculated results are in general agreement with the experimental results from fast reactor assemblies. An analysis of the effect of new cross-section values as they became available is included. (auth)
Theoretical studies of combustion dynamics
Bowman, J.M.
1993-12-01
The basic objectives of this research program are to develop and apply theoretical techniques to fundamental dynamical processes of importance in gas-phase combustion. There are two major areas currently supported by this grant. One is reactive scattering of diatom-diatom systems, and the other is the dynamics of complex formation and decay based on L{sup 2} methods. In all of these studies, the authors focus on systems that are of interest experimentally, and for which potential energy surfaces based, at least in part, on ab initio calculations are available.
Safety Basis Information System | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Request Click on the above link to access the form to request access to the Safety Basis web interface. If you need assistance logging in, please AU UserSupport. Contact Nimi Rao...
Basis for UCNI | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
UCNI Basis for UCNI What documents contain the legal and policy foundations for the UCNI program? Section 148 of the Atomic Energy Act of 1954, as amended (42 U.S.C. 2011 et seq.), is the statutory basis for the UCNI program. 10 CFR Part 1017, Identification and Protection of Unclassified Controlled Nuclear Information specifies many detailed policies and requirements concerning the UCNI program. DOE O 471.1B, Identification and Protection of Unclassified Controlled Nuclear Information,
Theoretical perspectives on strange physics
Ellis, J.
1983-04-01
Kaons are heavy enough to have an interesting range of decay modes available to them, and light enough to be produced in sufficient numbers to explore rare modes with satisfying statistics. Kaons and their decays have provided at least two major breakthroughs in our knowledge of fundamental physics. They have revealed to us CP violation, and their lack of flavor-changing neutral interactions warned us to expect charm. In addition, K{sup 0}-anti-K{sup 0} mixing has provided us with one of our most elegant and sensitive laboratories for testing quantum mechanics. There is every reason to expect that future generations of kaon experiments with intense sources would add further to our knowledge of fundamental physics. This talk attempts to set future kaon experiments in a general theoretical context, and indicate how they may bear upon fundamental theoretical issues. A survey of different experiments which would be done with an Intense Medium Energy Source of Strangeness, including rare K decays, probes of the nature of CP violation, muon decays, hyperon decays and neutrino physics is given. (WHK)
Rossi, Tuomas P.; Sakko, Arto; Puska, Martti J.; Lehtola, Susi; Nieminen, Risto M.
2015-03-07
We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.
Critical review of theoretical models for anomalous effects in deuterated metals
Chechin, V.A.; Tsarev, V.A.; Rabinowitz, M.; Kim, Y.E.
1994-03-01
The authors briefly summarize the reported anomalous effects in deuterated metals at ambient temperature, commonly known as "cold fusion" (CF), with an emphasis on the latest experiments, as well as the theoretical basis for the opposition to interpreting them as cold fusion. Then they critically examine more than 25 theoretical models for CF, including unusual nuclear and exotic chemical hypotheses. They conclude that these models do not explain the data.
Gu, Renliang, E-mail: ald@iastate.edu; Dogandžić, Aleksandar, E-mail: ald@iastate.edu
2015-03-31
We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov's proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
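The proximal-gradient building block used for the density-map step can be illustrated in its simplest form: minimizing 0.5*||Ax - b||^2 subject to x >= 0, where the proximal operator of the nonnegativity constraint is projection onto the nonnegative orthant. This is only a toy sketch under assumed names (`prox_grad_nnls`); the paper's algorithm additionally uses Nesterov acceleration, a wavelet-sparsity penalty, and an alternating active-set step for the spectrum:

```python
# Proximal-gradient (ISTA-style) solver for nonnegative least squares:
# gradient step on 0.5*||Ax - b||^2, then prox step = clip to x >= 0.
def prox_grad_nnls(A, b, step, iters=500):
    """A: m x n matrix (list of rows), b: length-m vector. Pure Python."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - b and gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # prox of the nonnegativity indicator is projection onto x >= 0
        x = [max(0.0, x[j] - step * g[j]) for j in range(n)]
    return x
```

With A the identity and b = (2, -1), the iteration drives the first coordinate to 2 and pins the second at the constraint boundary 0, which is the exact constrained minimizer.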
Theoretical Nuclear Physics - Research - Cyclotron Institute
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Theoretical Nuclear Physics By addressing this elastic scattering indirect technique, we hope that more accurate measurements of elastic scattering data will provide very important astrophysical information. Progress toward understanding the structure and behavior of strongly interacting many-body systems requires detailed theoretical study. The theoretical physics program concentrates on the development of fundamental and phenomenological models of nuclear behavior. In some systems, the
Structural Basis for Activation of Cholera Toxin
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Structural Basis for Activation of Cholera Toxin Structural Basis for Activation of Cholera Toxin Print Wednesday, 30 November 2005 00:00 Cholera is a serious disease that claims thousands of victims each year in third-world, war-torn, and disaster-stricken nations. The culprit is the bacterium Vibrio cholerae, which can be ingested through contaminated food or water and colonizes the mucous membrane of the human small intestine. There, it secretes cholera toxin (CT), a protein whose A1 subunit
Smith, Kyle K. G.; Poulsen, Jens Aage; Nyman, Gunnar; Rossky, Peter J.
2015-06-28
We develop two classes of quasi-classical dynamics that are shown to conserve the initial quantum ensemble when used in combination with the Feynman-Kleinert approximation of the density operator. These dynamics are used to improve the Feynman-Kleinert implementation of the classical Wigner approximation for the evaluation of quantum time correlation functions known as Feynman-Kleinert linearized path-integral. As shown, both classes of dynamics are able to recover the exact classical and high temperature limits of the quantum time correlation function, while a subset is able to recover the exact harmonic limit. A comparison of the approximate quantum time correlation functions obtained from both classes of dynamics is made with the exact results for the challenging model problems of the quartic and double-well potentials. It is found that these dynamics provide a great improvement over the classical Wigner approximation, in which purely classical dynamics are used. In a special case, our first method becomes identical to centroid molecular dynamics.
TWRS authorization basis configuration control summary
Mendoza, D.P.
1997-12-26
This document was developed to define the Authorization Basis management functional requirements for configuration control, to evaluate the management control systems currently in place, and identify any additional controls that may be required until the TWRS [Tank Waste Remediation System] Configuration Management system is fully in place.
CRAD, Facility Safety- Nuclear Facility Safety Basis
Office of Energy Efficiency and Renewable Energy (EERE)
A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) that can be used for assessment of a contractor's Nuclear Facility Safety Basis.
Research in Theoretical Particle Physics
Feldman, Hume A; Marfatia, Danny
2014-09-24
This document is the final report on activity supported under DOE Grant Number DE-FG02-13ER42024. The report covers the period July 15, 2013 to March 31, 2014. Faculty supported by the grant during the period were Danny Marfatia (1.0 FTE) and Hume Feldman (1% FTE). The grant partly supported University of Hawaii students, David Yaylali and Keita Fukushima, who are supervised by Jason Kumar. Both students are expected to graduate with Ph.D. degrees in 2014. Yaylali will be joining the University of Arizona theory group in Fall 2014 with a 3-year postdoctoral appointment under Keith Dienes. The group's research covered topics subsumed under the Energy Frontier, the Intensity Frontier, and the Cosmic Frontier. Many theoretical results related to the Standard Model and models of new physics were published during the reporting period. The report contains brief project descriptions in Section 1. Sections 2 and 3 list published and submitted work, respectively. Sections 4 and 5 summarize group activity including conferences, workshops and professional presentations.
Arctic Mixed-Phase Cloud Properties from AERI Lidar Observations: Algorithm and Results from SHEBA
Turner, David D.
2005-04-01
A new approach to retrieve microphysical properties from mixed-phase Arctic clouds is presented. This mixed-phase cloud property retrieval algorithm (MIXCRA) retrieves cloud optical depth, ice fraction, and the effective radius of the water and ice particles from ground-based, high-resolution infrared radiance and lidar cloud boundary observations. The theoretical basis for this technique is that the absorption coefficient of ice is greater than that of liquid water from 10 to 13 µm, whereas liquid water is more absorbing than ice from 16 to 25 µm. MIXCRA retrievals are only valid for optically thin (τ_visible < 6) single-layer clouds when the precipitable water vapor is less than 1 cm. MIXCRA was applied to the Atmospheric Emitted Radiance Interferometer (AERI) data that were collected during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment from November 1997 to May 1998, where 63% of all of the cloudy scenes above the SHEBA site met this specification. The retrieval determined that approximately 48% of these clouds were mixed phase and that a significant number of clouds (during all 7 months) contained liquid water, even for cloud temperatures as low as 240 K. The retrieved distributions of effective radii for water and ice particles in single-phase clouds are shown to be different than the effective radii in mixed-phase clouds.
Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents
Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]
2014-12-19
This Standard describes a framework and the criteria to be used for approval of (1) safety basis documents, as required by 10 Code of Federal Regulation (C.F.R.) 830, Nuclear Safety Management, and (2) safety design basis documents, as required by Department of Energy (DOE) Standard (STD)-1189-2008, Integration of Safety into the Design Process.
Cubit Adaptive Meshing Algorithm Library
Energy Science and Technology Software Center (OSTI)
2004-09-01
CAMAL (Cubit Adaptive Meshing Algorithm Library) is a software component library for mesh generation. CAMAL 2.0 includes components for triangle, quad and tetrahedral meshing. A simple Application Programmers Interface (API) takes a discrete boundary definition and CAMAL computes a quality interior unstructured grid. The triangle and quad algorithms may also import a geometric definition of a surface on which to define the grid. CAMAL's triangle meshing uses a 3D space advancing front method, the quad meshing algorithm is based upon Sandia's patented paving algorithm, and the tetrahedral meshing algorithm employs the GHS3D-Tetmesh component developed by INRIA, France.
Graph Characterization and Sampling Algorithms
Office of Scientific and Technical Information (OSTI)
Sandia National Laboratories ubiquitous Computer traffic Social networks Biological ... conference on Innovations in theoretical computer science, pp. 471-482, 2014, doi:10.1145...
TECHNICAL BASIS DOCUMENT FOR NATURAL EVENT HAZARDS
KRIPPS, L.J.
2006-07-31
This technical basis document was developed to support the documented safety analysis (DSA) and describes the risk binning process and the technical basis for assigning risk bins for natural event hazard (NEH)-initiated accidents. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous conditions based on an evaluation of the frequency and consequence. Note that the risk binning process is not applied to facility workers, because all facility worker hazardous conditions are considered for safety-significant SSCs and/or TSR-level controls.
Radioactive Waste Management Basis, April 2006
Perkins, B K
2011-08-31
This Radioactive Waste Management Basis (RWMB) documents radioactive waste management practices adopted at Lawrence Livermore National Laboratory (LLNL) pursuant to Department of Energy (DOE) Order 435.1, Radioactive Waste Management. The purpose of this Radioactive Waste Management Basis is to describe the systematic approach for planning, executing, and evaluating the management of radioactive waste at LLNL. The implementation of this document will ensure that waste management activities at LLNL are conducted in compliance with the requirements of DOE Order 435.1, Radioactive Waste Management, and the Implementation Guide for DOE Manual 435.1-1, Radioactive Waste Management Manual. Technical justification is provided where methods for meeting the requirements of DOE Order 435.1 deviate from the DOE Manual 435.1-1 and Implementation Guide.
Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis
Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen
2014-03-01
The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods such as first order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to further reduction in computational time since the high order grids necessary for accurately estimating the near zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistent good performance both in
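The NISP idea at its core is projection of a response onto an orthogonal polynomial basis. A minimal one-dimensional sketch, assuming a uniform input on [-1, 1] and Legendre polynomials with simple midpoint quadrature (the paper's adaptive sparse grids replace this quadrature in many dimensions; the function names are ours):

```python
def legendre(k, x):
    """Legendre polynomial P_k(x) via the three-term recurrence
    (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    p0, p1 = 1.0, x
    if k == 0:
        return p0
    for n in range(1, k):
        p0, p1 = p1, ((2 * n + 1) * x * p1 - n * p0) / (n + 1)
    return p1

def pce_coeffs(f, order, n=2000):
    """NISP projection: c_k = (2k+1)/2 * int_{-1}^{1} f(x) P_k(x) dx,
    approximated with an n-point midpoint rule."""
    h = 2.0 / n
    xs = [-1.0 + (i + 0.5) * h for i in range(n)]
    return [(2 * k + 1) / 2.0 * h * sum(f(x) * legendre(k, x) for x in xs)
            for k in range(order + 1)]
```

For f(x) = x^2 the exact expansion is (1/3) P_0 + (2/3) P_2, so the projected coefficients are approximately (1/3, 0, 2/3); the near-zero odd coefficient is exactly the kind of term the basis-adaptive methods above would drop from the sparse PCE.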
Theoretical High Energy Physics | Argonne National Laboratory
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Much of the work of high-energy physics concentrates on the interplay between theory and experiment. The theory group of Argonne's High Energy Physics Division performs high-precision calculations of Standard Model processes, interprets experimental data in terms of
Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
SENSITIVE DOE-STD-1104-2014 December 2014 Superseding DOE-STD-1104-2009 DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS DOCUMENTS U.S. Department of Energy AREA SAFT Washington, DC 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. DOE-STD-1104-2014 i FOREWORD 1. This Standard describes a framework and the criteria to be used for approval of (1) safety basis documents, as required by 10 Code of Federal Regulation
Catalyst by Design - Theoretical, Nanostructural, and Experimental...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Oxidation Catalyst for Diesel Engine Emission Treatment Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Oxidation Catalyst for Diesel Engine Emission ...
Catalyst by Design - Theoretical, Nanostructural, and Experimental...
Broader source: Energy.gov (indexed) [DOE]
Catalyst by Design - Theoretical, Nanostructural, and Experimental Studies of Oxidation Catalyst for Diesel Engine Emission Treatment Catalysts via First Principles ...
COLLOQUIUM: Theoretical and Experimental Aspects of Controlled...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
5:30pm MBG Auditorium COLLOQUIUM: Theoretical and Experimental Aspects of Controlled Quantum Dynamics. Professor Herschel Rabitz, Princeton University. Abstract: ...
2005 American Conference on Theoretical Chemistry
Carter, Emily A
2006-11-19
The materials uploaded are meant to serve as final report on the funds provided by DOE-BES to help sponsor the 2005 American Conference on Theoretical Chemistry.
Theoretical calculation of the thermodynamic properties of solid...
Office of Scientific and Technical Information (OSTI)
calculations, a theoretical screening methodology to identify the most promising COsub ... Such a methodology can be used not only to search for good candidates from existing databases ...
ORISE: The Medical Basis for Radiation-Accident Preparedness...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The Medical Basis for Radiation-Accident Preparedness: Medical Management Proceedings of the Fifth International REACTS Symposium on the Medical Basis for Radiation-Accident ...
Nuclear Safety Basis Program Review Overview and Management Oversight...
Office of Environmental Management (EM)
Nuclear Safety Basis Program Review Overview and Management Oversight Standard Review Plan ...
Structural Basis for the Interaction between Pyk2-FAT Domain...
Office of Scientific and Technical Information (OSTI)
Structural Basis for the Interaction between Pyk2-FAT Domain and Leupaxin LD Repeats ...
Heavy quarkonium in a holographic basis (Journal Article) | DOE...
Office of Scientific and Technical Information (OSTI)
Heavy quarkonium in a holographic basis. Authors: Li, Yang ...
Los Alamos National Laboratory fission basis (Conference) | SciTech...
Office of Scientific and Technical Information (OSTI)
Los Alamos National Laboratory fission basis ...
Structural Basis of Prion Inhibition by Phenothiazine Compounds...
Office of Scientific and Technical Information (OSTI)
Journal Article: Structural Basis of Prion Inhibition by Phenothiazine Compounds ...
Technical Cost Modeling - Life Cycle Analysis Basis for Program...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus ...
A molecular basis for advanced materials in water treatment....
Office of Scientific and Technical Information (OSTI)
A molecular basis for advanced materials in water treatment. Authors: Rempe, ...
NDRPProtocolTechBasisCompiled020705.doc
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Basis Document for the Neutron Dose Reconstruction Project NEUTRON DOSE RECONSTRUCTION PROTOCOL Roger B. Falk, Joe M. Aldrich, Jerry Follmer, Nancy M. Daugherty, and Dr. Duane E. Hilmas Oak Ridge Institute of Science and Education and Dr. Phillip L. Chapman Department of Statistics, Colorado State University February 7, 2005 ORISE 05-0199 This document was produced under contract number DE-AC05-00OR22750 between the U.S. Department of Energy and Oak Ridge Associated Universities The authors of
Optimized Algorithms Boost Combustion Research
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Optimized Algorithms Boost Combustion Research Optimized Algorithms Boost Combustion Research Methane Flame Simulations Run 6x Faster on NERSC's Hopper Supercomputer November 25, 2014 Contact: Kathy Kincade, +1 510 495 2124, kkincade@lbl.gov Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from the Computational Research Division (CRD) at Lawrence Berkeley National
Technical Basis for PNNL Beryllium Inventory
Johnson, Michelle Lynn
2014-07-09
The Department of Energy (DOE) issued Title 10 of the Code of Federal Regulations Part 850, "Chronic Beryllium Disease Prevention Program" (the Beryllium Rule), in 1999 and required full compliance by no later than January 7, 2002. The Beryllium Rule requires the development of a baseline beryllium inventory of the locations of beryllium operations and other locations of potential beryllium contamination at DOE facilities. The baseline beryllium inventory is also required to identify workers exposed or potentially exposed to beryllium at those locations. Prior to DOE issuing 10 CFR 850, Pacific Northwest National Laboratory (PNNL) had documented the beryllium characterization and worker exposure potential for multiple facilities in compliance with DOE's 1997 Notice 440.1, "Interim Chronic Beryllium Disease." After DOE's issuance of 10 CFR 850, PNNL developed an implementation plan to be compliant by 2002. In 2014, an internal self-assessment (ITS #E-00748) of PNNL's Chronic Beryllium Disease Prevention Program (CBDPP) identified several deficiencies. One deficiency is that the technical basis for establishing the baseline beryllium inventory when the Beryllium Rule was implemented was either not documented or not retrievable. In addition, the beryllium inventory itself had not been adequately documented and maintained since PNNL established its own CBDPP, separate from the Hanford Site's program. This document reconstructs PNNL's baseline beryllium inventory as it would have existed when it achieved compliance with the Beryllium Rule in 2001 and provides the technical basis for the baseline beryllium inventory.
Berkeley Algorithms Help Researchers Understand Dark Energy
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Berkeley Algorithms Help Researchers Understand Dark Energy. November 24, 2014. Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov ...
GPU Accelerated Event Detection Algorithm
Energy Science and Technology Software Center (OSTI)
2011-05-25
Smart grid applications require new algorithmic approaches as well as parallel formulations. One of the critical components is the prediction of changes and the detection of anomalies within the power grid. The state-of-the-art algorithms are not suited to the demands of streaming data analysis: (i) event detection algorithms must scale with the size of the data; (ii) algorithms must not only handle the multi-dimensional nature of the data, but also model both spatial and temporal dependencies in the data, which, for the most part, are highly nonlinear; (iii) algorithms must operate in an online fashion on streaming data. The GAEDA code is a new online anomaly detection technique that takes into account the spatial, temporal, and multi-dimensional aspects of the data set. The basic idea behind the proposed approach is (a) to convert a multi-dimensional sequence into a univariate time series that captures the changes between successive windows extracted from the original sequence using singular value decomposition (SVD), and then (b) to apply known anomaly detection techniques for univariate time series. A key challenge for the proposed approach is to make the algorithm scalable to huge datasets by adopting techniques from perturbation theory and incremental SVD analysis. We used recent advances in tensor decomposition techniques, which reduce computational complexity, to monitor the change between successive windows and detect anomalies in the same manner as described above. We therefore propose to develop parallel solutions on many-core systems such as GPUs, because these algorithms involve many numerical operations and are highly data-parallelizable.
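The two-stage scheme described in this abstract (reduce each sliding window of the multivariate stream to a scalar via SVD, then run univariate change detection on that summary series) can be sketched as follows. This is an illustrative toy, not the GAEDA code; the power-iteration stand-in for SVD, the window size, and the threshold are assumptions of this example only.

```python
import math
import random

def top_singular_value(rows, iters=50, seed=0):
    # Estimate the largest singular value of the window matrix `rows`
    # by power iteration on A^T A (a stand-in for a full SVD).
    dim = len(rows[0])
    rng = random.Random(seed)
    v = [rng.random() for _ in range(dim)]
    for _ in range(iters):
        av = [sum(r[j] * v[j] for j in range(dim)) for r in rows]   # A v
        w = [sum(rows[i][j] * av[i] for i in range(len(rows)))
             for j in range(dim)]                                   # A^T (A v)
        norm = math.sqrt(sum(x * x for x in w)) or 1.0
        v = [x / norm for x in w]
    av = [sum(r[j] * v[j] for j in range(dim)) for r in rows]
    return math.sqrt(sum(x * x for x in av))

def detect_change_points(series, window, threshold=3.0):
    # Stage 1: univariate summary = dominant singular value per sliding window.
    summary = [top_singular_value(series[i:i + window])
               for i in range(len(series) - window + 1)]
    # Stage 2: flag jumps between successive windows that exceed
    # `threshold` standard deviations of all window-to-window changes.
    diffs = [summary[i] - summary[i - 1] for i in range(1, len(summary))]
    mean = sum(diffs) / len(diffs)
    std = math.sqrt(sum((d - mean) ** 2 for d in diffs) / len(diffs)) or 1.0
    return [i for i, d in enumerate(diffs, start=1)
            if abs(d - mean) > threshold * std]
```

A spike injected into an otherwise steady two-channel stream shows up as two flagged window indices: one where the spike enters the sliding window and one where it leaves.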
Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
SENSITIVE DOE-STD-1104-2009 May 2009 Superseding DOE-STD-1104-96 DOE STANDARD REVIEW AND APPROVAL OF NUCLEAR FACILITY SAFETY BASIS AND SAFETY DESIGN BASIS DOCUMENTS U.S. Department of Energy AREA SAFT Washington, DC 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. DOE-STD-1104-2009 ii Available on the Department of Energy Technical Standards web page at http://www.hss.energy.gov/nuclearsafety/ns/techstds/ DOE-STD-1104-2009 iii CONTENTS FOREWORD
Real-time algorithm for robust coincidence search
Petrovic, T.; Vencelj, M.; Lipoglavsek, M.; Gajevic, J.; Pelicon, P.
2012-10-20
In in-beam {gamma}-ray spectroscopy experiments, we often look for coincident detection events. Among every N events detected, coincidence search is naively of complexity O(N{sup 2}). When we limit the width of the coincidence search window, the complexity can be reduced to O(N), permitting the implementation of the algorithm in real-time measurements carried out indefinitely. We have built an algorithm to find simultaneous events between two detection channels. The algorithm was tested in an experiment where coincidences between X and {gamma} rays detected in two HPGe detectors were observed in the decay of {sup 61}Cu. Functioning of the algorithm was validated by comparing the calculated experimental branching ratio for EC decay with theoretical calculations for 3 selected {gamma}-ray energies in {sup 61}Cu decay. Our research opened a question on the validity of the adopted value of the total angular momentum of the 656 keV state (J{sup {pi}} = 1/2{sup -}) in {sup 61}Ni.
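The O(N) windowed search can be illustrated with a generic two-pointer sweep over two time-sorted channels. This is not the authors' implementation; the function name, the index-pair output, and the `window` parameter are invented for the sketch.

```python
def find_coincidences(t1, t2, window):
    # t1, t2: time-sorted event timestamps from two detection channels.
    # Returns index pairs (i, j) with |t1[i] - t2[j]| <= window.
    # The channel-2 pointer j only moves forward, so the sweep is O(N)
    # plus the number of coincidences actually reported.
    pairs = []
    j = 0
    for i, a in enumerate(t1):
        # Skip channel-2 events that are already too old to ever match.
        while j < len(t2) and t2[j] < a - window:
            j += 1
        # Collect channel-2 events inside [a - window, a + window].
        k = j
        while k < len(t2) and t2[k] <= a + window:
            pairs.append((i, k))
            k += 1
    return pairs

# Events near 1.0 s and 9.0 s coincide within a 0.5 s window; 5.0 s does not.
pairs = find_coincidences([1.0, 5.0, 9.0], [1.2, 4.0, 9.05], window=0.5)
```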
Theoretical Fusion Research | Princeton Plasma Physics Lab
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The fusion energy sciences mission of the Theory Department at the Princeton Plasma Physics Laboratory (PPPL) is to help provide the scientific foundations for establishing magnetic confinement as an attractive, technically
Jet measurements at D0 using a KT algorithm
V.Daniel Elvira
2002-10-03
D0 has implemented and calibrated a k{perpendicular} jet algorithm for the first time at a p{bar p} collider. We present two results based on 1992-1996 data which were recently published: the subjet multiplicity in quark and gluon jets and the central inclusive jet cross section. The measured ratio between subjet multiplicities in gluon and quark jets is consistent with theoretical predictions and previous experimental values. NLO pQCD predictions of the k{perpendicular} inclusive jet cross section agree with the D0 measurement, although marginally in the low p{sub T} range. We also present a preliminary measurement of thrust cross sections, which indicates the need to include terms beyond {alpha}{sub s}{sup 3} and resummation in the theoretical calculations.
Interim Basis for PCB Sampling and Analyses
BANNING, D.L.
2001-03-20
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
Interim Basis for PCB Sampling and Analyses
BANNING, D.L.
2001-01-18
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
A radial basis function Galerkin method for inhomogeneous nonlocal diffusion
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Lehoucq, Richard B.; Rowe, Stephen T.
2016-02-01
We introduce a discretization for a nonlocal diffusion problem using a localized basis of radial basis functions. The stiffness matrix entries are assembled by a special quadrature routine unique to the localized basis. Combining the quadrature method with the localized basis produces a well-conditioned, sparse, symmetric positive definite stiffness matrix. We demonstrate that both the continuum and discrete problems are well-posed and present numerical results for the convergence behavior of the radial basis function method. As a result, we explore approximating the solution to anisotropic differential equations by solving anisotropic nonlocal integral equations using the radial basis function method.
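For readers unfamiliar with radial basis functions, the toy below interpolates scattered 1-D data with a global Gaussian RBF basis. The paper's actual contribution (a localized basis, a special quadrature routine, and a Galerkin discretization of the nonlocal operator) goes well beyond this sketch, and the shape parameter here is an arbitrary choice.

```python
import math

def solve(A, b):
    # Dense Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_interpolate(centers, values, shape=1.0):
    # Gaussian radial basis phi(r) = exp(-(shape*r)^2): solve the symmetric
    # interpolation system A w = values for the weights, then return the
    # interpolant s(x) = sum_i w_i * phi(|x - x_i|).
    phi = lambda r: math.exp(-(shape * r) ** 2)
    A = [[phi(abs(xi - xj)) for xj in centers] for xi in centers]
    w = solve(A, values)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, centers))
```

By construction the interpolant reproduces the data at the centers; conditioning of the system is what motivates the localized bases discussed in the abstract.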
Probability-theoretic characteristics of solar batteries
Lidorenko, N.S.; Asharin, L.N.; Borisova, N.A.; Evdokimov, V.M.; Ryabikov, S.V.
1980-01-01
Results are reported for an investigation into the characteristics of solar batteries on the basis of probability theory with the photocells treated as current generators; methods for reducing solar-battery circuit losses are considered.
Adaptive protection algorithm and system
Hedrick, Paul (Pittsburgh, PA); Toms, Helen L. (Irwin, PA); Miller, Roger M. (Mars, PA)
2009-04-28
An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
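On a radial (tree-shaped) feeder, the ranking idea above might be sketched as follows. The data layout, names, and the 25% trip margin are hypothetical, invented only for illustration; the patented system's actual ranking and set-point rules are not given in this abstract.

```python
def assign_ranks_and_setpoints(tree, loads, root, margin=1.25):
    # tree:  breaker -> list of downstream child breakers (radial feeder)
    # loads: breaker -> local load current (A) served directly at that breaker
    # The current through a breaker is its local load plus everything below it.
    downstream = {}

    def total(b):
        if b not in downstream:
            downstream[b] = loads.get(b, 0.0) + sum(total(c) for c in tree.get(b, []))
        return downstream[b]

    total(root)
    # Rank breakers by the current they carry (largest first) and set each
    # trip point a fixed margin above its expected downstream current.
    ranked = sorted(downstream, key=downstream.get, reverse=True)
    setpoints = {b: margin * downstream[b] for b in downstream}
    return ranked, setpoints
```

With a main breaker feeding two laterals carrying 30 A and 50 A, the main ranks first and its trip point lands at 1.25 x 80 A = 100 A.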
Theoretical vibrations of carbon chains C3, C4, C5, C6, C7, C8, and C9
Kurtz, J.; Adamowicz, L. (Arizona Univ., Tucson)
1991-04-01
The MBPT(2) procedure with the 6-31G* basis set was used to study nearly linear carbon chains. The theoretical vibrational frequencies of the molecules C3 through C9 are presented and, for C3 through C6, compared to experimental stretching frequencies and those of their {sup 13}C/{sup 12}C isotopomers. Predictions for C7, C8, and C9 stretching frequencies are calculated by directly scaling the theoretical frequencies with factors derived from experimental-to-theoretical ratios known for the smaller molecules. 28 refs.
PARFUME Theory and Model basis Report
D.L. Knudson; G.K. Miller; D.A. Petti; J.T. Maki
2009-09-01
The success of gas reactors depends upon the safety and quality of the coated particle fuel. The fuel performance modeling code PARFUME simulates the mechanical, thermal and physico-chemical behavior of fuel particles during irradiation. This report documents the theory and material properties behind various capabilities of the code, which include: 1) various options for calculating CO production and fission product gas release, 2) an analytical solution for stresses in the coating layers that accounts for irradiation-induced creep and swelling of the pyrocarbon layers, 3) a thermal model that calculates a time-dependent temperature profile through a pebble bed sphere or a prismatic block core, as well as through the layers of each analyzed particle, 4) simulation of multi-dimensional particle behavior associated with cracking in the IPyC layer, partial debonding of the IPyC from the SiC, particle asphericity, and kernel migration (or amoeba effect), 5) two independent methods for determining particle failure probabilities, 6) a model for calculating release-to-birth (R/B) ratios of gaseous fission products that accounts for particle failures and uranium contamination in the fuel matrix, and 7) the evaluation of an accident condition, where a particle experiences a sudden change in temperature following a period of normal irradiation. The accident condition entails diffusion of fission products through the particle coating layers and through the fuel matrix to the coolant boundary. This document represents the initial version of the PARFUME Theory and Model Basis Report. More detailed descriptions will be provided in future revisions.
Authorization basis status report (miscellaneous TWRS facilities, tanks and components)
Stickney, R.G.
1998-04-29
This report presents the results of a systematic evaluation conducted to identify miscellaneous TWRS facilities, tanks, and components with potentially needed authorization basis upgrades. It provides the Authorization Basis upgrade plan for those miscellaneous TWRS facilities, tanks, and components identified.
Review and Approval of Nuclear Facility Safety Basis and Safety...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
104-2014, Review and Approval of Nuclear Facility Safety Basis and Safety Design Basis Documents. This Standard describes a framework and the criteria to be...
CRAD, Integrated Safety Basis and Engineering Design Review ...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
CRAD, Integrated Safety Basis and Engineering Design Review - August 20, 2014 (EA CRAD 31-4, Rev. 0)
Office of Nuclear Safety Basis and Facility Design
Broader source: Energy.gov [DOE]
The Office of Nuclear Safety Basis & Facility Design establishes safety basis and facility design requirements and expectations related to analysis and design of nuclear facilities to ensure protection of workers and the public from the hazards associated with nuclear operations.
CRAD, Engineering Design and Safety Basis- December 22, 2009
Broader source: Energy.gov [DOE]
Engineering Design and Safety Basis Inspection Criteria, Inspection Activities, and Lines of Inquiry (HSS CRAD 64-19, Rev. 0)
Nuclear Safety Basis Program Review Overview and Management Oversight
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
This SRP, Nuclear Safety Basis Program Review, consists of five volumes. It provides information to help strengthen the technical rigor of line management oversight and federal monitoring of DOE nuclear facilities. It provides a primer on the safety basis development and
Kinetically balanced Gaussian basis-set approach to relativistic Compton profiles of atoms
Jaiswal, Prerit; Shukla, Alok
2007-02-15
Atomic Compton profiles (CPs) are a very important property, providing information about the momentum distribution of atomic electrons. Therefore, for the CPs of heavy atoms, relativistic effects are expected to be important, warranting a relativistic treatment of the problem. In this paper, we present an efficient approach aimed at ab initio calculations of atomic CPs within a Dirac-Hartree-Fock (DHF) formalism, employing kinetically balanced Gaussian basis functions. The approach is used to compute the CPs of noble gases ranging from He to Rn, and the results have been compared to experimental and other theoretical data, wherever possible. The influence of the quality of the basis set on the calculated CPs has also been systematically investigated.
Hamiltonian Light-front Field Theory Within an AdS/QCD Basis
Vary, J.P.; Honkanen, H.; Li, Jun; Maris, P.; Brodsky, S.J.; Harindranath, A.; de Teramond, G.F.; Sternberg, P.; Ng, E.G.; Yang, C.
2009-12-16
Non-perturbative Hamiltonian light-front quantum field theory presents opportunities and challenges that bridge particle physics and nuclear physics. Fundamental theories, such as Quantum Chromodynamics (QCD) and Quantum Electrodynamics (QED) offer the promise of great predictive power spanning phenomena on all scales from the microscopic to cosmic scales, but new tools that do not rely exclusively on perturbation theory are required to make connection from one scale to the next. We outline recent theoretical and computational progress to build these bridges and provide illustrative results for nuclear structure and quantum field theory. As our framework we choose light-front gauge and a basis function representation with two-dimensional harmonic oscillator basis for transverse modes that corresponds with eigensolutions of the soft-wall AdS/QCD model obtained from light-front holography.
Convergent conductivity corrections to the Casimir force via exponential basis functions
Cui, Song; Soh, Yeng Chai
2010-12-15
A closed-form finite conductivity correction factor for the ideal Casimir force is proposed, based on exponential basis functions. Our method can facilitate experimental verifications of theories in the study of the Casimir force. A theoretical analysis is given to explain why our method is accurate at both large and small separation gaps. Numerical computations have been performed to confirm that our method is accurate in various experimental configurations. Our approach is widely applicable to various Casimir force interactions between metals and dielectrics. Our study can be extended to the study of the repulsive Casimir force as well.
Theoretical studies of chemical reaction dynamics
Schatz, G.C.
1993-12-01
This collaborative program with the Theoretical Chemistry Group at Argonne involves theoretical studies of gas phase chemical reactions and related energy transfer and photodissociation processes. Many of the reactions studied are of direct relevance to combustion; others are selected because they provide important examples of special dynamical processes or are of relevance to experimental measurements. Both classical trajectory and quantum reactive scattering methods are used for these studies, and the types of information determined range from thermal rate constants to state-to-state differential cross sections.
Theoretical aspects of light meson spectroscopy
Barnes, T.
1995-12-31
In this pedagogical review the authors discuss the theoretical understanding of light hadron spectroscopy in terms of QCD and the quark model. They begin with a summary of the known and surmised properties of QCD and confinement. Following this they review the nonrelativistic quark potential model for q{anti q} mesons and discuss the quarkonium spectrum and methods for identifying q{anti q} states. Finally, they review theoretical expectations for non-q{anti q} states (glueballs, hybrids and multiquark systems) and the status of experimental candidates for these states.
Experimental and Theoretical Investigation of Lubricant and Additive...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Experimental and Theoretical Investigation of Lubricant and Additive Effects on Engine Friction ...
Research in theoretical nuclear and neutrino physics. Final report...
Office of Scientific and Technical Information (OSTI)
Research in theoretical nuclear and neutrino physics. Final report. The ...
Operando Raman and Theoretical Vibration Spectroscopy of Non...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Operando Raman and Theoretical Vibration Spectroscopy of Non-PGM Catalysts. Presentation about ...
Toward Catalyst Design from Theoretical Calculations (464th Brookhaven...
Office of Scientific and Technical Information (OSTI)
Toward Catalyst Design from Theoretical Calculations (464th Brookhaven Lecture) ...
ITP Steel: Theoretical Minimum Energies to Produce Steel for...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 ...
Time Variant Floating Mean Counting Algorithm
Energy Science and Technology Software Center (OSTI)
1999-06-03
This software was written to test a time variant floating mean counting algorithm. The algorithm was developed by Westinghouse Savannah River Company and a provisional patent has been filed on the algorithm. The test software was developed to work with the Val Tech model IVB prototype version II count rate meter hardware. The test software was used to verify that the algorithm developed by WSRC could be correctly implemented with the vendor's hardware.
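For background only, a generic adaptive "floating mean" count-rate filter is sketched below. The WSRC algorithm is patented and its details are not given in this abstract, so everything here (the window cap, the Poisson jump test, the parameter names) is a hypothetical illustration of the general idea: an averaging window that grows while the rate is stable and resets when a statistically significant rate change is detected.

```python
def floating_mean_rate(counts, dt, max_window=10, jump_sigma=3.0):
    # Generic adaptive floating-mean count-rate filter (hypothetical; not
    # the patented WSRC algorithm). The averaging window grows while each
    # new count stays statistically consistent with the current mean
    # (Poisson statistics) and is reset when a jump is detected, so the
    # estimate is smooth at steady rates yet tracks step changes quickly.
    rates, window = [], []
    for c in counts:
        if window:
            mean = sum(window) / len(window)
            # Poisson-motivated test: a count many sigma from the mean
            # signals a genuine rate change rather than counting noise.
            if abs(c - mean) > jump_sigma * max(mean, 1.0) ** 0.5:
                window = []  # rate changed significantly: restart the average
        window.append(c)
        if len(window) > max_window:
            window.pop(0)
        rates.append(sum(window) / (len(window) * dt))
    return rates
```

On a step from 10 to 100 counts per interval, the filter resets at the step and reports the new rate immediately instead of averaging across the transition.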
Hanford External Dosimetry Technical Basis Manual PNL-MA-842
Rathbone, Bruce A.
2010-01-01
The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions, typically involving significant changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the
A new paradigm for the molecular basis of rubber elasticity
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Hanson, David E.; Barber, John L.
2015-02-19
The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models based on simple, highly symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and to others, such as the assumptions that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains that completely span the network along the strain axis emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain. Here we provide
Enterprise Assessments Review of the Delegation of Safety Basis Approval
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Authority for Hazard Category 1, 2, and 3 Nuclear Facilities - April 2016 | Department of Energy
ORISE: The Medical Basis for Radiation-Accident Preparedness: Medical
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Management (Published by REAC/TS). Proceedings of the Fifth International REAC/TS Symposium on the Medical Basis for Radiation-Accident Preparedness and the Biodosimetry Workshop. As part of its mission to provide continuing education for personnel responsible for treating radiation injuries, REAC/TS hosted the symposium and
Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Implementation of Operating Experience Report 2013-01 | Department of Energy. Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01, April 2013. To support the
Volume-preserving algorithm for secular relativistic dynamics of charged particles
Zhang, Ruili; Liu, Jian; Wang, Yulei; He, Yang; Qin, Hong; Sun, Yajuan
2015-04-15
Secular dynamics of relativistic charged particles has theoretical significance and a wide range of applications. However, conventional algorithms are not applicable to this problem due to the coherent accumulation of numerical errors. To overcome this difficulty, we develop a volume-preserving algorithm (VPA) with long-term accuracy and conservativeness via a systematic splitting method. Applied to the simulation of runaway electrons over time spans exceeding 10 orders of magnitude, the VPA generates accurate results and enables the discovery of new physics for secular runaway dynamics.
Structural and Functional Basis for Inhibition of Erythrocyte...
Office of Scientific and Technical Information (OSTI)
Target Plasmodium falciparum EBA-175 Citation Details In-Document Search Title: Structural and Functional Basis for Inhibition of Erythrocyte Invasion by Antibodies that Target ...
Protocol for Enhanced Evaluations of Beyond Design Basis Events...
Protocol for Enhanced Evaluations of Beyond Design Basis Events Supporting Implementation of Operating Experience Report 2013-01 Protocol for Enhanced Evaluations of Beyond Design ...
Assessing Beyond Design Basis Seismic Events and Implications...
Office of Environmental Management (EM)
Defense Nuclear Facilities Safety Board Topics Covered: Department of Energy Approach to Natural Phenomena Hazards Analysis and Design (Seismic) Design Basis and Beyond Design...
CRAD, Review of Safety Basis Development- January 31, 2013
Broader source: Energy.gov [DOE]
Review of Safety Basis Development for the Savannah River Site Salt Waste Processing Facility - Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-57, Rev. 0)
Technical Planning Basis - DOE Directives, Delegations, and Requiremen...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
2, Technical Planning Basis by David Freshwater. Functional areas: Defense Nuclear Facility Safety and Health Requirement, Safety and Security. The Guide assists DOE/NNSA field...
Structural and Functional Basis for Broad-spectrum Neutralization...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Structural and Functional Basis for Broad-spectrum Neutralization of Avian and Human ... globally that have little or no immunity, represents a grave threat to human health. ...
Final Technical Report "Multiscale Simulation Algorithms for Biochemical Systems"
Petzold, Linda R.
2012-10-25
Biochemical systems are inherently multiscale and stochastic. In microscopic systems formed by living cells, the small numbers of reactant molecules can result in dynamical behavior that is discrete and stochastic rather than continuous and deterministic. An analysis tool that respects these dynamical characteristics is the stochastic simulation algorithm (SSA, Gillespie, 1976), a numerical simulation procedure that is essentially exact for chemical systems that are spatially homogeneous or well stirred. Despite recent improvements, as a procedure that simulates every reaction event, the SSA is necessarily inefficient for most realistic problems. There are two main reasons for this, both arising from the multiscale nature of the underlying problem: (1) stiffness, i.e. the presence of multiple timescales, the fastest of which are stable; and (2) the need to include in the simulation both species that are present in relatively small quantities and should be modeled by a discrete stochastic process, and species that are present in larger quantities and are more efficiently modeled by a deterministic differential equation (or at some scale in between). This project has focused on the development of fast and adaptive algorithms, and the fundamental theory upon which they must be based, for the multiscale simulation of biochemical systems. Areas addressed by this project include: (1) theoretical and practical foundations for accelerated discrete stochastic simulation (tau-leaping); (2) dealing with stiffness (fast reactions) in an efficient and well-justified manner in discrete stochastic simulation; (3) development of adaptive multiscale algorithms for spatially homogeneous discrete stochastic simulation; and (4) development of high-performance SSA algorithms.
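The SSA named above is compact enough to sketch. Below is a minimal Python version of Gillespie's direct method; the birth-death system at the end is a hypothetical example, and all names are illustrative rather than taken from the report.

```python
import math
import random

def ssa(x0, rates, stoich, propensity, t_end, seed=1):
    """Gillespie direct-method SSA: simulate one trajectory until t_end.

    x0: initial species counts; stoich[j]: state change of reaction j;
    propensity(j, x, rates): propensity a_j of reaction j in state x.
    """
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    while t < t_end:
        a = [propensity(j, x, rates) for j in range(len(stoich))]
        a0 = sum(a)
        if a0 == 0.0:
            break                                  # no reaction can fire
        t += -math.log(1.0 - rng.random()) / a0    # exponential waiting time
        r, acc = rng.random() * a0, 0.0
        for j, aj in enumerate(a):                 # pick j with prob a_j / a0
            acc += aj
            if acc >= r:
                break
        for i, nu in enumerate(stoich[j]):
            x[i] += nu
    return x

# hypothetical birth-death example: 0 -> S at rate k1, S -> 0 at rate k2*S
def prop(j, x, k):
    return k[0] if j == 0 else k[1] * x[0]

final = ssa([0], [10.0, 1.0], [[+1], [-1]], prop, t_end=50.0)
```

Every iteration simulates exactly one reaction event, which is precisely why the SSA becomes expensive for stiff systems and why the tau-leaping acceleration mentioned above matters.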
Theoretical, Methodological, and Empirical Approaches to Cost Savings: A Compendium
M Weimar
1998-12-10
This publication summarizes and contains the original documentation for understanding why the U.S. Department of Energy's (DOE's) privatization approach provides cost savings, and the different approaches that could be used in calculating cost savings for the Tank Waste Remediation System (TWRS) Phase I contract. The initial section summarizes the approaches in the different papers. The appendices are the individual source papers, which have been reviewed by individuals outside of the Pacific Northwest National Laboratory and the TWRS Program. Appendix A provides a theoretical basis for, and an estimate of, the level of savings that can be obtained from a fixed-price contract with performance risk maintained by the contractor. Appendix B provides the methodology for determining cost savings when comparing a fixed-price contractor with a Management and Operations (M&O) contractor (cost-plus contractor). Appendix C summarizes the economic model used to calculate cost savings and provides hypothetical output from preliminary calculations. Appendix D provides the summary of the approach for the DOE-Richland Operations Office (RL) estimate of the M&O contractor to perform the same work as BNFL Inc. Appendix E contains information on cost growth and per-metric-ton-of-glass costs for high-level waste at two other DOE sites, West Valley and Savannah River. Appendix F addresses a risk allocation analysis of the BNFL proposal that indicates that the current approach is still better than the alternative.
Technical Cost Modeling - Life Cycle Analysis Basis for Program Focus |
Broader source: Energy.gov (indexed) [DOE]
Department of Energy DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation, lm001_das_2011_o.pdf (305.88 KB)
The double-beta decay: Theoretical challenges
Horoi, Mihai
2012-11-20
Neutrinoless double beta decay is a unique process that could reveal physics beyond the Standard Model of particle physics: if observed, it would prove that neutrinos are Majorana particles. In addition, it could provide information regarding the neutrino masses and their hierarchy, provided that reliable nuclear matrix elements can be obtained. Two-neutrino double beta decay is an associated process that is allowed by the Standard Model, and it has been observed for about ten nuclei. The present contribution gives a brief review of the theoretical challenges associated with these two processes, emphasizing the reliable calculation of the associated nuclear matrix elements.
Student's algorithm solves real-world problem
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Students learn how to use powerful computers to analyze, model, and solve real-world problems. April 3, 2012. Jordon Medlock of Albuquerque's Manzano High School won the 2012 Lab-sponsored Supercomputing Challenge by creating a computer algorithm that automates the process of
THEORETICAL STUDIES OF HADRONS AND NUCLEI
STEPHEN R COTANCH
2007-03-20
This report details final research results obtained during the nine-year period from June 1, 1997 through July 15, 2006. The research project, entitled "Theoretical Studies of Hadrons and Nuclei", was supported by grant DE-FG02-97ER41048 between North Carolina State University [NCSU] and the U.S. Department of Energy [DOE]. In compliance with grant requirements, the Principal Investigator [PI], Professor Stephen R. Cotanch, conducted a theoretical research program investigating hadrons and nuclei and devoted to this program 50% of his time during the academic year and 100% of his time in the summer. Highlights of new, significant research results are briefly summarized in the following three sections, corresponding to the respective sub-programs of this project (hadron structure, probing hadrons and hadron systems electromagnetically, and many-body studies). Recent progress is also discussed in a recent renewal/supplemental grant proposal submitted to DOE. Finally, full detailed descriptions of completed work can be found in the publications listed at the end of this report.
Hybrid Discrete - Continuum Algorithms for Stochastic Reaction...
Office of Scientific and Technical Information (OSTI)
for Stochastic Reaction Networks. Citation Details In-Document Search Title: Hybrid Discrete - Continuum Algorithms for Stochastic Reaction Networks. Abstract not provided. ...
Generation of attributes for learning algorithms
Hu, Yuh-Jyh; Kibler, D.
1996-12-31
Inductive algorithms rely strongly on their representational biases. Constructive induction can mitigate representational inadequacies. This paper introduces the notion of a relative gain measure and describes a new constructive induction algorithm (GALA) which is independent of the learning algorithm. Unlike most previous research on constructive induction, our methods are designed as a preprocessing step applied before standard machine learning algorithms. We present results that demonstrate the effectiveness of GALA on artificial and real domains for several learners: C4.5, CN2, perceptron, and backpropagation.
Solar Position Algorithm (SPA) - Energy Innovation Portal
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Thermal Solar Thermal Energy Analysis Energy Analysis Find More Like This Return to Search Solar Position Algorithm (SPA) National Renewable Energy Laboratory Contact NREL About ...
Java implementation of Class Association Rule algorithms
Energy Science and Technology Software Center (OSTI)
2007-08-30
Java implementation of three Class Association Rule mining algorithms: NETCAR, CARapriori, and clustering-based rule mining. NETCAR is a novel algorithm developed by Makio Tamura; it is discussed in a paper (UCRL-JRNL-232466-DRAFT) to be published in a peer-reviewed scientific journal. The software is used to extract combinations of genes relevant to a phenotype from a phylogenetic profile and a phenotype profile. The phylogenetic profile is represented by a binary matrix and a phenotype profile is represented by a binary vector. The present application of this software is in genome analysis; however, it could be applied more generally.
Microsoft Word - Final_SRS_FTF_WD_Basis_March_2012
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
DOE/SRS-WD-2012-001, Revision 0: Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site, March 2012 (Initial Issue).
Final Report: Sublinear Algorithms for In-situ and In-transit Data Analysis at Exascale.
Bennett, Janine Camille; Pinar, Ali; Seshadhri, C.; Thompson, David; Salloum, Maher; Bhagatwala, Ankit; Chen, Jacqueline H.
2015-09-01
Post-Moore's law scaling is creating a disruptive shift in simulation workflows, as saving the entirety of raw data to persistent storage becomes expensive. We are moving away from a post-process-centric data analysis paradigm towards a concurrent analysis framework, in which raw simulation data is processed as it is computed. Algorithms must adapt to machines with extreme concurrency, low communication bandwidth, and high memory latency, while operating within the time constraints prescribed by the simulation. Furthermore, input parameters are often data dependent and cannot always be prescribed. The study of sublinear algorithms is a recent development in theoretical computer science and discrete mathematics that has significant potential to provide solutions for these challenges. The approaches of sublinear algorithms address the fundamental mathematical problem of understanding global features of a data set using limited resources. These theoretical ideas align with practical challenges of in-situ and in-transit computation, where vast amounts of data must be processed under severe communication and memory constraints. This report details key advancements made in applying sublinear algorithms in-situ to identify features of interest and to enable adaptive workflows over the course of a three-year LDRD. Prior to this LDRD, there was no precedent in applying sublinear techniques to large-scale, physics-based simulations. This project has definitively demonstrated their efficacy at mitigating high performance computing challenges and highlighted the rich potential for follow-on research opportunities in this space.
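To give a flavor of sublinear-space computation of the kind discussed above, here is a classic textbook primitive, reservoir sampling, which maintains a uniform k-element sample of a stream while storing only k items. This is a generic illustration, not one of the project's actual algorithms.

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Uniform k-sample of an arbitrarily long stream in O(k) memory.

    After seeing item i (0-indexed), each item so far is retained with
    probability k / (i + 1), giving a uniform sample of the whole stream.
    """
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)        # fill the reservoir first
        else:
            j = rng.randint(0, i)         # replace a slot with prob k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(10_000), 5)
```

The memory footprint is independent of the stream length, which is the essential property for in-transit settings where data cannot be retained.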
CRAD, Review of Safety Basis Development- October 11, 2012
Broader source: Energy.gov [DOE]
Review of Safety Basis Development for the Y-12 National Security Complex Uranium Processing Facility Inspection Criteria, Approach, and Lines of Inquiry (HSS CRAD 45-55, Rev. 0)
SRS FTF Section 3116 Basis for Determination | Department of...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site. In accordance with NDAA Section 3116, certain waste from reprocessing of spent nuclear ...
General Engineer/Physical Scientist (Safety Basis Engineer/Scientist)
Broader source: Energy.gov [DOE]
A successful candidate in this position will serve as an authority in the safety basis functional area. The incumbent is responsible for managing, coordinating, and authorizing work in the context...
Basis for Section 3116 Determination for Salt Waste Disposal...
Office of Environmental Management (EM)
WD-2005-001, January 2006: Basis for Section 3116 Determination for Salt Waste Disposal at ...
Structural basis for ubiquitin-mediated antiviral signal activation...
Office of Scientific and Technical Information (OSTI)
Title: Structural basis for ubiquitin-mediated antiviral signal activation by RIG-I. Authors: Peisley, Alys; Wu, Bin; Xu, Hui; Chen, Zhijian J.; Hur, Sun (HHMI); ...
Structural basis for the antibody neutralization of Herpes simplex...
Office of Scientific and Technical Information (OSTI)
Title: Structural basis for the antibody neutralization of Herpes simplex virus. The gD-E317-Fab complex ...
Advanced Test Reactor Design Basis Reconstitution Project Issue Resolution Process
Steven D. Winter; Gregg L. Sharp; William E. Kohn; Richard T. McCracken
2007-05-01
The Advanced Test Reactor (ATR) Design Basis Reconstitution Program (DBRP) is a structured assessment and reconstitution of the design basis for the ATR. The DBRP is designed to establish and document the ties between the Documented Safety Analysis (DSA), the design basis, and actual system configurations. Where the DBRP assessment team cannot establish a link between these three major elements, a gap is identified. Resolutions to identified gaps represent configuration management and design basis recovery actions. The proposed paper discusses the process being applied to define, evaluate, report, and address gaps that are identified through the ATR DBRP. Design basis verification may be performed or required for a nuclear facility safety basis on various levels. The process is applicable to large-scale design basis reconstitution efforts, such as the ATR DBRP, or may be scaled for application on smaller projects. The concepts are applicable to long-term maintenance of a nuclear facility safety basis and recovery of degraded safety basis components. The ATR DBRP assessment team has observed numerous examples where a clear and accurate link between the DSA, design basis, and actual system configuration was not immediately identifiable in supporting documentation. As a result, a systematic approach to effectively document, prioritize, and evaluate each observation is required. The DBRP issue resolution process provides direction for consistent identification, documentation, categorization, and evaluation, and where applicable, entry into the determination process for a potential inadequacy in the safety analysis (PISA). The issue resolution process is a key element for execution of the DBRP. Application of the process facilitates collection, assessment, and reporting of issues identified by the DBRP team, and results in an organized database of safety basis gaps and prioritized corrective action planning and resolution. The DBRP team follows the ATR
Mathematical challenges from theoretical/computational chemistry
1995-12-31
The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.
Technical Basis Document for PFP Area Monitoring Dosimetry Program
COOPER, J.R.
2000-04-17
This document describes the phantom dosimetry used for the PFP Area Monitoring program and establishes the basis for the Plutonium Finishing Plant's (PFP) area monitoring dosimetry program in accordance with the following requirements: Title 10, Code of Federal Regulations (CFR), Part 835, "Occupational Radiation Protection," Part 835.403; Hanford Site Radiological Control Manual (HSRCM-1), Part 514; HNF-PRO-382, Area Dosimetry Program; and PNL-MA-842, Hanford External Dosimetry Technical Basis Manual.
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Wang, C. L.
2016-05-17
On the basis of the FluoroBancroft linear-algebraic method [S.B. Andersson, Opt. Exp. 16, 18714 (2008)], three highly-resolved positioning methods were proposed for wavelength-shifting fiber (WLSF) neutron detectors. Using a Gaussian or exponential-decay light-response function (LRF), the non-linear relation of photon-number profiles vs. x-pixels was linearized and neutron positions were determined. The proposed algorithms give an average 0.03-0.08 pixel position error, much smaller than that (0.29 pixel) from a traditional maximum photon algorithm (MPA). The new algorithms result in better detector uniformity, less position misassignment (ghosting), better spatial resolution, and an equivalent or better instrument resolution in powder diffraction than the MPA. Moreover, these characteristics will facilitate broader applications of WLSF detectors at time-of-flight neutron powder diffraction beamlines, including single-crystal diffraction and texture analysis.
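The linearization idea for the Gaussian LRF case can be illustrated in a few lines. This is a simplified sketch, not the FluoroBancroft method itself: the log of a Gaussian photon-number profile is quadratic in pixel position, so the peak position falls out of an ordinary polynomial fit with no non-linear solver. The profile numbers below are synthetic.

```python
import numpy as np

def gaussian_position(pixels, counts):
    """Event position from a Gaussian LRF via a log transform.

    ln n(x) = ln N - (x - x0)^2 / (2 sigma^2) is quadratic in x, so a
    degree-2 least-squares fit recovers the peak at x0 = -b / (2a).
    """
    a, b, _ = np.polyfit(np.asarray(pixels, float),
                         np.log(np.asarray(counts, float)), 2)
    return -b / (2.0 * a)

# synthetic photon-number profile centred at pixel 4.3 (hypothetical numbers)
x = np.arange(9)
n = 1000.0 * np.exp(-(x - 4.3) ** 2 / (2 * 1.2 ** 2))
x0 = gaussian_position(x, n)
```

With noiseless data the fit is exact; with Poisson-noisy counts a weighted fit would be the natural refinement.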
Petascale algorithms for reactor hydrodynamics.
Fischer, P.; Lottes, J.; Pointer, W. D.; Siegel, A.
2008-01-01
We describe recent algorithmic developments that have enabled large eddy simulations of reactor flows on up to P = 65,000 processors on the IBM BG/P at the Argonne Leadership Computing Facility. Petascale computing is expected to play a pivotal role in the design and analysis of next-generation nuclear reactors. Argonne's SHARP project is focused on advanced reactor simulation, with a current emphasis on modeling coupled neutronics and thermal-hydraulics (TH). The TH modeling comprises a hierarchy of computational fluid dynamics approaches ranging from detailed turbulence computations, using DNS (direct numerical simulation) and LES (large eddy simulation), to full core analysis based on RANS (Reynolds-averaged Navier-Stokes) and subchannel models. Our initial study is focused on LES of sodium-cooled fast reactor cores. The aim is to leverage petascale platforms at DOE's Leadership Computing Facilities (LCFs) to provide detailed information about heat transfer within the core and to provide baseline data for less expensive RANS and subchannel models.
Initial borehole acoustic televiewer data processing algorithms
Moore, T.K.
1988-06-01
With the development of a new digital televiewer, several algorithms have been developed in support of off-line data processing. This report describes the initial set of utilities developed to support data handling as well as data display. Functional descriptions, implementation details, and instructions for use of the seven algorithms are provided. 5 refs., 33 figs., 1 tab.
Computing single step operators of logic programming in radial basis function neural networks
Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong
2014-07-10
Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function T_P: I → I. Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
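For a definite program the single-step (immediate consequence) operator can be stated directly, and the fixed point the recurrent network is trained to reach is what plain iteration computes. A minimal Python sketch (clause encoding and names are illustrative, not from the paper):

```python
def tp(program, interpretation):
    """One application of the single-step operator T_P.

    program: list of (head, body) ground clauses, body a list of atoms.
    interpretation: set of ground atoms currently valued true.
    Returns the set of heads whose bodies are all true.
    """
    return {head for head, body in program
            if all(atom in interpretation for atom in body)}

def least_fixed_point(program):
    """Iterate T_P from the empty interpretation until it stabilizes."""
    i = set()
    while True:
        nxt = tp(program, i)
        if nxt == i:
            return i
        i = nxt

# hypothetical definite program:  a.   b :- a.   c :- a, b.
prog = [("a", []), ("b", ["a"]), ("c", ["a", "b"])]
fp = least_fixed_point(prog)  # → {"a", "b", "c"}
```

For definite programs T_P is monotone, so this iteration is guaranteed to terminate at the least model, which is the steady state the recurrent network approximates.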
Theoretical crystallography with the Advanced Visualization System
Younkin, C.R.; Thornton, E.N.; Nicholas, J.B.; Jones, D.R.; Hess, A.C.
1993-05-01
Space is an Application Visualization System (AVS) graphics module designed for crystallographic and molecular research. The program can handle molecules, two-dimensional periodic systems, and three-dimensional periodic systems, all referred to in the paper as models. Using several methods, the user can select atoms, groups of atoms, or entire molecules. Selections can be moved, copied, deleted, and merged. An important feature of Space is the crystallography component. The program allows the user to generate the unit cell from the asymmetric unit, manipulate the unit cell, and replicate it in three dimensions. Space includes the Buerger reduction algorithm which determines the asymmetric unit and the space group of highest symmetry of an input unit cell. Space also allows the user to display planes in the lattice based on Miller indices, and to cleave the crystal to expose the surface. The user can display important precalculated volumetric data in Space, such as electron densities and electrostatic surfaces. With a variety of methods, Space can compute the electrostatic potential of any chemical system based on input point charges.
Theoretical priors on modified growth parametrisations
Song, Yong-Seon; Hollenstein, Lukas; Caldera-Cabral, Gabriela; Koyama, Kazuya
2010-04-01
Next generation surveys will observe the large-scale structure of the Universe with unprecedented accuracy. This will enable us to test the relationships between matter over-densities, the curvature perturbation, and the Newtonian potential. Any large-distance modification of gravity or exotic nature of dark energy modifies these relationships as compared to those predicted in the standard smooth dark energy model based on General Relativity. In linear theory of structure growth such modifications are often parameterised by two functions of space and time that enter the relation of the curvature perturbation to, first, the matter over-density, and second, the Newtonian potential. We investigate the predictions for these functions in Brans-Dicke theory, clustering dark energy models, and interacting dark energy models. We find that each theory has a distinct path in the parameter space of modified growth. Understanding these theoretical priors on the parameterisations of modified growth is essential to reveal the nature of cosmic acceleration with the help of upcoming observations of structure formation.
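The two parameterising functions referred to above are commonly written as follows (this is the generic notation of the modified-growth literature; the paper's own symbols may differ):

```latex
k^2 \Phi = -4\pi G\, a^2\, \mu(a,k)\, \bar{\rho}\, \Delta ,
\qquad
\frac{\Phi}{\Psi} = \eta(a,k)
```

where $\Delta$ is the comoving matter over-density, $\Phi$ the curvature perturbation, and $\Psi$ the Newtonian potential; setting $\mu = \eta = 1$ recovers standard General Relativity with smooth dark energy.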
Theoretical Model for Nanoporous Carbon Supercapacitors
Sumpter, Bobby G; Meunier, Vincent; Huang, Jingsong
2008-01-01
The unprecedented anomalous increase in capacitance of nanoporous carbon supercapacitors at pore sizes smaller than 1 nm [Science 2006, 313, 1760] challenges the long-held presumption that pores smaller than the size of solvated electrolyte ions do not contribute to energy storage. We propose a heuristic model to replace the commonly used model for an electric double-layer capacitor (EDLC) on the basis of an electric double-cylinder capacitor (EDCC) for mesopores (2-50 nm pore size), which becomes an electric wire-in-cylinder capacitor (EWCC) for micropores (< 2 nm pore size). Our analysis of the available experimental data in the micropore regime is confirmed by first-principles density functional theory calculations and reveals significant curvature effects for carbon capacitance. The EDCC (and/or EWCC) model allows the supercapacitor properties to be correlated with pore size, specific surface area, Debye length, electrolyte concentration and dielectric constant, and solute ion size. The new model not only explains the experimental data, but also offers a practical direction for the optimization of the properties of carbon supercapacitors through experiments.
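The cylindrical geometries above lead to area-normalized capacitances of the following form (symbols here follow common usage in the authors' published model and should be checked against the paper):

```latex
% EDCC (mesopores, 2-50 nm):
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln\!\left( b / (b - d) \right)}
\qquad
% EWCC (micropores, < 2 nm):
\frac{C}{A} = \frac{\varepsilon_r \varepsilon_0}{b \,\ln\!\left( b / a_0 \right)}
```

where $b$ is the pore radius, $d$ the distance of the counter-ion centers from the pore wall, and $a_0$ the effective radius of the inner ion "wire". In the limit $b \gg d$ the EDCC expression reduces to the planar EDLC result $C/A = \varepsilon_r \varepsilon_0 / d$, which is why curvature effects only appear at small pore sizes.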
Development of design basis capacity for SNF project systems
Pajunen, A.L.
1996-02-27
An estimate of the design capacity for Spent Nuclear Fuel Project systems producing Multi-Canister Overpacks is developed, based on completing fuel processing in a two-year period. The design basis capacity relates the desired annual processing rate to the operating inefficiencies that may actually be experienced, projecting a design capacity for each system. The basis for estimating the operating efficiency factors is described. Estimates of the design basis capacity were limited to systems directly producing the Multi-Canister Overpack: Fuel Retrieval, K Basin SNF Vacuum Drying, Canister Storage Building support for Staging and Storage, and Hot Vacuum Conditioning. The capacities of other systems are assumed to be derived from these, such that the systems producing a Multi-Canister Overpack are not constrained.
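The roll-up described can be sketched numerically. The factor values below are invented for illustration and are not the report's estimates:

```python
# Hypothetical sketch: project a design-basis capacity from a desired annual
# rate and assumed operating-efficiency factors. All numbers are illustrative,
# not taken from the report.
def design_basis_capacity(desired_annual_rate, efficiency_factors):
    """Inflate the desired rate by the product of efficiency factors."""
    total_operating_efficiency = 1.0
    for f in efficiency_factors:
        total_operating_efficiency *= f
    return desired_annual_rate / total_operating_efficiency

# e.g. 200 MCOs/year desired, with an assumed 85% availability and 90% utilization
rate = design_basis_capacity(200, [0.85, 0.90])
print(rate)  # ~261 MCOs/year of design capacity needed
```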
Advanced Imaging Algorithms for Radiation Imaging Systems
Marleau, Peter
2015-10-01
The intent of the proposed work, in collaboration with the University of Michigan, is to develop the algorithms that will take the analysis from qualitative images to quantitative attributes of objects containing SNM. The first step toward achieving this is to develop an in-depth understanding of the intrinsic errors associated with the deconvolution and MLEM algorithms. A significant new effort will be undertaken to relate the image data to a posited three-dimensional model of geometric primitives that can be adjusted to obtain the best fit. In this way, parameters of the model such as sizes, shapes, and masses can be extracted for both radioactive and non-radioactive materials. This model-based algorithm will need the integrated response of a hypothesized configuration of material to be calculated many times. As such, both the MLEM and the model-based algorithms require significant increases in calculation speed in order to converge to solutions in practical amounts of time.
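The MLEM iteration mentioned above can be sketched generically for a linear imaging model y ≈ A·x (a toy system matrix, not the collaboration's code):

```python
import numpy as np

# Minimal MLEM (maximum-likelihood expectation-maximization) sketch for a
# linear imaging model y = A @ x. The system matrix and sizes are toy values.
def mlem(A, y, n_iter=2000):
    x = np.ones(A.shape[1])              # nonnegative initial estimate
    sens = A.sum(axis=0)                 # sensitivity image (column sums)
    for _ in range(n_iter):
        proj = A @ x                     # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
    return x

A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.3]])
x_true = np.array([2.0, 3.0])
x_est = mlem(A, A @ x_true)              # noise-free data: recovers x_true
```

The multiplicative update preserves nonnegativity, which is one reason MLEM is popular for emission imaging; its slow convergence is exactly the kind of cost the abstract's call for "significant increases in calculation speed" refers to.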
Tracking Algorithm for Multi-Dimensional Array Transposition
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Yun (Helen) He, SC2002: MPI and OpenMP Paradigms on Cluster of SMP Architectures: the Vacancy Tracking Algorithm for Multi-Dimensional Array Transposition ...
Advanced CHP Control Algorithms: Scope Specification
Katipamula, Srinivas; Brambley, Michael R.
2006-04-28
The primary objective of this multiyear project is to develop algorithms for combined heat and power systems that ensure optimal performance, increase reliability, and advance the goal of clean, efficient, reliable, and affordable next-generation energy systems.
Genetic algorithms at UC Davis/LLNL
Vemuri, V.R.
1993-12-31
A tutorial introduction to genetic algorithms is given. This brief tutorial should serve the purpose of introducing the subject to the novice. The tutorial is followed by a brief commentary on the term project reports that follow.
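As a flavor of the tutorial's subject, here is a toy genetic algorithm for the classic OneMax problem (maximize the number of 1-bits); the operators and parameters are generic illustrations, not taken from the tutorial:

```python
import random

# Toy genetic algorithm: truncation selection, one-point crossover, bit-flip
# mutation, applied to OneMax. All parameters are arbitrary illustrations.
def onemax_ga(n_bits=20, pop_size=40, generations=100, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=sum, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)         # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = children
    return max(sum(ind) for ind in pop)            # best fitness found

best = onemax_ga()  # converges to (or near) the optimum of 20
```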
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Slattery, Stuart R.
2015-12-02
In this study we analyze and extend mesh-free algorithms for three-dimensional data transfer problems in partitioned multiphysics simulations. We first provide a direct comparison between a mesh-based weighted residual method using the common-refinement scheme and two mesh-free algorithms leveraging compactly supported radial basis functions: one using a spline interpolation and one using a moving least square reconstruction. Through the comparison we assess both the conservation and accuracy of the data transfer obtained from each of the methods. We do so for a varying set of geometries with and without curvature and sharp features and for functions with and without smoothness and with varying gradients. Our results show that the mesh-based and mesh-free algorithms are complementary with cases where each was demonstrated to perform better than the other. We then focus on the mesh-free methods by developing a set of algorithms to parallelize them based on sparse linear algebra techniques. This includes a discussion of fast parallel radius searching in point clouds and restructuring the interpolation algorithms to leverage data structures and linear algebra services designed for large distributed computing environments. The scalability of our new algorithms is demonstrated on a leadership class computing facility using a set of basic scaling studies. Finally, these scaling studies show that for problems with reasonable load balance, our new algorithms for both spline interpolation and moving least square reconstruction demonstrate both strong and weak scalability using more than 100,000 MPI processes with billions of degrees of freedom in the data transfer operation.
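One of the mesh-free building blocks compared in the study, interpolation with compactly supported radial basis functions, can be sketched in a simplified 1D form (a Wendland C2 kernel; the study's parallel 3D implementation is far more involved):

```python
import numpy as np

# Simplified 1D sketch of compactly supported RBF interpolation for data
# transfer between non-matching point sets. Kernel: Wendland C2.
def wendland_c2(r, radius):
    q = np.clip(r / radius, 0.0, 1.0)
    return (1.0 - q) ** 4 * (4.0 * q + 1.0)   # zero beyond the support radius

def rbf_fit(src, values, radius):
    """Solve the (symmetric positive definite) interpolation system."""
    K = wendland_c2(np.abs(src[:, None] - src[None, :]), radius)
    return np.linalg.solve(K, values)

def rbf_eval(coeffs, src, tgt, radius):
    """Evaluate the interpolant at target points."""
    K = wendland_c2(np.abs(tgt[:, None] - src[None, :]), radius)
    return K @ coeffs

src = np.linspace(0.0, 1.0, 21)               # "source mesh" points
tgt = np.linspace(0.05, 0.95, 10)             # "target mesh" points
coeffs = rbf_fit(src, np.sin(2 * np.pi * src), radius=0.3)
transferred = rbf_eval(coeffs, src, tgt, radius=0.3)
```

The compact support is what makes the interpolation matrix sparse, which in turn is what the paper's sparse-linear-algebra parallelization exploits.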
Drainage Algorithm for Geospatial Knowledge
Energy Science and Technology Software Center (OSTI)
2006-08-15
The Pacific Northwest National Laboratory (PNNL) has developed a prototype stream extraction algorithm that semi-automatically extracts and characterizes streams using a variety of multisensor imagery and digital terrain elevation data (DTED). The system is currently optimized for three types of single-band imagery: radar, visible, and thermal. Method of solution: DRAGON (1) classifies pixels into clumps of water objects based on the classification of water pixels by spectral signatures and neighborhood relationships, (2) uses morphology operations (erosion and dilation) to separate out large lakes (or embayments), isolated lakes, ponds, wide rivers, and narrow rivers, and (3) translates the river objects into vector objects. In detail, the process can be broken down into the following steps. A. Water pixels are initially identified using the expected range and slope values (if an optional DEM file is available). B. Erode to the distance that defines a large water body and then dilate back. The resulting mask can be used to identify large lake and embayment objects, which are then removed from the image. Since this operation can be time consuming, it is only performed if a simple test (i.e., a large box containing only water pixels can be found somewhere in the image) indicates a large water body is present. C. All water pixels are "clumped" (in Imagine terminology, clumping connects touching pixels of a common classification), and clumps which do not contain pure water pixels (e.g., dark cloud shadows) are removed. D. The resulting true water pixels are clumped, and water objects which are too small (e.g., ponds) or isolated lakes (i.e., isolated objects with a small compactness ratio) are removed. Note that at this point lakes have been identified as a byproduct of the filtering process and can be output as vector layers if needed. E. At this point only river pixels
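Step B's erode-then-dilate filtering can be illustrated with a hand-rolled morphological opening on a toy binary mask (4-neighbour structuring element; this is a generic illustration, not DRAGON's actual code):

```python
import numpy as np

# Toy morphological opening (erode N times, then dilate N times) on a binary
# "water mask", illustrating how small/thin features are removed while large
# water bodies survive. 4-neighbour structuring element, hand-rolled in NumPy.
def erode(mask):
    m = np.pad(mask, 1)
    return (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
            & m[1:-1, :-2] & m[1:-1, 2:])

def dilate(mask):
    m = np.pad(mask, 1)
    return (m[1:-1, 1:-1] | m[:-2, 1:-1] | m[2:, 1:-1]
            | m[1:-1, :-2] | m[1:-1, 2:])

def opening(mask, iterations):
    out = mask.copy()
    for _ in range(iterations):
        out = erode(out)
    for _ in range(iterations):
        out = dilate(out)
    return out

water = np.zeros((12, 12), dtype=bool)
water[2:10, 2:10] = True        # a "lake": large enough to survive the opening
water[0, 11] = True             # an isolated noise pixel: removed by erosion
large_bodies = opening(water, iterations=2)
```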
Canister Storage Building (CSB) Design Basis Accident Analysis Documentation
CROWE, R.D.
1999-09-09
This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.
Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation
PIEPHO, M.G.
1999-10-20
This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.
Canister Storage Building (CSB) Design Basis Accident Analysis Documentation
CROWE, R.D.; PIEPHO, M.G.
2000-03-23
This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.
Canister storage building design basis accident analysis documentation
KOPELIC, S.D.
1999-02-25
This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.
CRAD, Safety Basis- Idaho MF-628 Drum Treatment Facility
Broader source: Energy.gov [DOE]
A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a May 2007 readiness assessment of the Safety Basis at the Advanced Mixed Waste Treatment Project.
Solar Power Tower Design Basis Document, Revision 0
ZAVOICO,ALEXIS B.
2001-07-01
This report contains the design basis for a generic molten-salt solar power tower. A solar power tower uses a field of tracking mirrors (heliostats) that redirect sunlight onto a centrally located receiver mounted on top of a tower, which absorbs the concentrated sunlight. Molten nitrate salt, pumped from a tank at ground level, absorbs the sunlight, heating it to 565 C. The heated salt flows back to ground level into another tank where it is stored, then is pumped through a steam generator to produce steam and make electricity. This report establishes a set of criteria upon which the next generation of solar power towers will be designed. The report contains detailed criteria for each of the major systems: Collector System, Receiver System, Thermal Storage System, Steam Generator System, Master Control System, and Electric Heat Tracing System. The Electric Power Generation System and Balance of Plant discussions are limited to interface requirements. This design basis builds on the extensive experience gained from the Solar Two project and includes potential design innovations that will improve reliability and lower technical risk. This design basis document is a living document; several areas still require trade studies and design analysis to fully complete the design basis. Project- and site-specific conditions and requirements will also resolve open To Be Determined issues.
CRAD, Safety Basis- Idaho Accelerated Retrieval Project Phase II
Broader source: Energy.gov [DOE]
A section of Appendix C to DOE G 226.1-2 "Federal Line Management Oversight of Department of Energy Nuclear Facilities." Consists of Criteria Review and Approach Documents (CRADs) used for a February 2006 Commencement of Operations assessment of the Safety Basis at the Idaho Accelerated Retrieval Project Phase II.
Theoretical & Experimental Studies of Elementary Particles
McFarland, Kevin
2012-10-04
High energy physics has been one of the signature research programs at the University of Rochester for over 60 years. The group has made leading contributions to experimental discoveries at accelerators and in cosmic rays and has played major roles in developing the theoretical framework that gives us our "standard model" of fundamental interactions today. This award from the Department of Energy funded a major portion of that research for more than 20 years. During this time, highlights of the supported work included the discovery of the top quark at the Fermilab Tevatron, the completion of a broad program of physics measurements that verified the electroweak unified theory, the measurement of three generations of neutrino flavor oscillations, and the first observation of a "Higgs-like" boson at the Large Hadron Collider. The work has resulted in more than 2000 publications over the period of the grant. The principal investigators supported on this grant have been recognized as leaders in the field of elementary particle physics by their peers through numerous awards and leadership positions. Most notable among them are the APS W.K.H. Panofsky Prize awarded to Arie Bodek in 2004, the J.J. Sakurai Prizes awarded to Susumu Okubo and C. Richard Hagen in 2005 and 2010, respectively, the Wigner Medal awarded to Susumu Okubo in 2006, and five principal investigators (Das, Demina, McFarland, Orr, Tipton) who received Department of Energy Outstanding Junior Investigator awards during the period of this grant. The University of Rochester Department of Physics and Astronomy, which houses the research group, provides primary salary support for the faculty and has waived most tuition costs for graduate students during the period of this grant. The group also benefits significantly from technical support and infrastructure available at the University, which supports the work. The research work of the group has provided educational opportunities for graduate students
Theoretical Studies of Hydrogen Storage Alloys.
Jonsson, Hannes
2012-03-22
Theoretical calculations were carried out to search for lightweight alloys that can be used to reversibly store hydrogen in mobile applications, such as automobiles. Our primary focus was on magnesium-based alloys. While MgH{sub 2} is in many respects a promising hydrogen storage material, two serious problems need to be solved in order to make it useful: (i) the binding energy of the hydrogen atoms in the hydride is too large, causing the release temperature to be too high, and (ii) the diffusion of hydrogen through the hydride is so slow that loading hydrogen into the metal takes much too long. In the first year of the project, we found that the addition of ca. 15% of aluminum decreases the hydrogen binding energy to the target value of 0.25 eV, which corresponds to release of 1 bar hydrogen gas at 100 degrees C. Also, the addition of ca. 15% of transition metal atoms, such as Ti or V, reduces the formation energy of interstitial H-atoms, making the diffusion of H-atoms through the hydride more than ten orders of magnitude faster at room temperature. In the second year of the project, several calculations of alloys of magnesium with various other transition metals were carried out, and systematic trends in stability, hydrogen binding energy, and diffusivity were established. Some calculations of ternary alloys and their hydrides were also carried out, for example of Mg{sub 6}AlTiH{sub 16}. It was found that the binding energy reduction due to the addition of aluminum and the increased diffusivity due to the addition of a transition metal are both effective at the same time. This material would in principle work well for hydrogen storage, but it is, unfortunately, unstable with respect to phase separation. A search was made for a ternary alloy of this type where both the alloy and the corresponding hydride are stable. Promising results were obtained by including Zn in the alloy.
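The quoted correspondence between a 0.25 eV binding energy and 1 bar hydrogen release near 100 degrees C can be checked with a back-of-the-envelope van 't Hoff estimate. The entropy value below is the standard entropy of H2 gas, an assumption not stated in the abstract, and the per-H-atom reading of 0.25 eV is likewise our assumption:

```python
# Van 't Hoff estimate of the 1-bar release temperature: at equilibrium,
# dG = dH - T*dS = 0, so T = dH/dS. Assumes dH ~ 0.25 eV per H atom
# (i.e. ~0.5 eV per H2) and dS ~ entropy of H2 gas, ~130 J/(mol K).
E_CHARGE = 1.602e-19        # J per eV
AVOGADRO = 6.022e23         # 1/mol
EV_PER_H_TO_J_PER_MOL_H2 = 2 * E_CHARGE * AVOGADRO   # two H atoms per H2

dH = 0.25 * EV_PER_H_TO_J_PER_MOL_H2   # desorption enthalpy, J/mol H2
dS = 130.0                              # J/(mol K), standard entropy of H2 gas
T = dH / dS                             # equilibrium temperature at 1 bar, K
print(round(T - 273.15))                # roughly 100 degrees C, as quoted
```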
Theoretical Description of the Fission Process
Witold Nazarewicz
2009-10-25
Advanced theoretical methods and high-performance computers may finally unlock the secrets of nuclear fission, a fundamental nuclear decay that is of great relevance to society. In this work, we studied the phenomenon of spontaneous fission using symmetry-unrestricted nuclear density functional theory (DFT). Our results show that many observed properties of fissioning nuclei can be explained in terms of pathways in multidimensional collective space corresponding to different geometries of fission products. From the calculated collective potential and collective mass, we estimated spontaneous fission half-lives, and good agreement with experimental data was found. We also predicted a new phenomenon of trimodal spontaneous fission for some transfermium isotopes. Our calculations demonstrate that fission barriers of excited superheavy nuclei vary rapidly with particle number, pointing to the importance of shell effects even at large excitation energies. The results are consistent with recent experiments where superheavy elements were created by bombarding an actinide target with 48-calcium; yet even at high excitation energies, sizable fission barriers remained. Not only does this reveal clues about the conditions for creating new elements, it also provides a wider context for understanding other types of fission. Understanding of the fission process is crucial for many areas of science and technology. Fission governs the existence of many transuranium elements, including the predicted long-lived superheavy species. In nuclear astrophysics, fission influences the formation of heavy elements in the final stages of the r-process in a very high neutron density environment. Fission applications are numerous. Improved understanding of the fission process will enable scientists to enhance the safety and reliability of the nation's nuclear stockpile and nuclear reactors. The deployment of a fleet of safe and efficient advanced reactors, which will also minimize radiotoxic
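Half-life estimates from a collective potential and collective mass, as described above, are typically obtained semiclassically; a standard WKB form (notation illustrative, not necessarily the authors' exact conventions) is:

```latex
% Semiclassical spontaneous-fission half-life along a collective path L:
% n = number of barrier assaults per unit time, E_0 = collective ground-state energy
T_{1/2} = \frac{\ln 2}{n\,P},
\qquad
P = \bigl[\,1 + e^{2S(L)}\,\bigr]^{-1},
\qquad
S(L) = \int_{s_{\mathrm{in}}}^{s_{\mathrm{out}}}
\sqrt{\frac{2}{\hbar^{2}}\,M_{\mathrm{eff}}(s)\,\bigl[V(s)-E_{0}\bigr]}\;ds ,
```

where the action integral S(L) runs between the classical turning points, and V and M_eff are the collective potential and effective mass along the path.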
Solar Position Algorithm for Solar Radiation Applications (Revised...
Office of Scientific and Technical Information (OSTI)
Parallel Algorithms and Patterns (Technical Report) | SciTech...
Office of Scientific and Technical Information (OSTI)
Parallel Algorithms and Patterns. Authors: Robey, Robert W., Los Alamos ...
Efficient algorithm for generating spectra using line-by-line...
Office of Scientific and Technical Information (OSTI)
Efficient algorithm for generating spectra ... Subject: 74 ATOMIC AND MOLECULAR PHYSICS; 70 PLASMA PHYSICS AND FUSION; ALGORITHMS; ...
Robust Algorithm for Computing Statistical Stark Broadening of...
Office of Scientific and Technical Information (OSTI)
Robust Algorithm for Computing Statistical ... Language: English. Subject: 70 PLASMA PHYSICS AND FUSION; ACCURACY; ALGORITHMS; ...
New Design Methods and Algorithms for Multi-component Distillation...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
New Design Methods and Algorithms for Multi-component Distillation Processes. multicomponent.pdf (517.32 KB) ...
New Algorithm Enables Faster Simulations of Ultrafast Processes
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
New Algorithm Enables Faster Simulations of Ultrafast Processes ... Academy of Sciences, have developed a new real-time time-dependent density function ...
Efficient Theoretical Screening of Solid Sorbents for CO2 Capture
Office of Scientific and Technical Information (OSTI)
Journal Article: Efficient Theoretical Screening of Solid Sorbents for CO2 Capture Applications. By combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates
EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY...
Office of Scientific and Technical Information (OSTI)
EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY UNDER RESERVOIR CONDITIONS ...
Improvements of Nuclear Data and Its Uncertainties by Theoretical...
Office of Scientific and Technical Information (OSTI)
Theoretical analysis of uranium-doped thorium dioxide: Introduction...
Office of Scientific and Technical Information (OSTI)
Theoretical analysis of uranium-doped thorium dioxide: Introduction of a thoria force field with explicit polarization ...
Fraction of Theoretical Specific Energy Achieved at Battery Pack...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Fraction of Theoretical Specific Energy Achieved at Battery Pack Level Is Very Sensitive ... factors in determining the fraction of battery material specific energy captured at pack ...
Improvements of Nuclear Data and Its Uncertainties by Theoretical...
Office of Scientific and Technical Information (OSTI)
Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville), ...
Theoretical/Computational Tools for Energy-Relevant Catalysis...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Essential to these efforts will be the development of novel approaches not only in theoretical chemistry and materials science (BES), but also in computational science and applied ...
OSTIblog Articles in the theoretical physics Topic | OSTI, US...
Office of Scientific and Technical Information (OSTI)
The Remarkable Legacy of Kenneth Geddes Wilson, by Kathy Chambers ... Laureate Kenneth Geddes Wilson (1936-2013) forever changed how we think about physics. ...
Catalysis by Design - Theoretical and Experimental Studies of...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Catalysis by Design - Theoretical and Experimental Studies of Model Catalysts for Lean NOx Traps - Microstructural Studies of Real World and Model Catalysts ...
First-Principles Theoretical Studies of Hydrogen Interaction...
Office of Scientific and Technical Information (OSTI)
First-Principles Theoretical Studies of Hydrogen Interaction with Ultrathin Mg and Mg-based Alloy Films ...
Industrial ecology: A basis for sustainable relations and cooperation
Blades, K.
1996-07-19
The Commission for Environmental Cooperation (CEC) seeks to address, in a cooperative manner, the environmental issues affecting the North American region and understand the linkages between environment and economy. Broadly, the goal of the CEC can be thought of as an attempt to achieve a sustainable economy concomitantly with continued economic, cultural, and technological evolution. The emerging field of industrial ecology provides a useful means for balancing the environmental and economic objectives of NAFTA. As NAFTA stimulates economic cooperation and growth, we must collectively develop mechanisms that enhance the environmental quality of the region. LLNL's effort in industrial ecology provides the scientific basis and innovative use of technology to reconcile environmental and economic concerns. Nevertheless, these are not issues which can be resolved by a single institution. Efficient use of the linkages established by NAFTA is necessary to nurture our regional partnership, which forms the basis for a sustainable environment, economy, and relationship.
Basis for NGNP Reactor Design Down-Selection
L.E. Demick
2011-11-01
The purpose of this paper is to identify the extent of technology development, design, and licensing maturity anticipated to be required to credibly identify differences that could make a practical technical choice between the prismatic and pebble bed reactor designs. This paper does not address a business decision based on the economics, business model, and resulting business case, since these will vary with the reactor application. The selection of the type of reactor, the module ratings, the number of modules, the configuration of the balance of plant, and other design selections will be made on the basis of optimizing the business case for the application. These are not decisions that can be made on a generic basis.
Design-Load Basis for LANL Structures, Systems, and Components
I. Cuesta
2004-09-01
This document supports the recommendations in the Los Alamos National Laboratory (LANL) Engineering Standard Manual (ESM), Chapter 5 (Structural), providing the basis for the loads, analysis procedures, and codes to be used in the ESM. It also provides the justification for excluding certain loads from design consideration, and evidence that the design basis loads are appropriate and consistent with the graded approach required by the Department of Energy (DOE) Code of Federal Regulations on Nuclear Safety Management, 10 CFR Part 830. This document focuses on (1) the primary and secondary natural phenomena hazards listed in DOE-G-420.1-2, Appendix C, (2) additional loads not related to natural phenomena hazards, and (3) the design loads on structures during construction.
Resilient Control Systems Practical Metrics Basis for Defining Mission Impact
Craig G. Rieger
2014-08-01
"Resilience" describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability in mitigating these risks. A terminology and metrics basis is provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish "proper operation" and "impact." A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and
Modeling the Molecular Basis of Parkinson's Disease | Argonne Leadership
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Computing Facility. Alpha-synuclein pentamer constructed with 4 ns molecular dynamics (MD) conformers after equilibration on the membrane with MD. PI Name: Igor Tsigelny. PI Email: itsigeln@ucsd.edu. Institution: University of California-San Diego/SDSC. Allocation Program: INCITE. Allocation Hours at ALCF: 1.2 Million
Interim Safety Basis for Fuel Supply Shutdown Facility
BENECKE, M.W.
2000-09-07
This ISB, in conjunction with the IOSR, provides the required basis for interim operation or restrictions on interim operations and administrative controls for the facility until a SAR is prepared in accordance with the new requirements or the facility is shut down. It is concluded that the risks associated with the current and anticipated modes of the facility, uranium disposition, clean up, and transition activities required for permanent closure, are within risk guidelines.
Design Basis Threat | National Nuclear Security Administration | (NNSA)
National Nuclear Security Administration (NNSA)
Design Basis Threat. NNSA has taken aggressive action to improve the security of its nuclear weapons material (often referred to as special nuclear material, or SNM) and nuclear weapons in its custody.
Online Monitoring Technical Basis and Analysis Framework for Large Power Transformers; Interim Report for FY 2012
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
The Light Water Reactor Sustainability Program is a research, development, and deployment program sponsored by the U.S. Department of Energy Office of Nuclear Energy. The program is operated in collaboration with the Electric Power Research Institute's (EPRI's)
Auxiliary basis expansions for large-scale electronic structure calculations
Jung, Yousung; Sodt, Alexander; Gill, Peter W.M.; Head-Gordon, Martin
2005-04-04
One way to reduce the computational cost of electronic structure calculations is to employ auxiliary basis expansions to approximate four-center integrals in terms of two- and three-center integrals, usually using the variationally optimal Coulomb metric to determine the expansion coefficients. However, the long-range decay behavior of the auxiliary basis expansion coefficients has not been characterized. We find that this decay can be surprisingly slow. Numerical experiments on linear alkanes and a toy model both show that the decay can be as slow as 1/r in the distance between the auxiliary function and the fitted charge distribution. The Coulomb-metric fitting equations also involve divergent matrix elements for extended systems treated with periodic boundary conditions. An attenuated Coulomb metric that is short-range can eliminate these oddities without substantially degrading calculated relative energies. The sparsity of the fit coefficients is assessed on simple hydrocarbon molecules, and shows quite early onset of linear growth in the number of significant coefficients with system size using the attenuated Coulomb metric. This means it is possible to design linear-scaling auxiliary basis methods without additional approximations to treat large systems.
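The metric-fitting idea can be illustrated with a toy 1D fit, where a simple overlap metric on a grid stands in for the Coulomb or attenuated Coulomb metrics discussed in the abstract (all functions and parameters below are invented for illustration):

```python
import numpy as np

# Toy auxiliary-basis fit: expand a target "charge distribution" in a small
# auxiliary set by solving the metric equations J c = b. A grid-based overlap
# metric stands in for the Coulomb metric; everything is illustrative.
x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]

def gauss(center, alpha):
    return np.exp(-alpha * (x - center) ** 2)

aux = [gauss(c, 1.0) for c in np.linspace(-4.0, 4.0, 9)]   # auxiliary set
target = gauss(0.3, 0.8)                                    # distribution to fit

J = np.array([[np.sum(p * q) * dx for q in aux] for p in aux])   # metric matrix
b = np.array([np.sum(p * target) * dx for p in aux])             # projections
c = np.linalg.solve(J, b)                                        # fit coefficients

fit = sum(ci * p for ci, p in zip(c, aux))
err = np.sqrt(np.sum((fit - target) ** 2) * dx)   # residual of the expansion
```

With the overlap metric this is simply a least-squares projection; the paper's point is about how the analogous coefficients behave (decay, divergences, sparsity) when the metric is the Coulomb interaction or its attenuated variant.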
Theoretical minimum energies to produce steel for selected conditions
Fruehan, R. J.; Fortini, O.; Paxton, H. W.; Brindle, R.
2000-03-01
An ITP study has determined the theoretical minimum energy requirements for producing steel from ore, scrap, and direct reduced iron. Dr. Richard Fruehan's report, Theoretical Minimum Energies to Produce Steel for Selected Conditions, provides insight into the potential energy savings (and associated reductions in carbon dioxide emissions) for ironmaking, steelmaking, and rolling processes.
Factorization using the quadratic sieve algorithm
Davis, J.A.; Holdridge, D.B.
1983-01-01
Since the cryptosecurity of the RSA two key cryptoalgorithm is no greater than the difficulty of factoring the modulus (product of two secret primes), a code that implements the Quadratic Sieve factorization algorithm on the CRAY I computer has been developed at the Sandia National Laboratories to determine as sharply as possible the current state-of-the-art in factoring. Because all viable attacks on RSA thus far proposed are equivalent to factorization of the modulus, sharper bounds on the computational difficulty of factoring permit improved estimates for the size of RSA parameters needed for given levels of cryptosecurity. Analysis of the Quadratic Sieve indicates that it may be faster than any previously published general purpose algorithm for factoring large integers. The high speed of the CRAY I coupled with the capability of the CRAY to pipeline certain vectorized operations make this algorithm (and code) the front runner in current factoring techniques.
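The congruence-of-squares principle underlying the quadratic sieve can be illustrated with a brute-force search: find x and y with x² ≡ y² (mod n) and x ≢ ±y, so that gcd(x − y, n) is a nontrivial factor. The sieve's contribution is finding such congruences vastly faster; the function name and test modulus below are illustrative:

```python
from math import gcd, isqrt

def congruent_squares_factor(n):
    """Find a nontrivial factor of n via a congruence of squares:
    x^2 = y^2 (mod n) with x != +/-y (mod n) gives gcd(x - y, n) nontrivial.
    Brute-force search; the quadratic sieve constructs such pairs by
    combining many smooth quadratic residues instead."""
    for x in range(isqrt(n) + 1, n):
        y2 = x * x % n
        y = isqrt(y2)
        if y * y == y2 and x % n != y % n and (x + y) % n != 0:
            f = gcd(x - y, n)
            if 1 < f < n:
                return f
    return None

print(congruent_squares_factor(8051))  # 8051 = 83 * 97 → prints 83
```

Here x = 90 already works: 90² mod 8051 = 49 = 7², and gcd(90 − 7, 8051) = 83.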
Nonlinear Global Optimization Using Curdling Algorithm
Energy Science and Technology Software Center (OSTI)
1996-03-01
An algorithm for performing curdling optimization, a derivative-free, grid-refinement approach to nonlinear optimization, was developed and implemented in software. This approach overcomes a number of deficiencies in existing approaches. Most notably, it finds extremal regions rather than only single extremal points. The program is interactive and collects information on control parameters and constraints using menus. For up to four dimensions, function convergence is displayed graphically. Because the algorithm does not compute derivatives, gradients, or vectors, it is numerically stable. It can find all the roots of a polynomial in one pass. It is an inherently parallel algorithm. Constraints are handled as being initially fuzzy, but become tighter with each iteration.
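The grid-refinement idea can be sketched in a few lines. This toy one-dimensional version tracks only a single region rather than the extremal regions the actual curdling algorithm finds, and all names and parameters are illustrative:

```python
import numpy as np

def grid_refine_minimize(f, lo, hi, rounds=6, pts=41, keep=0.2):
    """Derivative-free 1-D grid-refinement minimizer (a loose sketch of
    the curdling idea: score a grid, keep the best-scoring region, and
    refine it each pass). Returns (x_best, f_best)."""
    for _ in range(rounds):
        xs = np.linspace(lo, hi, pts)
        ys = np.array([f(x) for x in xs])
        i = int(np.argmin(ys))
        half = keep * (hi - lo) / 2          # shrink the search window
        lo, hi = xs[i] - half, xs[i] + half  # ...around the best grid point
    return xs[i], ys[i]

x, y = grid_refine_minimize(lambda t: (t - 1.7) ** 2 + 0.5, -10.0, 10.0)
print(x, y)  # converges near the minimum at t = 1.7, f = 0.5
```

Because only function values are compared, no derivative information is ever needed, which is the numerical-stability point the abstract makes.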
Bootstrap performance profiles in stochastic algorithms assessment
Costa, Lino; Espírito Santo, Isabel A.C.P.; Oliveira, Pedro
2015-03-10
Optimization with stochastic algorithms has become a relevant research field. Due to its stochastic nature, its assessment is not straightforward and involves integrating accuracy and precision. Performance profiles for the mean do not show the trade-off between accuracy and precision, and parametric stochastic profiles require strong distributional assumptions and are limited to the mean performance for a large number of runs. In this work, bootstrap performance profiles are used to compare stochastic algorithms for different statistics. This technique allows the estimation of the sampling distribution of almost any statistic even with small samples. Multiple comparison profiles are presented for more than two algorithms. The advantages and drawbacks of each assessment methodology are discussed.
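The core bootstrap step, estimating the sampling distribution of a statistic by resampling run results with replacement, can be sketched as follows. The data, statistic, and sample sizes are illustrative stand-ins, not the paper's experiments:

```python
import numpy as np

# Minimal bootstrap sketch: approximate the sampling distribution of a
# statistic (here the median of final objective values from repeated runs
# of a stochastic optimizer) by resampling the runs with replacement.

rng = np.random.default_rng(1)
runs = rng.normal(loc=0.3, scale=0.05, size=30)   # stand-in for 30 run results

B = 2000
boot = np.array([np.median(rng.choice(runs, size=runs.size, replace=True))
                 for _ in range(B)])

lo, hi = np.percentile(boot, [2.5, 97.5])         # 95% bootstrap interval
print(lo < np.median(runs) < hi)
```

Repeating this for each algorithm and each statistic of interest (mean, median, quantiles) gives the distributional input from which bootstrap performance profiles are built, without parametric assumptions.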
Parallelism of the SANDstorm hash algorithm.
Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree
2009-09-01
Mainstream cryptographic hashing algorithms are not parallelizable. This limits their speed, and they are unable to take advantage of the current trend toward multi-core platforms. Being limited in speed limits their usefulness as an authentication mechanism in secure communications. Sandia researchers have created a new cryptographic hashing algorithm, SANDstorm, which was specifically designed to take advantage of multi-core processing and to be parallelizable on a wide range of platforms. This report describes a late-start LDRD effort to verify the parallelizability claims of the SANDstorm designers. We have shown, with operating code and bench testing, that the SANDstorm algorithm may be trivially parallelized on a wide range of hardware platforms. Implementations using OpenMP demonstrate a linear speedup with multiple cores. We have also shown significant performance gains with optimized C code and the use of assembly instructions to exploit particular platform capabilities.
Speckle imaging algorithms for planetary imaging
Johansson, E.
1994-11-15
I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
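The recombination step described, estimating the Fourier magnitude and phase separately and then combining and inverse transforming, can be illustrated on a toy image. Here both components come from a known image so the reconstruction is exact; the bispectrum-based phase retrieval itself is not shown:

```python
import numpy as np

# Sketch of the final recombination step in speckle imaging: given a
# separately estimated Fourier magnitude and Fourier phase, combine them
# and inverse-transform to recover the image.

img = np.zeros((32, 32))
img[12:20, 10:22] = 1.0                 # toy extended object

F = np.fft.fft2(img)
magnitude = np.abs(F)                   # as if estimated from the power spectrum
phase = np.angle(F)                     # as if recovered from the bispectrum

recon = np.fft.ifft2(magnitude * np.exp(1j * phase)).real
print(np.allclose(recon, img))          # exact here, since nothing was estimated
```

In real data, the magnitude comes from averaged power spectra and the phase from the average bispectrum, so the reconstruction is only as good as those estimates.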
Graph algorithms in the titan toolkit.
McLendon, William Clarence, III; Wylie, Brian Neil
2009-10-01
Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.
Berkeley Algorithms Help Researchers Understand Dark Energy
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
November 24, 2014. Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov. Scientists believe that dark energy, the mysterious force that is accelerating cosmic expansion, makes up about 70 percent of the mass and energy of the universe. But because they don't know what it is, they cannot observe it directly. To unlock the mystery of dark energy and its influence on the universe, researchers
AUDIT REPORT Follow-up on Nuclear Safety: Safety Basis and Quality...
Broader source: Energy.gov (indexed) [DOE]
INFORMATION: Audit Report: "Follow-up on Nuclear Safety: Safety Basis and Quality Assurance at the Los Alamos National Laboratory" ...
5th International REAC/TS Symposium: The Medical Basis for Radiation...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
5th International REAC/TS Symposium: The Medical Basis for Radiation Accident Preparedness, Sept. 27-29, 2011 ...
CRAD, Safety Basis Upgrade Review (DOE-STD-3009-2014) - May 15...
Office of Environmental Management (EM)
Provides objectives, criteria, and approaches for establishing and maintaining the safety basis at nuclear facilities.
Improved algorithm for processing grating-based phase contrast interferometry image sets
Marathe, Shashidhara; Assoufid, Lahsen; Xiao, Xianghui; Ham, Kyungmin; Johnson, Warren W.; Butler, Leslie G.
2014-01-15
Grating-based X-ray and neutron interferometry tomography using phase-stepping methods generates large data sets. An improved algorithm is presented for solving for the parameters used to calculate transmission, differential phase contrast, and dark-field images. The method takes advantage of the vectorization inherent in high-level languages such as Mathematica and MATLAB and can solve a 16 × 1k × 1k data set in less than a second. In addition, the algorithm can function with partial data sets. This is demonstrated by processing a 16-step grating data set with partial use of the original data, chosen without any restriction. We have also calculated the reduced chi-square for the fit, noted the effect of grating support structural elements upon the differential phase contrast image, and explored expanded basis set representations to mitigate that impact.
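The per-pixel parameter solve in phase-stepping interferometry is linear and therefore vectorizes naturally, which is the property the abstract exploits: each pixel's stepping curve is I_k = a + b·cos(θ_k + φ), linear in (a, b·cosφ, b·sinφ). A sketch with simulated noise-free stepping curves (all sizes and names are illustrative assumptions):

```python
import numpy as np

# Vectorized phase-stepping retrieval: one least-squares solve over all
# pixels at once yields transmission a, fringe visibility b (dark-field),
# and differential phase phi.

steps = 16
theta = 2 * np.pi * np.arange(steps) / steps
rng = np.random.default_rng(2)

ny, nx = 4, 5
a = 1.0 + rng.random((ny, nx))                      # ground-truth transmission
b = 0.3 * a                                         # ground-truth visibility
phi = rng.uniform(-np.pi, np.pi, (ny, nx))          # ground-truth phase

# Simulated stepping curves, shape (steps, ny*nx).
I = (a[None] + b[None] * np.cos(theta[:, None, None] + phi[None])).reshape(steps, -1)

# I_k = a + (b cos phi) cos(theta_k) + (b sin phi) (-sin(theta_k))
M = np.column_stack([np.ones(steps), np.cos(theta), -np.sin(theta)])
coef, *_ = np.linalg.lstsq(M, I, rcond=None)        # rows: a, b cos phi, b sin phi

a_fit = coef[0].reshape(ny, nx)
b_fit = np.hypot(coef[1], coef[2]).reshape(ny, nx)
phi_fit = np.arctan2(coef[2], coef[1]).reshape(ny, nx)

print(np.allclose(a_fit, a), np.allclose(b_fit, b), np.allclose(phi_fit, phi))
```

Because the design matrix M depends only on the step angles, missing steps are handled by simply dropping the corresponding rows of M and I, which is how a partial data set can still be fit.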
RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT
KOZLOWSKI, S.D.
2007-05-30
This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA) and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.
Optimizing SRF Gun Cavity Profiles in a Genetic Algorithm Framework
Alicia Hofler, Pavel Evtushenko, Frank Marhauser
2009-09-01
Automation of DC photoinjector designs using a genetic algorithm (GA) based optimization is an accepted practice in accelerator physics. Allowing the gun cavity field profile shape to be varied can extend the utility of this optimization methodology to superconducting and normal conducting radio frequency (SRF/RF) gun based injectors. Finding optimal field and cavity geometry configurations can provide guidance for cavity design choices and verify existing designs. We have considered two approaches for varying the electric field profile. The first is to determine the optimal field profile shape that should be used independent of the cavity geometry, and the other is to vary the geometry of the gun cavity structure to produce an optimal field profile. The first method can provide a theoretical optimal and can illuminate where possible gains can be made in field shaping. The second method can produce more realistically achievable designs that can be compared to existing designs. In this paper, we discuss the design and implementation for these two methods for generating field profiles for SRF/RF guns in a GA based injector optimization scheme and provide preliminary results.
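The GA optimization loop described (selection, crossover, and mutation against a simulation-derived objective) can be sketched generically. The stand-in objective below replaces the beam-dynamics simulations of the actual framework, and every name and parameter is illustrative:

```python
import random

# Tiny genetic-algorithm sketch: each "individual" is a candidate
# field-profile parameter vector, scored by a stand-in objective.

random.seed(3)

def fitness(x):                       # stand-in objective: minimize this
    return sum((xi - 0.6) ** 2 for xi in x)

def mutate(x, rate=0.3, scale=0.1):   # perturb some genes with Gaussian noise
    return [xi + random.gauss(0, scale) if random.random() < rate else xi
            for xi in x]

def crossover(p, q):                  # uniform crossover of two parents
    return [random.choice(pair) for pair in zip(p, q)]

pop = [[random.random() for _ in range(4)] for _ in range(30)]
start = min(map(fitness, pop))

for _ in range(60):
    pop.sort(key=fitness)
    elite = pop[:10]                  # selection with elitism
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]

best = min(pop, key=fitness)
print(fitness(best) <= start)         # elitism guarantees no regression
```

In the injector setting, either the field-profile samples themselves or the cavity geometry parameters play the role of the gene vector, matching the two approaches the abstract compares.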
Guidance for Preparation of Basis for Interim Operation (BIO) Documents
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
DOE-STD-3011-2002, December 2002, superseding DOE-STD-3011-94, November 1994. DOE STANDARD: GUIDANCE FOR PREPARATION OF BASIS FOR INTERIM OPERATION (BIO) DOCUMENTS. U.S. Department of Energy, AREA SAFT, Washington, D.C. 20585. DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. NOT MEASUREMENT SENSITIVE. This document has been reproduced directly from the best available copy. Available to DOE and DOE contractors from ES&H Technical Information Services, U.S.
The Bender-Dunne basis operators as Hilbert space operators
Bunao, Joseph; Galapon, Eric A. E-mail: eric.galapon@upd.edu.ph
2014-02-15
The Bender-Dunne basis operators, $T_{-m,n}=2^{-n}\sum_{k=0}^{n}\binom{n}{k}\,q^{k}p^{-m}q^{n-k}$, where q and p are the position and momentum operators, respectively, are formal integral operators in position representation on the entire real line $\mathbb{R}$ for positive integers n and m. We show, by explicit construction of a dense domain, that the operators $T_{-m,n}$ are densely defined operators in the Hilbert space $L^{2}(\mathbb{R})$
Preparation of Safety Basis Documents for Transuranic (TRU) Waste Facilities
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
DOE-STD-5506-2007, April 2007. DOE STANDARD: Preparation of Safety Basis Documents for Transuranic (TRU) Waste Facilities. U.S. Department of Energy, Washington, D.C. 20585. AREA-SAFT. DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Available on the Department of Energy Technical Standards Program Web Site at Http://tis.eh.doe.gov/techstds/ Foreword: This Standard provides analytical assumptions and methods, as well as hazard controls
Gamma-ray spectral analysis algorithm library
Energy Science and Technology Software Center (OSTI)
2013-05-06
The routines of the Gauss Algorithms library are used to implement special purpose products that need to analyze gamma-ray spectra from Ge semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.
Gamma-ray Spectral Analysis Algorithm Library
Energy Science and Technology Software Center (OSTI)
1997-09-25
The routines of the Gauss Algorithm library are used to implement special purpose products that need to analyze gamma-ray spectra from Ge semiconductor detectors as a part of their function. These routines provide the ability to calibrate energy, calibrate peakwidth, search for peaks, search for regions, and fit the spectral data in a given region to locate gamma rays.
PDES. FIPS Standard Data Encryption Algorithm
Nessett, D.N.
1992-03-03
PDES performs the National Bureau of Standards FIPS Pub. 46 data encryption/decryption algorithm used for the cryptographic protection of computer data. The DES algorithm is designed to encipher and decipher blocks of data consisting of 64 bits under control of a 64-bit key. The key is generated in such a way that each of the 56 bits used directly by the algorithm is random, and the remaining 8 error-detecting bits are set to make the parity of each 8-bit byte of the key odd, i.e., there is an odd number of 1 bits in each 8-bit byte. Each member of a group of authorized users of encrypted computer data must have the key that was used to encipher the data in order to use it. Data can be recovered from cipher only by using exactly the same key used to encipher it, but with the schedule of addressing the key bits altered so that the deciphering process is the reverse of the enciphering process. A block of data to be enciphered is subjected to an initial permutation, then to a complex key-dependent computation, and finally to a permutation which is the inverse of the initial permutation. Two PDES routines are included; both perform the same calculation. One, identified as FDES.MAR, is designed to achieve speed in execution, while the other, identified as PDES.MAR, presents a clearer view of how the algorithm is executed.
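The odd-parity key convention described above is easy to demonstrate. This sketch handles only the parity bits, not the DES cipher itself; the low-order bit of each byte is taken as the parity bit, per the standard convention:

```python
# Each 8-bit byte of the 64-bit DES key carries 7 key bits plus one
# parity bit, set so the byte has an odd number of 1 bits.

def set_odd_parity(key: bytes) -> bytes:
    out = bytearray()
    for b in key:
        ones = bin(b >> 1).count("1")        # count the 7 key bits
        parity = 0 if ones % 2 == 1 else 1   # low bit makes the total odd
        out.append((b & 0xFE) | parity)
    return bytes(out)

def has_odd_parity(key: bytes) -> bool:
    return all(bin(b).count("1") % 2 == 1 for b in key)

key = set_odd_parity(bytes(range(8)))
print(has_odd_parity(key))  # every byte now has odd parity → True
```

The parity bits carry no key material; they exist purely so a corrupted key byte can be detected before use.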
Control algorithms for autonomous robot navigation
Jorgensen, C.C.
1985-09-20
This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.
Theoretical Studies of Low Frequency Instabilities in the Ionosphere. Final Report
Dimant, Y. S.
2003-08-20
The objective of the current project is to provide a theoretical basis for better understanding of numerous radar and rocket observations of density irregularities and related effects in the lower equatorial and high-latitude ionospheres. The research focused on: (1) continuing efforts to develop a theory of nonlinear saturation of the Farley-Buneman instability; (2) revision of the kinetic theory of the electron-thermal instability at low altitudes; (3) studying the effects of strong anomalous electron heating in the high-latitude electrojet; (4) analytical and numerical studies of the combined Farley-Buneman and ion-thermal instabilities in the E-region ionosphere; (5) studying the effect of dust charging in Polar Mesospheric Clouds.
Hanford External Dosimetry Technical Basis Manual PNL-MA-842
Rathbone, Bruce A.
2005-02-25
The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee which is chartered and chaired by DOE-RL and serves as means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database.
Cold Vacuum Drying facility design basis accident analysis documentation
CROWE, R.D.
2000-08-08
This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.
Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies
David E. Shropshire
2009-05-01
The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the "Advanced Fuel Cycle (AFC) Cost Basis" report (Shropshire, et al. 2007), the "AFCI Economic Analysis" report, and the "AFCI Economic Tools, Algorithms, and Methodologies Report." Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market (domestic and international) and impacts on AFCI facility deployment; uranium resource modeling to inform the front-end fuel cycle costs; facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities; cost tradeoffs to meet nuclear non-proliferation requirements; and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from
Semi-Implicit Reversible Algorithms for Rigid Body Rotational Dynamics
Nukala, Phani K; Shelton Jr, William Allison
2006-09-01
This paper presents two semi-implicit algorithms based on splitting methodology for rigid body rotational dynamics. The first algorithm is a variation of partitioned Runge-Kutta (PRK) methodology that can be formulated as a splitting method. The second algorithm is akin to a multiple time stepping scheme and is based on modified Crouch-Grossman (MCG) methodology, which can also be expressed as a splitting algorithm. These algorithms are second-order accurate and time-reversible; however, they are not Poisson integrators, i.e., non-symplectic. These algorithms conserve some of the first integrals of motion, but some others are not conserved; however, the fluctuations in these invariants are bounded over exponentially long time intervals. These algorithms exhibit excellent long-term behavior because of their reversibility property and their (approximate) Poisson structure preserving property. The numerical results indicate that the proposed algorithms exhibit superior performance compared to some of the currently well known algorithms such as the Simo-Wong algorithm, Newmark algorithm, discrete Moser-Veselov algorithm, Lewis-Simo algorithm, and the LIEMID[EA] algorithm.
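The splitting-method flavor of integrator the abstract describes can be illustrated with the simplest member of the family: a time-reversible leapfrog (Strang) splitting for a harmonic oscillator. The actual PRK/MCG schemes act on rigid-body rotations, so this is only a structural analogy, and all parameters are illustrative:

```python
# Leapfrog (Strang) splitting for H = p^2/2 + q^2/2: the flow is split
# into "kick" (momentum update) and "drift" (position update) substeps.
# The scheme is second-order and time-reversible, and its energy error
# stays bounded rather than drifting — the long-term behavior the
# abstract attributes to reversible splitting integrators.

def leapfrog(q, p, dt, steps):
    for _ in range(steps):
        p -= 0.5 * dt * q       # half kick (force = -q)
        q += dt * p             # full drift
        p -= 0.5 * dt * q       # half kick
    return q, p

q, p = leapfrog(1.0, 0.0, 0.01, 1000)
energy = 0.5 * (q * q + p * p)
print(abs(energy - 0.5) < 1e-4)   # energy error bounded, not drifting
```

Running the integrator forward and then backward with -dt returns the initial state to round-off accuracy, which is the reversibility property exploited by the paper's schemes.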
Theoretical investigations of two Si-based spintronic materials...
Office of Scientific and Technical Information (OSTI)
Two Si-based spintronic materials, a Mn-Si digital ferromagnetic heterostructure (delta-layer of Mn doped ...
Neutron-antineutron oscillations: Theoretical status and experimental prospects
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Phillips, D. G.; Snow, W. M.; Babu, K.; Banerjee, S.; Baxter, D. V.; Berezhiani, Z.; Bergevin, M.; Bhattacharya, S.; Brooijmans, G.; Castellanos, L.; et al
2016-02-01
This paper summarizes the relevant theoretical developments, outlines some ideas to improve experimental searches for free neutron-antineutron oscillations, and suggests avenues for future improvement in the experimental sensitivity.
Theoretical/best practice energy use in metalcasting operations
Schifo, J. F.; Radia, J. T.
2004-05-01
This study determined the theoretical minimum energy requirements of melting processes for all ferrous and nonferrous engineering alloys. The report also details the best-practice energy consumption for the industry.
EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY...
Office of Scientific and Technical Information (OSTI)
EXPERIMENTAL AND THEORETICAL DETERMINATION OF HEAVY OIL VISCOSITY UNDER RESERVOIR CONDITIONS FINAL PROGRESS REPORT PERIOD: OCT 1999-MAY 2003 CONTRACT NUMBER: DE-FG26-99FT40615 ...
Neutron-Antineutron Oscillations: Theoretical Status and Experimental Prospects
Phillips, D. G.; Snow, W. M.; Babu, K.; Banerjee, S.; Baxter, D. V.; Berezhiani, Z.; Bergevin, M.; Bhattacharya, S.; Brooijmans, G.; Castellanos, L.; et al.,
2014-10-04
This paper summarizes the relevant theoretical developments, outlines some ideas to improve experimental searches for free neutron-antineutron oscillations, and suggests avenues for future improvement in the experimental sensitivity.
Final Report. Research in Theoretical High Energy Physics
Greensite, Jeffrey P.; Golterman, Maarten F.L.
2015-04-30
Grant-supported research in theoretical high-energy physics conducted in the period 1992-2015 is briefly described, and a full listing of published articles resulting from those research activities is supplied.