National Library of Energy BETA

Sample records for analytical methodologies model

  1. Analytical Modeling | Open Energy Information

    Open Energy Info (EERE)

    & Analytical Models Website - University of Washington, Department of Economic Business and Geography...

  2. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    SciTech Connect (OSTI)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  3. Model and Analytic Processes for Export License Assessments

    SciTech Connect (OSTI)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to

  4. Modeling of Diesel Exhaust Systems: A methodology to better simulate...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Modeling of Diesel Exhaust Systems: A methodology to better simulate soot reactivity. Discussed ...

  5. Analytic models of plausible gravitational lens potentials

    SciTech Connect (OSTI)

    Baltz, Edward A.; Marshall, Phil; Oguri, Masamune, E-mail: eabaltz@slac.stanford.edu, E-mail: pjm@physics.ucsb.edu, E-mail: oguri@slac.stanford.edu [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, PO Box 20450, MS29, Stanford, CA 94309 (United States)]

    2009-01-15

    Gravitational lenses on galaxy scales are plausibly modelled as having ellipsoidal symmetry and a universal dark matter density profile, with a Sersic profile to describe the distribution of baryonic matter. Predicting all lensing effects requires knowledge of the total lens potential: in this work we give analytic forms for that of the above hybrid model. Emphasising that complex lens potentials can be constructed from simpler components in linear combination, we provide a recipe for attaining elliptical symmetry in either projected mass or lens potential. We also provide analytic formulae for the lens potentials of Sersic profiles for integer and half-integer index. We then present formulae describing the gravitational lensing effects due to smoothly-truncated universal density profiles in the cold dark matter model. For our isolated haloes the density profile falls off as radius to the minus fifth or seventh power beyond the tidal radius, functional forms that allow all orders of lens potential derivatives to be calculated analytically, while ensuring a non-divergent total mass. We show how the observables predicted by this profile differ from those of the original infinite-mass NFW profile. Expressions for the gravitational flexion are highlighted. We show how decreasing the tidal radius allows stripped haloes to be modelled, providing a framework for a fuller investigation of dark matter substructure in galaxies and clusters. Finally, we remark on the need for finite mass halo profiles when doing cosmological ray-tracing simulations, and the need for readily-calculable higher order derivatives of the lens potential when studying catastrophes in strong lenses.
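    A minimal sketch of the kind of smoothly truncated profile the abstract describes: an NFW-like density multiplied by a truncation kernel whose index n sets the outer slope. The parametrization and normalization conventions below are assumptions, but they reproduce the quoted falloffs.

      % Truncated NFW-like density (illustrative form): the kernel
      % (r_t^2/(r^2+r_t^2))^n steepens the outer slope from r^{-3} to
      % r^{-(3+2n)}, i.e. r^{-5} for n = 1 and r^{-7} for n = 2,
      % guaranteeing a finite total mass.
      \rho(r) = \frac{\rho_s}{(r/r_s)\,(1+r/r_s)^2}
                \left(\frac{r_t^2}{r^2+r_t^2}\right)^{n}, \qquad n = 1, 2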

  6. ANALYTIC MODELING OF THE MORETON WAVE KINEMATICS

    SciTech Connect (OSTI)

    Temmer, M.; Veronig, A. M.

    2009-09-10

    The issue of whether Moreton waves are flare-ignited or coronal mass ejection (CME)-driven, or a combination of both, is still a matter of debate. We develop an analytical model describing the evolution of a large-amplitude coronal wave emitted by the expansion of a circular source surface in order to mimic the evolution of a Moreton wave. The model results are confronted with observations of a strong Moreton wave observed in association with the X3.8/3B flare/CME event from 2005 January 17. Using different input parameters for the expansion of the source region, either derived from the real CME observations (assuming that the upward moving CME drives the wave), or synthetically generated scenarios (expanding flare region, lateral expansion of the CME flanks), we calculate the kinematics of the associated Moreton wave signature. We determine the model input parameters that best fit the observed Moreton wave kinematics. Using the measured kinematics of the upward moving CME as the model input, we are not able to reproduce the observed Moreton wave kinematics. The observations of the Moreton wave can be reproduced only by applying a strong and impulsive acceleration for the source region expansion acting in a piston mechanism scenario. Based on these results we propose that the expansion of the flaring region or the lateral expansion of the CME flanks is more likely the driver of the Moreton wave than the upward moving CME front.

  7. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

    SciTech Connect (OSTI)

    Deru, M.

    2011-02-01

    This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

  8. Theoretical description of methodology in PHASER (Probabilistic hybrid analytical system evaluation routine)

    SciTech Connect (OSTI)

    Cooper, J.A.

    1996-01-01

    Probabilistic safety analyses (PSAs) frequently depend on fault tree and event tree models, using probabilities of 'events' for inputs. Uncertainty or variability is sometimes included by assuming that the input probabilities vary independently and according to an assumed stochastic probability distribution. Evidence is accumulating that this methodology does not apply well to some situations, most significantly when the inputs contain a degree of subjectivity or are dependent. This report documents the current status of an investigation into methods for effectively incorporating subjectivity and dependence in PSAs and into the possibility of incorporating inputs that are partly subjective and partly stochastic. One important byproduct of this investigation was a computer routine that combines conventional PSA techniques with newly developed subjective techniques in a 'hybrid' (subjective and conventional PSA) program. This program (PHASER) and a user's manual are now available for beta use.
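    For orientation, the conventional point-estimate fault-tree calculation that PHASER generalizes can be sketched as follows (a minimal example assuming independent basic events with hypothetical probabilities; it does not reproduce PHASER's hybrid subjective treatment).

      # Minimal conventional fault-tree evaluation with independent basic events.
      # Event names and probabilities are hypothetical, for illustration only.

      def and_gate(probs):
          """Probability that all independent input events occur."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(probs):
          """Probability that at least one independent input event occurs."""
          p = 1.0
          for q in probs:
              p *= (1.0 - q)
          return 1.0 - p

      pump_fails, valve_sticks, operator_error = 1e-3, 5e-4, 1e-2
      top_event = or_gate([and_gate([pump_fails, valve_sticks]), operator_error])
      print(f"Top-event probability: {top_event:.3e}")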

  9. A simple Analytical Model to Study and Control Azimuthal Instabilities...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    A simple Analytical Model to Study and Control Azimuthal Instabilities in Annular Combustion Chambers Authors: Parmentier, J-F., Salas, P., Wolf, P., Staffelbach, G., Nicoud, F., ...

  10. Analytical Modeling At Lightning Dock Geothermal Area (Brook...

    Open Energy Info (EERE)

    Lightning Dock Geothermal Area (Brook, Et Al., 1978) Exploration Activity: Analytical Modeling At Lightning Dock...

  11. An Analytical Elastic Plastic Contact Model with Strain Hardening...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: An Analytical Elastic Plastic Contact Model with Strain Hardening and Frictional Effects for Normal and Oblique Impacts...

  12. High-Throughput Analytical Model to Evaluate Materials for Temperature...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    High-Throughput Analytical Model to Evaluate Materials for Temperature Swing Adsorption Processes. Julian P. Sculley, Wolfgang M. Verdegaal, Weigang...

  13. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    ALVIN, KENNETH F.; OBERKAMPF, WILLIAM L.; RUTHERFORD, BRIAN M.; DIEGERT, KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  14. Evaluation Methodology for Advance Heat Exchanger Concepts Using Analytical Hierarchy Process

    SciTech Connect (OSTI)

    Piyush Sabharwall; Eung Soo Kim

    2012-07-01

    The primary purpose of this study is to aid in the development and selection of the secondary/process heat exchanger (SHX) for power production and process heat applications for a Next Generation Nuclear Reactor (NGNR). Potential options for use as an SHX, such as shell-and-tube and printed circuit heat exchangers, are explored. A shell-and-tube (helically coiled) heat exchanger is recommended for a demonstration reactor because of its reliability while the reactor design is being further developed. The basic setup for the selection of the SHX has been established with evaluation goals, alternatives, and criteria. This study describes how these criteria and the alternatives are evaluated using the analytical hierarchy process (AHP).
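    A minimal sketch of the AHP weighting step, using a hypothetical 3 x 3 pairwise-comparison matrix (the criteria and judgment values are illustrative, not the study's actual inputs).

      import numpy as np

      # Hypothetical pairwise comparisons for three criteria
      # (e.g., reliability, cost, compactness); values are illustrative only.
      A = np.array([[1.0,   3.0, 5.0],
                    [1/3.0, 1.0, 2.0],
                    [1/5.0, 0.5, 1.0]])

      # Criterion weights = principal eigenvector of A, normalized to sum to 1.
      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()

      # Consistency check (Saaty random index RI = 0.58 for a 3 x 3 matrix).
      lam_max = eigvals.real[k]
      ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
      print("weights:", w.round(3), "consistency ratio:", round(ci / 0.58, 3))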

  15. Analytical model for fast-shock ignition

    SciTech Connect (OSTI)

    Ghasemi, S. A.; Farahbod, A. H.; Sobhanian, S.

    2014-07-15

    A model and its improvements are introduced for a recently proposed approach to inertial confinement fusion, called fast-shock ignition (FSI). The analysis is based upon the gain models of fast ignition, shock ignition, and considerations for the fast electrons' penetration into the pre-compressed fuel to examine the formation of an effective central hot spot. Calculations of fast electrons' penetration into the dense fuel show that if the initial electron kinetic energy is of the order of ∼4.5 MeV, the electrons effectively reach the central part of the fuel. To evaluate more realistically the performance of the FSI approach, we have used the quasi-two-temperature electron energy distribution function of Strozzi (2012) and the fast ignitor energy formula of Bellei (2013), which are consistent with 3D PIC simulations for different values of fast ignitor laser wavelength and coupling efficiency. The general advantage of fast-shock ignition in comparison with shock ignition can be estimated to be better than 1.3, and it is seen that the best results can be obtained for a fuel mass around 1.5 mg, a fast ignitor laser wavelength of ∼0.3 micron, and a shock ignitor energy weight factor of about 0.25.

  16. Palm: Easing the Burden of Analytical Performance Modeling

    SciTech Connect (OSTI)

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  17. Analytic models of supercomputer performance in multiprogramming environments

    SciTech Connect (OSTI)

    Menasce, D.A.; Almeida, V.A.F.

    1989-01-01

    Supercomputers run multiprogrammed time-sharing operating systems, so their facilities can be shared by many local and remote users. Therefore, it is important to be able to assess the performance of supercomputers in multiprogrammed environments. Analytic models based on Queueing Networks (QNs) and Stochastic Petri Nets (SPNs) are used in this paper with two purposes: to evaluate the performance of supercomputers in multiprogrammed environments, and to compare, performance-wise, conventional supercomputer architectures with a novel architecture proposed here. It is shown, with the aid of the analytic models, that the proposed architecture is preferable performance-wise over the existing conventional supercomputer architectures. A three-level workload characterization model for supercomputers is presented. Input data for the numerical examples discussed here are extracted from the well-known Los Alamos benchmark, and the results are validated by simulation.
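    As an illustration of the analytic queueing-network techniques the abstract refers to, the sketch below applies exact Mean Value Analysis to a closed, single-class, product-form network (the three stations and their parameters are hypothetical, not the paper's supercomputer model).

      # Exact Mean Value Analysis (MVA) for a closed, single-class,
      # product-form queueing network.  Station parameters are hypothetical.

      def mva(visits, service_times, n_jobs):
          """Return system throughput and mean queue lengths for n_jobs customers."""
          K = len(visits)
          q = [0.0] * K                      # mean queue lengths with 0 customers
          for n in range(1, n_jobs + 1):
              # Residence times via the arrival theorem.
              r = [visits[k] * service_times[k] * (1.0 + q[k]) for k in range(K)]
              x = n / sum(r)                 # system throughput with n customers
              q = [x * r[k] for k in range(K)]
          return x, q

      # Hypothetical 3-station model: CPU, memory subsystem, I/O.
      throughput, queues = mva(visits=[1.0, 2.0, 0.5],
                               service_times=[0.02, 0.005, 0.03],
                               n_jobs=10)
      print(f"throughput = {throughput:.2f} jobs/s, "
            f"queue lengths = {[round(v, 2) for v in queues]}")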

  18. A semi-analytic model of magnetized liner inertial fusion

    SciTech Connect (OSTI)

    McBride, Ryan D.; Slutz, Stephen A.

    2015-05-15

    Presented is a semi-analytic model of magnetized liner inertial fusion (MagLIF). This model accounts for several key aspects of MagLIF, including: (1) preheat of the fuel (optionally via laser absorption); (2) pulsed-power-driven liner implosion; (3) liner compressibility with an analytic equation of state, artificial viscosity, internal magnetic pressure, and ohmic heating; (4) adiabatic compression and heating of the fuel; (5) radiative losses and fuel opacity; (6) magnetic flux compression with Nernst thermoelectric losses; (7) magnetized electron and ion thermal conduction losses; (8) end losses; (9) enhanced losses due to prescribed dopant concentrations and contaminant mix; (10) deuterium-deuterium and deuterium-tritium primary fusion reactions for arbitrary deuterium to tritium fuel ratios; and (11) magnetized α-particle fuel heating. We show that this simplified model, with its transparent and accessible physics, can be used to reproduce the general 1D behavior presented throughout the original MagLIF paper [S. A. Slutz et al., Phys. Plasmas 17, 056303 (2010)]. We also discuss some important physics insights gained as a result of developing this model, such as the dependence of radiative loss rates on the radial fraction of the fuel that is preheated.

  19. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    SciTech Connect (OSTI)

    Not Available

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  20. "Violent Intent Modeling: Incorporating Cultural Knowledge into the Analytical Process

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Nibbs, Faith G.

    2007-08-24

    While culture has a significant effect on the appropriate interpretation of textual data, the incorporation of cultural considerations into data transformations has not been systematic. Recognizing that the successful prevention of terrorist activities could hinge on the knowledge of the subcultures, Anthropologist and DHS intern Faith Nibbs has been addressing the need to incorporate cultural knowledge into the analytical process. In this Brown Bag she will present how cultural ideology is being used to understand how the rhetoric of group leaders influences the likelihood of their constituents to engage in violent or radicalized behavior, and how violent intent modeling can benefit from understanding that process.

  1. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect (OSTI)

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.
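    The flavor of the analytical probabilistic approach can be conveyed by a simplified derived-distribution result (a sketch assuming exponentially distributed rainfall-event volumes, a constant runoff coefficient, and lumped depression storage; not the book's full formulation).

      % Rainfall-event volume V exponential with mean 1/\zeta; runoff
      % coefficient \phi; depression storage S_d; event runoff
      % V_r = \phi (V - S_d) for V > S_d.  The runoff exceedance
      % probability then follows in closed form:
      P(V_r > v_r) = \exp\!\left[-\zeta\left(S_d + \frac{v_r}{\phi}\right)\right]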

  2. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    SciTech Connect (OSTI)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. The study shows how the multiple LPFs from the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion and compares this dispersion with the LPF methodology.
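    A minimal sketch of the combinatory LPF evaluation described above: the total LPF along a release pathway is the product of the per-compartment factors, so the assumed 0.5 x 0.5 treatment corresponds to a two-stage path (compartment values below are hypothetical).

      from functools import reduce

      def total_lpf(stage_lpfs):
          """Combined leak path factor: product of per-stage LPFs along one pathway."""
          return reduce(lambda a, b: a * b, stage_lpfs, 1.0)

      # The assumed standard multiplication discussed in the abstract: 0.5 x 0.5.
      print(total_lpf([0.5, 0.5]))          # -> 0.25

      # Hypothetical multi-compartment path (room -> corridor -> filtered exhaust).
      print(total_lpf([0.4, 0.6, 0.01]))    # illustrative values only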

  3. Modeling of Diesel Exhaust Systems: A methodology to better simulate soot reactivity

    Broader source: Energy.gov [DOE]

    Discussed development of a methodology for creating accurate soot models for soot samples from various origins with minimal characterization

  4. Strong field coherent control of molecular torsions—Analytical models

    SciTech Connect (OSTI)

    Ashwell, Benjamin A.; Ramakrishna, S.; Seideman, Tamar

    2015-08-14

    We introduce analytical models of torsional alignment by moderately intense laser pulses that are applicable to the limiting cases of the torsional barrier heights. Using these models, we explore in detail the role that the laser intensity and pulse duration play in coherent torsional dynamics, addressing both experimental and theoretical concerns. Our results suggest strategies for minimizing the risk of off-resonant ionization, noting the qualitative differences between the case of torsional alignment subject to a field-free torsional barrier and that of torsional alignment of a barrier-less system (equivalent to a 2D rigid rotor). We also investigate several interesting torsional phenomena, including the onset of impulsive alignment of torsions, field-driven oscillations in quantum number space, and the disappearance of an alignment upper bound observed for a rigid rotor in the impulsive torsional alignment limit.

  5. A Double Scattering Analytical Model For Elastic Recoil Detection Analysis

    SciTech Connect (OSTI)

    Barradas, N. P.; Lorenz, K.; Alves, E.; Darakchieva, V.

    2011-06-01

    We present an analytical model for calculation of double scattering in elastic recoil detection measurements. Only events involving the beam particle and the recoil are considered, i.e. 1) an ion scatters off a target element and then produces a recoil, and 2) an ion produces a recoil which then scatters off a target element. Events involving intermediate recoils are not considered, i.e. when the primary ion produces a recoil which then produces a second recoil. If the recoil element is also present in the stopping foil, recoil events in the stopping foil are also calculated. We included the model in the standard code for IBA data analysis NDF, and applied it to the measurement of hydrogen in Si.

  6. Model choice considerations and information integration using analytical hierarchy process

    SciTech Connect (OSTI)

    Langenbrunner, James R.; Hemez, Francois M.; Booker, Jane M.; Ross, Timothy J.

    2010-10-15

    Using the theory of information-gap for decision-making under severe uncertainty, it has been shown that model output compared to experimental data contains irrevocable trade-offs between fidelity-to-data, robustness-to-uncertainty and confidence-in-prediction. We illustrate a strategy for information integration by gathering and aggregating all available data, knowledge, theory, experience, and similar applications. Such integration of information becomes important when the physics is difficult to model, when observational data are sparse or difficult to measure, or both. To aggregate the available information, we take an inference perspective. Models are not rejected, nor wasted, but can be integrated into a final result. We show an example of information integration using Saaty's Analytic Hierarchy Process (AHP), integrating theory, simulation output and experimental data. We used expert elicitation to determine weights for two models and two experimental data sets, by forming pair-wise comparisons between model output and experimental data. In this way we transform epistemic and/or statistical strength from one field of study into another branch of physical application. The price of utilizing all available knowledge is that the inferences drawn from the integrated information must be accounted for, and the costs can be considerable. Focusing on inferences and inference uncertainty (IU) is one way to understand complex information.

  7. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    SciTech Connect (OSTI)

    Lin, E.I.

    1997-12-31

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically-driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before.
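    For orientation, the radiative coupling adjusted during the revalidation is of the standard gray-body exchange form (a generic sketch, not the SINDA model's actual radiation network).

      % Two-surface gray-body radiative exchange; the script-F factor lumps the
      % view factor and emissivities.  Doubling the effective exchange factor
      % doubles Q_{12} at fixed temperatures.
      Q_{12} = \sigma\, A_1\, \mathcal{F}_{12}\left(T_1^{4} - T_2^{4}\right)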

  8. Comparison of the Amanzi Model against Analytical Solutions and...

    Office of Scientific and Technical Information (OSTI)

    Subject: Environmental Sciences(54); Mathematics & Computing(97); Earth Sciences; Environmental Protection; Amanzi, FEHM, Flow, Analytical solutions ...

  9. An analytical model of axial compressor off-design performance

    SciTech Connect (OSTI)

    Camp, T.R.; Horlock, J.H. (Whittle Lab.)

    1994-07-01

    An analysis is presented of the off-design performance of multistage axial-flow compressors. It is based on an analytical solution, valid for small perturbations in operating conditions from the design point, and provides an insight into the effects of choices made during the compressor design process on performance and off-design stage matching. It is shown that the mean design value of stage loading coefficient (ψ = Δh{sub 0}/U{sup 2}) has a dominant effect on off-design performance, whereas the stage-wise distribution of stage loading coefficient and the design value of flow coefficient have little influence. The powerful effects of variable stator vanes on stage-matching are also demonstrated and these results are shown to agree well with previous work. The slope of the working line of a gas turbine engine, overlaid on overall compressor characteristics, is shown to have a strong effect on the off-design stage-matching through the compressor. The model is also used to analyze design changes to the compressor geometry and to show how errors in estimates of annulus blockage, decided during the design process, have less effect on compressor performance than has previously been thought.
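    In the standard turbomachinery notation used above, the two nondimensional design parameters are (the flow-coefficient definition is the usual one, stated here for completeness):

      % Stage loading coefficient (stagnation enthalpy rise over blade speed
      % squared, as in the abstract) and flow coefficient (axial velocity over
      % blade speed):
      \psi = \frac{\Delta h_0}{U^2}, \qquad \phi = \frac{c_x}{U}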

  10. Natural gas production problems : solutions, methodologies, and modeling.

    SciTech Connect (OSTI)

    Rautman, Christopher Arthur; Herrin, James M.; Cooper, Scott Patrick; Basinski, Paul M.; Olsson, William Arthur; Arnold, Bill Walter; Broadhead, Ronald F.; Knight, Connie D.; Keefe, Russell G.; McKinney, Curt; Holm, Gus; Holland, John F.; Larson, Rich; Engler, Thomas W.; Lorenz, John Clay

    2004-10-01

    Natural gas is a clean fuel that will be the most important domestic energy resource for the first half of the 21st century. Ensuring a stable supply is essential for our national energy security. The research we have undertaken will maximize the extractable volume of gas while minimizing the environmental impact of surface disturbances associated with drilling and production. This report describes a methodology for comprehensive evaluation and modeling of the total gas system within a basin, focusing on problematic horizontal fluid flow variability. This has been accomplished through extensive use of geophysical, core (rock sample) and outcrop data to interpret and predict directional flow and production trends. Side benefits include reduced environmental impact of drilling due to the reduced number of required wells for resource extraction. These results have been accomplished through a cooperative and integrated systems approach involving industry, government, academia and a multi-organizational team within Sandia National Laboratories. Industry has provided essential in-kind support to this project in the forms of extensive core data, production data, maps, seismic data, production analyses, engineering studies, plus equipment and staff for obtaining geophysical data. This approach provides innovative ideas and technologies to bring new resources to market and to reduce the overall environmental impact of drilling. More importantly, the products of this research are not location specific but can be extended to other areas of gas production throughout the Rocky Mountain area. Thus this project is designed to solve problems associated with natural gas production at developing sites, or at old sites under redevelopment.

  11. HIERARCHICAL METHODOLOGY FOR MODELING HYDROGEN STORAGE SYSTEMS PART II: DETAILED MODELS

    SciTech Connect (OSTI)

    Hardy, B.; Anton, D. L.

    2008-12-22

    There is significant interest in hydrogen storage systems that employ a medium which either adsorbs, absorbs, or reacts with hydrogen in a nearly reversible manner. In any media-based storage system, the rate of hydrogen uptake and the system capacity are governed by a number of complex, coupled physical processes. To design and evaluate such storage systems, a comprehensive methodology was developed, consisting of a hierarchical sequence of models that range from scoping calculations to numerical models that couple reaction kinetics with heat and mass transfer for both the hydrogen charging and discharging phases. The scoping models were presented in Part I [1] of this two-part series of papers. This paper describes a detailed numerical model that integrates the phenomena occurring when hydrogen is charged and discharged. A specific application of the methodology is made to a system using NaAlH{sub 4} as the storage medium.

  12. A Hydro-mechanical Model and Analytical Solutions for Geomechanical Modeling of Carbon Dioxide Geological Sequestration

    SciTech Connect (OSTI)

    Xu, Zhijie; Fang, Yilin; Scheibe, Timothy D.; Bonneville, Alain

    2012-05-15

    We present a hydro-mechanical model for geological sequestration of carbon dioxide. The model considers poroelastic effects by taking into account the coupling between the geomechanical response and the fluid flow in greater detail. The simplified hydro-mechanical model includes a geomechanical part that relies on linear elasticity, while the fluid flow is based on Darcy's law. The two parts were coupled using standard linear poroelasticity. Analytical solutions for the pressure field were obtained for a typical geological sequestration scenario. The model predicts the temporal and spatial variation of the pressure field and the effects of the permeability and elastic modulus of the formation on the fluid pressure distribution.
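    The coupling described above can be sketched in standard notation (a generic statement of the governing relations under a compression-positive sign convention, not the paper's specific solution).

      % Darcy flow and linear (Biot) poroelastic coupling: k permeability,
      % \mu fluid viscosity, p pore pressure, \alpha Biot coefficient,
      % \sigma'_{ij} the effective stress driving deformation.
      \mathbf{q} = -\frac{k}{\mu}\,\nabla p, \qquad
      \sigma'_{ij} = \sigma_{ij} - \alpha\, p\, \delta_{ij}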

  13. Quantitative analytical model for magnetic reconnection in hall magnetohydrodynamics

    SciTech Connect (OSTI)

    Simakov, Andrei N

    2008-01-01

    Magnetic reconnection is of fundamental importance for laboratory and naturally occurring plasmas. Reconnection usually develops on time scales which are much shorter than those associated with classical collisional dissipation processes, and which are not fully understood. While such dissipation-independent (or 'fast') reconnection rates have been observed in particle and Hall magnetohydrodynamics (MHD) simulations and predicted analytically in electron MHD, a quantitative analytical theory of fast reconnection valid for arbitrary ion inertial lengths d{sub i} has been lacking. Here we propose such a theory without a guide field. The theory describes two-dimensional magnetic field diffusion regions, provides expressions for the reconnection rates, and derives a formal criterion for fast reconnection in terms of dissipation parameters and d{sub i}. It also demonstrates that both open X-point and elongated diffusion regions allow dissipation-independent reconnection and reveals a possibility of strong dependence of the reconnection rates on d{sub i}.

  14. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    SciTech Connect (OSTI)

    Guo, Y.; Keller, J.; Wallen, R.; Errichello, R.; Halse, C.; Lambert, S.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  15. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    SciTech Connect (OSTI)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
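    A minimal sketch of the Monte Carlo estimation step (a toy event sequence with hypothetical probabilities; it is not the TORMIS event models, which are data-based and far more detailed).

      import random

      def one_trial(rng):
          """One hypothetical trial: tornado strike, missile injection, transport, damage."""
          if rng.random() >= 1e-2:          # annual tornado strike (illustrative)
              return False
          injected = rng.random() < 0.2     # missile picked up and injected
          hits = rng.random() < 0.05        # transported onto the structure
          damages = rng.random() < 0.5      # impact exceeds the damage threshold
          return injected and hits and damages

      def estimate(n_trials=1_000_000, seed=1):
          rng = random.Random(seed)
          return sum(one_trial(rng) for _ in range(n_trials)) / n_trials

      print(f"estimated damage probability per year: {estimate():.2e}")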

  16. Incorporating photon recycling into the analytical drift-diffusion model of high efficiency solar cells

    SciTech Connect (OSTI)

    Lumb, Matthew P.; Steiner, Myles A.; Geisz, John F.; Walters, Robert J.

    2014-11-21

    The analytical drift-diffusion formalism is able to accurately simulate a wide range of solar cell architectures and was recently extended to include those with back surface reflectors. However, as solar cells approach the limits of material quality, photon recycling effects become increasingly important in predicting the behavior of these cells. In particular, the minority carrier diffusion length is significantly affected by the photon recycling, with consequences for the solar cell performance. In this paper, we outline an approach to account for photon recycling in the analytical Hovel model and compare analytical model predictions to GaAs-based experimental devices operating close to the fundamental efficiency limit.

  17. A new analytic-adaptive model for EGS assessment, development...

    Open Energy Info (EERE)

    ability to quantitatively test hypotheses for new EGS designs and technologies, as well as reservoir sustainability modeling. Funding Source American Recovery and Reinvestment Act...

  18. Model Validation and Testing: The Methodological Foundation of ASHRAE Standard 140; Preprint

    SciTech Connect (OSTI)

    Judkoff, R.; Neymark, J.

    2006-07-01

    Ideally, whole-building energy simulation programs model all aspects of a building that influence energy use and thermal and visual comfort for the occupants. An essential component of the development of such computer simulation models is a rigorous program of validation and testing. This paper describes a methodology to evaluate the accuracy of whole-building energy simulation programs. The methodology is also used to identify and diagnose differences in simulation predictions that may be caused by algorithmic differences, modeling limitations, coding errors, or input errors. The methodology has been adopted by ANSI/ASHRAE Standard 140 (ANSI/ASHRAE 2001, 2004), Method of Test for the Evaluation of Building Energy Analysis Computer Programs. A summary of the method is included in the ASHRAE Handbook of Fundamentals (ASHRAE 2005). This paper describes the ANSI/ASHRAE Standard 140 method of test and its methodological basis. Also discussed are possible future enhancements to Standard 140 and related research recommendations.

  19. Model Validation and Testing: The Methodological Foundation of ASHRAE Standard 140

    SciTech Connect (OSTI)

    Judkoff, R.; Neymark, J.

    2006-01-01

    Ideally, whole-building energy simulation programs model all aspects of a building that influence energy use and thermal and visual comfort for the occupants. An essential component of the development of such computer simulation models is a rigorous program of validation and testing. This paper describes a methodology to evaluate the accuracy of whole-building energy simulation programs. The methodology is also used to identify and diagnose differences in simulation predictions that may be caused by algorithmic differences, modeling limitations, coding errors, or input errors. The methodology has been adopted by ANSI/ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2001a, 2004). A summary of the method is included in the 2005 ASHRAE Handbook--Fundamentals (ASHRAE 2005). This paper describes the ASHRAE Standard 140 method of test and its methodological basis. Also discussed are possible future enhancements to ASHRAE Standard 140 and related research recommendations.

  20. Mathematical Modeling of Microbial Community Dynamics: A Methodological Review

    SciTech Connect (OSTI)

    Song, Hyun-Seob; Cannon, William R.; Beliaev, Alex S.; Konopka, Allan

    2014-10-17

    Microorganisms in nature form diverse communities that dynamically change in structure and function in response to environmental variations. As a complex adaptive system, microbial communities show higher-order properties that are not present in individual microbes, but arise from their interactions. Predictive mathematical models not only help to understand the underlying principles of the dynamics and emergent properties of natural and synthetic microbial communities, but also provide key knowledge required for engineering them. In this article, we provide an overview of mathematical tools that include not only current mainstream approaches, but also less traditional approaches that, in our opinion, can be potentially useful. We discuss a broad range of methods ranging from low-resolution supra-organismal to high-resolution individual-based modeling. Particularly, we highlight the integrative approaches that synergistically combine disparate methods. In conclusion, we provide our outlook for the key aspects that should be further developed to move microbial community modeling towards greater predictive power.
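    As one example of the low-resolution, population-level class of models surveyed here, the sketch below integrates generalized Lotka-Volterra dynamics for a hypothetical three-member community (growth rates and interaction coefficients are illustrative only).

      import numpy as np
      from scipy.integrate import solve_ivp

      # Generalized Lotka-Volterra dynamics: dx_i/dt = x_i (r_i + sum_j a_ij x_j).
      r = np.array([0.8, 0.5, 0.6])                 # intrinsic growth rates
      A = np.array([[-1.0, -0.4,  0.1],             # interaction matrix
                    [-0.3, -1.0, -0.2],
                    [ 0.2, -0.1, -1.0]])

      def glv(t, x):
          return x * (r + A @ x)

      sol = solve_ivp(glv, (0.0, 50.0), y0=[0.1, 0.1, 0.1])
      print("final abundances:", sol.y[:, -1].round(3))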

  1. Human performance modeling for system of systems analytics: soldier fatigue.

    SciTech Connect (OSTI)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). Toward this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations) and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  2. Analytical model for transient gas flow in nuclear fuel rods. [PWR; BWR

    SciTech Connect (OSTI)

    Rowe, D.S.; Oehlberg, R.N.

    1981-08-01

    An analytical model for calculating gas flow and pressure inside a nuclear fuel rod is presented. Such a model is required to calculate the pressure loading of cladding during ballooning that could occur for postulated reactor accidents. The mathematical model uses a porous media (permeability) concept to define the resistance to gas flow along the fuel rod. 7 refs.

  3. Prototype integration of the joint munitions assessment and planning model with the OSD threat methodology

    SciTech Connect (OSTI)

    Lynn, R.Y.S.; Bolmarcich, J.J.

    1994-06-01

    The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.

  4. Semi-analytical modeling of the NIO1 source

    SciTech Connect (OSTI)

    Cazzador, M.; Cavenago, M.; Serianni, G.; Veltri, P.

    2015-04-08

    NIO1 is a compact and versatile negative ion source, with a total current of 130 mA accelerated to 60 keV. Negative ions are created inside the plasma, which is inductively coupled to an external rf cylindrical coil operating in the range of 2 ± 0.2 MHz. The plasma is confined in the source chamber (a 50 mm radius cylinder) by a multipole magnetic field and the ions are extracted through a 3x3 matrix of apertures. The use of cesium, to enhance the negative ion production by H{sub 0} bombardment of the surfaces, is foreseen in a second stage of the operation, so that at the present time the source is operating in a pure volume configuration. This paper presents a model aimed at describing the main physical phenomena occurring in the source, focusing on the rf coupling with the plasma and the evolution of plasma parameters in the source. With respect to more sophisticated models of negative ion sources, here we aim to develop a fast tool capable of qualitatively describing the response of the system to variations in the basic operating parameters. The findings of this model are finally compared with the first experimental results of NIO1.

  5. A simple Analytical Model to Study and Control Azimuthal Instabilities in Annular Combustion Chambers

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Argonne Leadership Computing Facility. Authors: Parmentier, J-F., Salas, P., Wolf, P., Staffelbach, G., Nicoud, F., Poinsot, T. This study describes a simple analytical method to compute the azimuthal modes appearing in annular combustion chambers and to help analyze experimental, acoustic, and large eddy simulation (LES) data obtained in these combustion chambers.

  6. An analytical model for a class of processor-memory interconnection networks

    SciTech Connect (OSTI)

    Conterno, R.; Melen, R.

    1987-11-01

    The performance of a delta interconnection network for multiprocessors is evaluated in a circuit switching environment. An error is pointed out in previous literature and an exact analytical model is given for regeneration systems, where a connection request is considered lost if not immediately granted. An approximate numerical method is suggested for the correction of the analytical results, which gave outputs in very good agreement with the simulation of real systems where requests are maintained.
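    A widely used baseline of this kind iterates a per-stage recurrence for a delta network built from k x k crossbars under uniform, independent requests; the sketch below shows that baseline, not the paper's corrected circuit-switching model for lost or maintained requests.

      # Stage-by-stage recurrence for a delta network of k-by-k crossbar switches
      # under uniform, independent requests (baseline model; illustrative only).

      def delta_output_rate(p_request, k, n_stages):
          """Probability that a given stage output (ultimately a network output) is busy."""
          p = p_request
          for _ in range(n_stages):
              p = 1.0 - (1.0 - p / k) ** k
          return p

      # Example: 3-stage network of 2x2 switches with fully loaded inputs.
      print(f"normalized throughput: {delta_output_rate(1.0, k=2, n_stages=3):.3f}")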

  7. Analytical models of calcium binding in a calcium channel

    SciTech Connect (OSTI)

    Liu, Jinn-Liang; Eisenberg, Bob

    2014-08-21

    The anomalous mole fraction effect of L-type calcium channels is analyzed using a Fermi like distribution with the experimental data of Almers and McCleskey [J. Physiol. 353, 585 (1984)] and the atomic resolution model of Lipkind and Fozzard [Biochemistry 40, 6786 (2001)] of the selectivity filter of the channel. Much of the analysis is algebraic, independent of differential equations. The Fermi distribution is derived from the configuration entropy of ions and water molecules with different sizes, different valences, and interstitial voids between particles. It allows us to calculate potentials and distances (between the binding ion and the oxygen ions of the glutamate side chains) directly from the experimental data using algebraic formulas. The spatial resolution of these results is comparable with those of molecular models, but of course the accuracy is no better than that implied by the experimental data. The glutamate side chains in our model are flexible enough to accommodate different types of binding ions in different bath conditions. The binding curves of Na{sup +} and Ca{sup 2+} for [CaCl{sub 2}] ranging from 10{sup −8} to 10{sup −2} M with a fixed 32 mM background [NaCl] are shown to agree with published Monte Carlo simulations. The Poisson-Fermi differential equation—that includes both steric and correlation effects—is then used to obtain the spatial profiles of energy, concentration, and dielectric coefficient from the solvent region to the filter. The energy profiles of ions are shown to depend sensitively on the steric energy that is not taken into account in the classical rate theory. We improve the rate theory by introducing a steric energy that lumps the effects of excluded volumes of all ions and water molecules and empty spaces between particles created by Lennard-Jones type and electrostatic forces. We show that the energy landscape varies significantly with bath concentrations. The energy landscape is not constant.

  8. Analytical solution for two-phase flow in a wellbore using the drift-flux model

    SciTech Connect (OSTI)

    Pan, L.; Webb, S.W.; Oldenburg, C.M.

    2011-11-01

    This paper presents analytical solutions for steady-state, compressible two-phase flow through a wellbore under isothermal conditions using the drift flux conceptual model. Although only applicable to highly idealized systems, the analytical solutions are useful for verifying numerical simulation capabilities that can handle much more complicated systems, and can be used in their own right for gaining insight about two-phase flow processes in wells. The analytical solutions are obtained by solving the mixture momentum equation of steady-state, two-phase flow with an assumption that the two phases are immiscible. These analytical solutions describe the steady-state behavior of two-phase flow in the wellbore, including profiles of phase saturation, phase velocities, and pressure gradients, as affected by the total mass flow rate, phase mass fraction, and drift velocity (i.e., the slip between two phases). Close matching between the analytical solutions and numerical solutions for a hypothetical CO{sub 2} leakage problem as well as to field data from a CO{sub 2} production well indicates that the analytical solution is capable of capturing the major features of steady-state two-phase flow through an open wellbore, and that the related assumptions and simplifications are justified for many actual systems. In addition, we demonstrate the utility of the analytical solution to evaluate how the bottomhole pressure in a well in which CO{sub 2} is leaking upward responds to the mass flow rate of CO{sub 2}-water mixture.
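    The kinematic closure at the heart of the drift-flux conceptual model is usually written in the following standard form (stated here for orientation; the paper's notation may differ).

      % Gas-phase velocity in terms of the total volumetric flux j, the profile
      % parameter C_0, and the drift velocity u_d (the slip between the phases):
      u_G = C_0\, j + u_d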

  9. Analytical Methodologies for Detection of Gamma-Valerolactone, Delta-Valerolactone, Acephate and Azinphos Methyl and Their Associated Metabolites in Complex Biological Matrices

    SciTech Connect (OSTI)

    Zink, E.; Clark, R.; Grant, K.; Campbell, J.; Hoppe, E.

    2005-01-01

    Non-invasive biomonitoring for chemicals of interest in law enforcement and similar monitoring of pesticides, together with their metabolites, can not only save money but can lead to faster medical attention for individuals exposed to these chemicals. This study describes methods developed for the analysis of gamma-valerolactone (GVL), delta-valerolactone (DVL), acephate, and azinphos methyl in saliva and serum. Liquid chromatography/mass spectrometry (LC/MS) operated in the negative and positive ion mode and gas chromatography/mass spectrometry (GC/MS) were used to analyze GVL and DVL. Although both analytical techniques worked well, lower detection limits were obtained with GC/MS. The lactones and their corresponding sodium salts were spiked into both saliva and serum. The lactones were isolated from saliva or serum using newly developed extraction techniques and then subsequently analyzed using GC/MS. The sodium salts of the lactones are nonvolatile and require derivatization prior to analysis by this method. N-methyl-N-(t-butyldimethylsilyl)-trifluoroacetamide (MTBSTFA) was ultimately selected as the reagent for derivatization because the acidic conditions required for reactions with diazomethane caused the salts to undergo intramolecular cyclization to the corresponding lactones. In vitro studies were conducted using rat liver microsomes to determine other metabolites associated with these compounds. Azinphos methyl and acephate are classified as organophosphate pesticides, and are known to be cholinesterase inhibitors in humans and insects, causing neurotoxicity. For this reason they have both exposure and environmental impact implications. These compounds were spiked into serum and saliva and prepared for analysis by GC/MS. Continuation of this research would include analysis by GC/MS under positive ion mode to determine the parent ions of the unknown metabolites. Further research is planned through an in vivo analysis of the lactones and pesticides. These

  10. On Improving Analytical Models of Cosmic Reionization for Matching Numerical Simulations

    SciTech Connect (OSTI)

    Kaurov, Alexander A.

    2016-01-01

    The methods for studying the epoch of cosmic reionization vary from full radiative transfer simulations to purely analytical models. While numerical approaches are computationally expensive and are not suitable for generating many mock catalogs, analytical methods are based on assumptions and approximations. We explore the interconnection between both methods. First, we ask how the analytical framework of excursion set formalism can be used for statistical analysis of numerical simulations and visual representation of the morphology of ionization fronts. Second, we explore the methods of training the analytical model on a given numerical simulation. We present a new code which emerged from this study. Its main application is to match the analytical model with a numerical simulation. Then, it allows one to generate mock reionization catalogs with volumes exceeding the original simulation quickly and at low computational cost, while reproducing large-scale statistical properties. These mock catalogs are particularly useful for CMB polarization and 21cm experiments, where large volumes are required to simulate the observed signal.

  11. Comparison of the Bioavailability of Waste Laden Soils Using In Vivo/In Vitro Analytical Methodology and Bioaccessibility of Radionuclides for Refinement of Exposure/Dose Estimates

    SciTech Connect (OSTI)

    P. J. Lioy; M. Gallo; P. Georgopoulos; R. Tate; B. Buckley

    1999-09-15

    The bioavailability of soil contaminants can be measured using in vitro or in vivo techniques. Since there was no standard method for intercomparison among laboratories, we compared two techniques for bioavailability estimation: in vitro dissolution and an in vivo rat feeding model, for a NIST-traceable soil material. Bioaccessibility was measured using a sequential soil extraction in synthetic analogues of human saliva, gastric and intestinal fluids. Bioavailability was measured in Sprague Dawley rats by determining metal levels in the major organs and urine, feces, and blood. Bioaccessibility was found to be a good indicator of relative metal bioavailability. Results are presented from bioaccessibility experiments with cesium in contaminated DOE soils, and for total alpha and beta bioaccessibility. The results indicate that the modified methodology for bioaccessibility can be used for specific radionuclide analysis.

  12. Monte Carlo and analytical model predictions of leakage neutron exposures from passively scattered proton therapy

    SciTech Connect (OSTI)

    Pérez-Andújar, Angélica [Department of Radiation Physics, Unit 1202, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States)]; Zhang, Rui; Newhauser, Wayne [Department of Radiation Physics, Unit 1202, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and The University of Texas Graduate School of Biomedical Sciences at Houston, 6767 Bertner Avenue, Houston, Texas 77030 (United States)]

    2013-12-15

    Purpose: Stray neutron radiation is of concern after radiation therapy, especially in children, because of the high risk it might carry for secondary cancers. Several previous studies predicted the stray neutron exposure from proton therapy, mostly using Monte Carlo simulations. Promising attempts to develop analytical models have also been reported, but these were limited to only a few proton beam energies. The purpose of this study was to develop an analytical model to predict leakage neutron equivalent dose from passively scattered proton beams in the 100-250-MeV interval. Methods: To develop and validate the analytical model, the authors used values of equivalent dose per therapeutic absorbed dose (H/D) predicted with Monte Carlo simulations. The authors also characterized the behavior of the mean neutron radiation-weighting factor, w{sub R}, as a function of depth in a water phantom and distance from the beam central axis. Results: The simulated and analytical predictions agreed well. On average, the percentage difference between the analytical model and the Monte Carlo simulations was 10% for the energies and positions studied. The authors found that w{sub R} was highest at the shallowest depth and decreased with depth until around 10 cm, where it started to increase slowly with depth. This was consistent among all energies. Conclusion: Simple analytical methods are promising alternatives to complex and slow Monte Carlo simulations to predict H/D values. The authors' results also provide improved understanding of the behavior of w{sub R}, which strongly depends on depth but is nearly independent of lateral distance from the beam central axis.

  13. Uncertainty Quantification for the Reliability of the Analytical Analysis for the Simplified Model of CO2 Geological Sequestration

    SciTech Connect (OSTI)

    Bao, Jie; Xu, Zhijie; Fang, Yilin

    2015-04-01

    A hydro-mechanical model with analytical solutions, including pressure evolution and geomechanical deformation for geological CO2 injection and sequestration, was introduced in our previous work. However, the reliability and accuracy of the hydro-mechanical model and the companion analytical solution are uncertain because of the assumptions and simplifications in the analytical model, even though it was validated by a few example cases. This study introduces a method to efficiently measure the accuracy of the analytical model and to specify the range of input parameters over which the accuracy and reliability of the analytical solution can be guaranteed. A coupled hydro-geomechanical subsurface transport simulator, STOMP, was adopted as a reference to judge the reliability of the hydro-mechanical model and the analytical solution. A quasi-Monte Carlo sampling method was applied to efficiently sample the input parameter space.
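
    To make the quasi-Monte Carlo sampling step concrete, here is a minimal sketch that draws a Sobol sequence with scipy over an input-parameter space and compares a closed-form estimate against a reference calculation. The parameter names, ranges, and both placeholder functions are assumptions for illustration; they are not the paper's model, its parameter set, or the STOMP simulator.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Quasi-Monte Carlo (Sobol) sampling of an assumed input-parameter space.
    names    = ["permeability", "injection_rate", "youngs_modulus"]
    l_bounds = [1e-14, 1.0, 5e9]
    u_bounds = [1e-12, 10.0, 5e10]

    sampler = qmc.Sobol(d=len(names), scramble=True, seed=42)
    unit_samples = sampler.random_base2(m=7)                 # 2**7 = 128 low-discrepancy points
    samples = qmc.scale(unit_samples, l_bounds, u_bounds)    # map to physical ranges

    def analytical_pressure(x):      # placeholder for a closed-form solution
        k, q, E = x
        return q / k * 1e-12 / (1.0 + 1e-10 * E)

    def reference_pressure(x):       # placeholder for a reference simulator run
        return analytical_pressure(x) * (1.0 + 0.05 * np.sin(1e12 * x[0]))

    rel_err = np.array([abs(analytical_pressure(x) - reference_pressure(x))
                        / abs(reference_pressure(x)) for x in samples])
    print(f"max relative error over {len(samples)} samples: {rel_err.max():.2%}")
    ```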

  14. The analytic model of a laser-accelerated plasma target and its stability

    SciTech Connect (OSTI)

    Khudik, V.; Yi, S. A.; Siemon, C.; Shvets, G.

    2014-01-15

    A self-consistent kinetic theory of a laser-accelerated plasma target with distributed electron/ion densities is developed. The simplified model assumes that after an initial transition period the bulk of cold ions are uniformly accelerated by the self-consistent electric field generated by hot electrons trapped in combined ponderomotive and electrostatic potentials. Several distinct target regions (non-neutral ion tail, non-neutral electron sheath, and neutral plasma bulk) are identified and analytically described. It is shown analytically that such a laser-accelerated finite-thickness target is susceptible to the Rayleigh-Taylor (RT) instability. Particle-in-cell simulations of seeded perturbations of the plasma target reveal that, for ultra-relativistic laser intensities, the growth rate of the RT instability is depressed relative to the analytic estimates.

  15. Analytical Modeling of a Novel Transverse Flux Machine for Direct Drive Wind Turbine Applications

    SciTech Connect (OSTI)

    Hasan, Iftekhar; Husain, Tausif; Uddin, Md Wasi; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2015-09-02

    This paper presents a nonlinear analytical model of a novel double-sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets (PM), stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry, which makes it a good alternative for evaluating prospective designs of the TFM compared to finite element solvers, which are numerically intensive and require more computation time. A single-phase, 1 kW, 400 rpm machine is analytically modeled, and its resulting flux distribution, no-load EMF, and torque are verified with Finite Element Analysis (FEA). The results are found to be in agreement, with less than 5% error, while reducing the computation time by 25 times.
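
    As context for the magnetic-equivalent-circuit approach, the sketch below composes flux-tube reluctances in series and parallel and solves a toy permanent-magnet circuit. All dimensions, permeabilities, and the magnet MMF are illustrative placeholders, and the circuit is linear; the paper's model is nonlinear (it includes saturation) and uses the actual TFM geometry.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi

    def reluctance(length, area, mu_r=1.0):
        """Reluctance of a flux tube: R = l / (mu0 * mu_r * A)."""
        return length / (MU0 * mu_r * area)

    def series(*rels):
        return sum(rels)

    def parallel(*rels):
        return 1.0 / sum(1.0 / r for r in rels)

    # Toy circuit: PM MMF drives flux through stator iron, two air gaps in series,
    # and a rotor path, with a leakage tube in parallel with the gap/rotor branch.
    R_gap   = reluctance(length=1e-3, area=4e-4)             # one air gap
    R_core  = reluctance(length=0.10, area=4e-4, mu_r=2000)  # stator iron (linear here)
    R_rotor = reluctance(length=0.05, area=4e-4, mu_r=2000)
    R_leak  = reluctance(length=5e-3, area=1e-4)             # leakage tube

    R_branch = series(R_gap, R_rotor, R_gap)
    R_total  = series(R_core, parallel(R_branch, R_leak))

    F_pm = 800.0                          # assumed magnet MMF in ampere-turns
    phi  = F_pm / R_total                 # total flux leaving the source
    phi_gap = phi * R_leak / (R_leak + R_branch)   # flux-divider share through the air gaps
    print(f"total flux = {phi:.2e} Wb, air-gap flux = {phi_gap:.2e} Wb")
    ```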

  16. Analytical Modeling of a Novel Transverse Flux Machine for Direct Drive Wind Turbine Applications: Preprint

    SciTech Connect (OSTI)

    Hasan, Iftekhar; Husain, Tausif; Uddin, Md Wasi; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2015-08-24

    This paper presents a nonlinear analytical model of a novel double-sided flux concentrating Transverse Flux Machine (TFM) based on the Magnetic Equivalent Circuit (MEC) model. The analytical model uses a series-parallel combination of flux tubes to predict the flux paths through different parts of the machine including air gaps, permanent magnets, stator, and rotor. The two-dimensional MEC model approximates the complex three-dimensional flux paths of the TFM and includes the effects of magnetic saturation. The model is capable of adapting to any geometry that makes it a good alternative for evaluating prospective designs of TFM compared to finite element solvers that are numerically intensive and require more computation time. A single-phase, 1-kW, 400-rpm machine is analytically modeled, and its resulting flux distribution, no-load EMF, and torque are verified with finite element analysis. The results are found to be in agreement, with less than 5% error, while reducing the computation time by 25 times.

  17. The Analytical Repository Source-Term (AREST) model: Description and documentation

    SciTech Connect (OSTI)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs.

  18. Method of and apparatus for determining the similarity of a biological analyte from a model constructed from known biological fluids

    DOE Patents [OSTI]

    Robinson, Mark R.; Ward, Kenneth J.; Eaton, Robert P.; Haaland, David M.

    1990-01-01

    The characteristics of a biological fluid sample having an analyte are determined from a model constructed from plural known biological fluid samples. The model is a function of the concentration of materials in the known fluid samples as a function of absorption of wideband infrared energy. The wideband infrared energy is coupled to the analyte containing sample so there is differential absorption of the infrared energy as a function of the wavelength of the wideband infrared energy incident on the analyte containing sample. The differential absorption causes intensity variations of the infrared energy incident on the analyte containing sample as a function of sample wavelength of the energy, and concentration of the unknown analyte is determined from the thus-derived intensity variations of the infrared energy as a function of wavelength from the model absorption versus wavelength function.
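
    The patent describes building a calibration model from the infrared absorption spectra of known fluid samples and using it to infer an unknown analyte concentration. The sketch below uses partial least squares regression on synthetic spectra as one common way to build such a multivariate calibration; PLS, the wavelength grid, and the synthetic data are assumptions for illustration, not the patent's specific method.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Multivariate calibration sketch: model known samples' spectra vs. concentration,
    # then predict an unknown sample. All data below are synthetic.
    rng = np.random.default_rng(1)
    wavelengths = np.linspace(1.0, 2.5, 200)                    # microns (illustrative)
    true_band = np.exp(-((wavelengths - 1.6) / 0.05) ** 2)      # analyte absorption band

    conc = rng.uniform(50, 200, size=40)                        # known concentrations
    noise = rng.normal(0, 0.02, size=(40, wavelengths.size))    # interferents + noise
    spectra = np.outer(conc, true_band) * 1e-3 + noise          # Beer-Lambert-like mixing

    model = PLSRegression(n_components=3).fit(spectra, conc)

    unknown = 120.0 * true_band * 1e-3 + rng.normal(0, 0.02, wavelengths.size)
    print(f"predicted concentration: {model.predict(unknown[None, :])[0, 0]:.1f}")
    ```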

  19. Analytical modeling of localized surface plasmon resonance in heterostructure copper sulfide nanocrystals

    SciTech Connect (OSTI)

    Caldwell, Andrew H.; Ha, Don-Hyung; Robinson, Richard D.; Ding, Xiaoyue

    2014-10-28

    Localized surface plasmon resonance (LSPR) in semiconductor nanocrystals is a relatively new field of investigation that promises greater tunability of plasmonic properties compared to metal nanoparticles. A novel process by which the LSPR in semiconductor nanocrystals can be altered is through heterostructure formation arising from solution-based cation exchange. Herein, we describe the development of an analytical model of LSPR in heterostructure copper sulfide-zinc sulfide nanocrystals synthesized via a cation exchange reaction between copper sulfide (Cu{sub 1.81}S) nanocrystals and Zn ions. The cation exchange reaction produces dual-interface, heterostructure nanocrystals in which the geometry of the copper sulfide phase can be tuned from a sphere to a thin disk separating symmetrically grown zinc sulfide (ZnS) grains. Drude model electronic conduction and Mie-Gans theory are applied to describe how the LSPR wavelength changes during cation exchange, taking into account the morphology evolution and changes to the local permittivity. The results of the modeling indicate that the presence of the ZnS grains has a significant effect on the out-of-plane LSPR mode. By comparing the results of the model to previous studies on solid-solid phase transformations of copper sulfide in these nanocrystals during cation exchange, we show that the carrier concentration is independent of the copper vacancy concentration dictated by its atomic phase. The evolution of the effective carrier concentration calculated from the model suggests that the out-of-plane resonance mode is dominant. The classical model was compared to a simplified quantum mechanical model, which suggested that quantum mechanical effects become significant when the characteristic size is less than ∼8 nm. Overall, we find that the analytical models are not accurate for these heterostructured semiconductor nanocrystals, indicating the need for new model development for this emerging field.
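
    For context, the standard Drude dielectric function and the Gans (spheroid) resonance condition that underlie this kind of LSPR modeling are, in generic form (N_h is the hole density, m_h* the effective mass, gamma the damping rate, eps_m the medium permittivity, and L_j a geometry-dependent depolarization factor; no fitted values from the paper are implied):

    ```latex
    \varepsilon(\omega) = \varepsilon_\infty - \frac{\omega_p^2}{\omega^2 + i\gamma\omega},
    \qquad
    \omega_p^2 = \frac{N_h e^2}{\varepsilon_0 m_h^{*}},
    \qquad
    \mathrm{Re}\,\varepsilon(\omega_{\mathrm{LSPR}}) = -\,\varepsilon_m\,\frac{1 - L_j}{L_j}.
    ```

    The last condition reduces to Re ε = -2ε_m for a sphere (L_j = 1/3), which is why geometry changes during cation exchange shift the in-plane and out-of-plane modes differently.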

  20. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; et al

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  1. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal

  2. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret this data; (5) a novel method to predict rock type and temperature based on the newly interpreted data; (6) 70 new magnetotelluric (MT) stations; (7) an integrated interpretation of the enhanced MT data set; (8) the results of a 308 station soil CO2 gas survey; (9) new conductive thermal modeling in the project area; (10) new convective modeling in the Calibration Area; (11) pseudo-convective modeling in the Calibration Area; (12) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (13) quantitative geostatistical exploratory data analysis; and (14) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal

  3. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    SciTech Connect (OSTI)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  4. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    SciTech Connect (OSTI)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that might also affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify, by consensus, the most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  5. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    SciTech Connect (OSTI)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Alfonsi, Andrea; Askin Guler; Tunc Aldemir

    2014-11-01

    Passive systems, structures, and components (SSCs) degrade over their operating life, and this degradation may reduce the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as to RELAP 5 [4]. The overall methodology aims to: • Address multiple aging mechanisms involving a large number of components in a computationally feasible manner where sequencing of events is conditioned on the physical conditions predicted in a simulation

  6. Modeling and Analysis of The Pressure Die Casting Using Response Surface Methodology

    SciTech Connect (OSTI)

    Kittur, Jayant K.; Herwadkar, T. V. [KLS Gogte Institute of Technology, Belgaum -590 008, Karnataka (India); Parappagoudar, M. B. [Chhatrapati Shivaji Institute of Technology, Durg (C.G)-491001 (India)

    2010-10-26

    Pressure die casting is successfully used in the manufacture of aluminum alloy components for the automobile and many other industries. Die casting is a process involving many process parameters that have a complex relationship with the quality of the cast product. Though various process parameters influence the quality of the die cast component, the major influence comes from the die casting machine parameters and their proper settings. In the present work, non-linear regression models have been developed for making predictions and analyzing the effect of die casting machine parameters on the performance characteristics of the die casting process. Design of Experiments (DOE) with Response Surface Methodology (RSM) has been used to analyze the effect of input parameters and their interactions on the response, and further used to develop nonlinear input-output relationships. Die casting machine parameters, namely fast shot velocity, slow shot to fast shot changeover point, intensification pressure, and holding time, have been considered as the input variables. The quality characteristics of the cast product were determined by porosity, hardness, and surface roughness (outputs/responses). Design of experiments has been used to plan the experiments and analyze the impact of the variables on the quality of the casting. On the other hand, Response Surface Methodology (Central Composite Design) is utilized to develop non-linear input-output relationships (regression models). The developed regression models have been tested for their statistical adequacy through an ANOVA test. The practical usefulness of these models has been tested with some test cases. These models can be used to make predictions about different quality characteristics, for a known set of die casting machine parameters, without conducting the experiments.
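
    The regression form behind an RSM analysis is a second-order polynomial in the coded factors. The sketch below fits such a surface by least squares to synthetic data for two factors; the factor names, data, and coefficients are placeholders, not the paper's die-casting measurements.

    ```python
    import numpy as np

    # Second-order response-surface fit (the regression form used in RSM /
    # central composite designs), on synthetic two-factor data.
    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(30, 2))            # two coded factors, e.g. fast-shot
                                                    # velocity and intensification pressure
    y = (5 + 2*X[:, 0] - 1.5*X[:, 1] + 0.8*X[:, 0]*X[:, 1]
         + 0.5*X[:, 0]**2 + rng.normal(0, 0.1, 30))  # synthetic "porosity" response

    def design_matrix(X):
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1*x2, x1**2, x2**2])

    beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
    print("fitted coefficients (b0, b1, b2, b12, b11, b22):", np.round(beta, 2))
    ```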

  7. A predictive analytic model for the solar modulation of cosmic rays

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Cholis, Ilias; Hooper, Dan; Linden, Tim

    2016-02-23

    An important factor limiting our ability to understand the production and propagation of cosmic rays pertains to the effects of heliospheric forces, commonly known as solar modulation. The solar wind is capable of generating time- and charge-dependent effects on the spectrum and intensity of low-energy (≲10 GeV) cosmic rays reaching Earth. Previous analytic treatments of solar modulation have utilized the force-field approximation, in which a simple potential is adopted whose amplitude is selected to best fit the cosmic-ray data taken over a given period of time. Making use of recently available cosmic-ray data from the Voyager 1 spacecraft, along with measurements of the heliospheric magnetic field and solar wind, we construct a time-, charge- and rigidity-dependent model of solar modulation that can be directly compared to data from a variety of cosmic-ray experiments. Here, we provide a simple analytic formula that can be easily utilized in a variety of applications, allowing us to better predict the effects of solar modulation and reduce the number of free parameters involved in cosmic-ray propagation models.
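
    For reference, the force-field approximation mentioned above is conventionally written in the standard Gleeson-Axford form shown below; the paper's contribution is to replace the single fitted potential φ with a time-, charge-, and rigidity-dependent model.

    ```latex
    J_{\oplus}(E) \;=\; J_{\mathrm{LIS}}(E + \Phi)\,
    \frac{E\,(E + 2\,m\,c^{2})}{(E+\Phi)\,(E + \Phi + 2\,m\,c^{2})},
    \qquad
    \Phi = \frac{|Z|\,e}{A}\,\phi,
    ```

    where E is the kinetic energy per nucleon, m the proton mass, J_LIS the local interstellar spectrum, and φ the modulation potential.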

  8. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    2014-01-02

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and most importantly, well data. The overall project area is 2500km2 with the Calibration Area (Dixie Valley Geothermal Wellfield) being about 170km2. The project was subdivided into five tasks (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area at 0.5km intervals to identify EGS drilling targets at a scale of 5km x 5km; (4) collect new geophysical and geochemical data, and (5) repeat Task 3 for the enhanced (baseline + new ) data. Favorability maps were based on the integrated assessment of the three critical EGS exploration parameters of interest: rock type, temperature and stress. A complimentary trust map was generated to compliment the favorability maps to graphically illustrate the cumulative confidence in the data used in the favorability mapping. The Final Scientific Report (FSR) is submitted in two parts with Part I describing the results of project Tasks 1 through 3 and Part II covering the results of project Tasks 4 through 5 plus answering nine questions posed in the proposal for the overall project. FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4

  9. WaterSense Program: Methodology for National Water Savings Analysis Model Indoor Residential Water Use

    SciTech Connect (OSTI)

    Whitehead, Camilla Dunham; McNeil, Michael; Letschert, Virginie; della Cava, Mirka

    2008-02-28

    The U.S. Environmental Protection Agency (EPA) influences the market for plumbing fixtures and fittings by encouraging consumers to purchase products that carry the WaterSense label, which certifies those products as performing at low flow rates compared to unlabeled fixtures and fittings. As consumers decide to purchase water-efficient products, water consumption will decline nationwide. Decreased water consumption should prolong the operating life of water and wastewater treatment facilities. This report describes the method used to calculate national water savings attributable to EPA's WaterSense program. A Microsoft Excel spreadsheet model, the National Water Savings (NWS) analysis model, accompanies this methodology report. Version 1.0 of the NWS model evaluates indoor residential water consumption. Two additional documents, a Users' Guide to the spreadsheet model and an Impacts Report, accompany the NWS model and this methodology document. Altogether, these four documents represent Phase One of this project. The Users' Guide leads policy makers through the spreadsheet options available for projecting the water savings that result from various policy scenarios. The Impacts Report shows national water savings that will result from differing degrees of market saturation of high-efficiency water-using products. This detailed methodology report describes the NWS analysis model, which examines the effects of WaterSense by tracking the shipments of products that WaterSense has designated as water-efficient. The model estimates market penetration of products that carry the WaterSense label. Market penetration is calculated for both existing and new construction. The NWS model estimates savings based on an accounting analysis of water-using products and of building stock. Estimates of future national water savings will help policy makers further direct the focus of WaterSense and calculate stakeholder impacts from the program. Calculating the total gallons of water the

  10. Exploring magnetized liner inertial fusion with a semi-analytic model

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    McBride, Ryan D.; Slutz, Stephen A.; Vesey, Roger A.; Gomez, Matthew R.; Sefkow, Adam B.; Hansen, Stephanie B.; Knapp, Patrick F.; Schmit, Paul F.; Geissel, Matthias; Harvey-Thompson, Adam James; et al

    2016-01-01

    In this study, we explore magnetized liner inertial fusion (MagLIF) [S. A. Slutz et al., Phys. Plasmas 17, 056303 (2010)] using a semi-analytic model [R. D. McBride and S. A. Slutz, Phys. Plasmas 22, 052708 (2015)]. Specifically, we present simulation results from this model that: (a) illustrate the parameter space, energetics, and overall system efficiencies of MagLIF; (b) demonstrate the dependence of radiative loss rates on the radial fraction of the fuel that is preheated; (c) explore some of the recent experimental results of the MagLIF program at Sandia National Laboratories [M. R. Gomez et al., Phys. Rev. Lett. 113, 155003 (2014)]; (d) highlight the experimental challenges presently facing the MagLIF program; and (e) demonstrate how increases to the preheat energy, fuel density, axial magnetic field, and drive current could affect future MagLIF performance.

  11. Exploring magnetized liner inertial fusion with a semi-analytic model

    SciTech Connect (OSTI)

    McBride, Ryan D.; Slutz, Stephen A.; Vesey, Roger A.; Gomez, Matthew R.; Sefkow, Adam B.; Hansen, Stephanie B.; Knapp, Patrick F.; Schmit, Paul F.; Geissel, Matthias; Harvey-Thompson, Adam James; Jennings, Christopher Ashley; Harding, Eric C.; Awe, Thomas James; Rovang, Dean C.; Hahn, Kelly D.; Martin, Matthew R.; Cochrane, Kyle R.; Peterson, Kyle J.; Rochau, Gregory A.; Porter, John L.; Stygar, William A.; Campbell, Edward Michael; Nakhleh, Charles W.; Herrmann, Mark C.; Cuneo, Michael E.; Sinars, Daniel B.

    2016-01-01

    In this study, we explore magnetized liner inertial fusion (MagLIF) [S. A. Slutz et al., Phys. Plasmas 17, 056303 (2010)] using a semi-analytic model [R. D. McBride and S. A. Slutz, Phys. Plasmas 22, 052708 (2015)]. Specifically, we present simulation results from this model that: (a) illustrate the parameter space, energetics, and overall system efficiencies of MagLIF; (b) demonstrate the dependence of radiative loss rates on the radial fraction of the fuel that is preheated; (c) explore some of the recent experimental results of the MagLIF program at Sandia National Laboratories [M. R. Gomez et al., Phys. Rev. Lett. 113, 155003 (2014)]; (d) highlight the experimental challenges presently facing the MagLIF program; and (e) demonstrate how increases to the preheat energy, fuel density, axial magnetic field, and drive current could affect future MagLIF performance.

  12. Physical properties and analytical models of band-to-band tunneling in low-bandgap semiconductors

    SciTech Connect (OSTI)

    Shih, Chun-Hsing; Dang Chien, Nguyen

    2014-01-28

    Low-bandgap semiconductors, such as InAs and InSb, are widely considered to be ideal for use in tunnel field-effect transistors to ensure sufficient on-current boosting at low voltages. This work elucidates the physical and mathematical considerations of applying conventional band-to-band tunneling models in low-bandgap semiconductors, and presents a new analytical alternative for practical use. In high-bandgap materials, tunneling generation occurs mostly in the maximum-field region, where the tunnel path is shortest, whereas in low-bandgap materials generation occurs more dispersedly because of the narrow tunnel barrier. The local electric field associated with the number of tunneling electrons dominates in low-bandgap materials. This work proposes decoupled electric-field terms in the pre-exponential factor and the exponential function of the generation-rate expressions. Without fitting, the analytical results and their approximated forms exhibit excellent agreement with the more sophisticated forms in both high- and low-bandgap semiconductors. Neither the nonlocal nor the local field is appropriate for use in numerical simulations to predict tunneling generation across a variety of low- and high-bandgap semiconductors.
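
    The conventional local band-to-band tunneling generation rate that such analyses start from is the Kane-type expression, shown here in generic form; the paper's point is precisely which electric field should enter the prefactor versus the exponent.

    ```latex
    G_{\mathrm{BTBT}} \;=\; A\,\frac{F^{2}}{\sqrt{E_g}}\,
    \exp\!\left(-\,\frac{B\,E_g^{3/2}}{F}\right),
    ```

    with F the electric field, E_g the bandgap, and A, B material-dependent parameters.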

  13. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Iovenitti, Joe

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  14. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    SciTech Connect (OSTI)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes

    2015-05-15

    A marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests that tracks fuel saving in real time as a vessel operates in the open seas. With this in mind, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The input and output form a real-time control system for fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving obtained by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users in analyzing different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
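
    The propeller side of such a simulator is typically closed with the standard open-water relations shown below (generic forms; the specific K_T and K_Q curves and the turbocharger model used in the paper are not reproduced here).

    ```latex
    T = K_T(J)\,\rho\,n^{2}D^{4},\qquad
    Q = K_Q(J)\,\rho\,n^{2}D^{5},\qquad
    J = \frac{V_a}{n\,D},
    ```

    where n is the shaft speed, D the propeller diameter, V_a the advance speed, and ρ the water density; matching engine torque to Q at each time step yields the shaft speed and hence the thrust and fuel rate.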

  15. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    SciTech Connect (OSTI)

    Iovenitti, Joe

    2013-05-15

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  16. Analytical modeling and structural response of a stretched-membrane reflective module

    SciTech Connect (OSTI)

    Murphy, L.M.; Sallis, D.V.

    1984-06-01

    The optical and structural load deformation response behavior of a uniform pressure-loaded stretched-membrane reflective module subject to nonaxisymmetric support constraints is studied in this report. To aid in the understanding of this behavior, an idealized analytical model is developed and implemented and predictions are compared with predictions based on the detailed structural analysis code NASTRAN. Single structural membrane reflector modules are studied in this analysis. In particular, the interaction of the frame-membrane combination and variations in membrane pressure loading and tension are studied in detail. Variations in the resulting lateral shear load on the frame, frame lateral support, and frame twist as a function of distance between the supports are described as are the resulting optical effects. Results indicate the need to consider the coupled deformation problem as the lateral frame deformations are amplified by increasing the membrane tension. The importance of accurately considering the effects of different membrane attachment approaches is also demonstrated.

  17. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    SciTech Connect (OSTI)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-20

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R{sup n}. An isometric mapping F from M to a low-dimensional, compact, connected set A contained in R{sup d} (d << n) is constructed. The methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M{yields}A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the
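
    A minimal sketch of the nonlinear dimension-reduction step, using Isomap (an isometric manifold-learning method in the same spirit as the mapping F described above) on synthetic "microstructure" images; the data generator, image size, and neighbor count are assumptions, not the paper's algorithm or samples.

    ```python
    import numpy as np
    from sklearn.manifold import Isomap

    # Map a set M of high-dimensional samples to a low-dimensional set A.
    rng = np.random.default_rng(3)
    n_samples = 200
    t = rng.uniform(0, 1, size=(n_samples, 2))                  # hidden low-dim parameters
    grid = np.linspace(0, 1, 32)
    xx, yy = np.meshgrid(grid, grid)
    # Synthetic "microstructures": 32x32 sinusoidal patterns flattened to vectors.
    M = np.array([np.sin(2*np.pi*(xx*a + yy*b)).ravel() for a, b in 1 + 3*t])

    embedding = Isomap(n_neighbors=10, n_components=2)          # isometric map M -> A
    A = embedding.fit_transform(M)                              # low-dimensional coordinates
    print("reduced representation shape:", A.shape)             # (200, 2)
    ```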

  18. Shock compression modeling of metallic single crystals: comparison of finite difference, steady wave, and analytical solutions

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Lloyd, Jeffrey T.; Clayton, John D.; Austin, Ryan A.; McDowell, David L.

    2015-07-10

    Background: The shock response of metallic single crystals can be captured using a micro-mechanical description of the thermoelastic-viscoplastic material response; however, using such a description within the context of traditional numerical methods may introduce physical artifacts. Advantages and disadvantages of complex material descriptions, in particular the viscoplastic response, must be framed within approximations introduced by numerical methods. Methods: Three methods of modeling the shock response of metallic single crystals are summarized: finite difference simulations, steady wave simulations, and algebraic solutions of the Rankine-Hugoniot jump conditions. For the former two numerical techniques, a dislocation density based framework describes the rate- and temperature-dependent shear strength on each slip system. For the latter analytical technique, a simple (two-parameter) rate- and temperature-independent linear hardening description is necessarily invoked to enable simultaneous solution of the governing equations. For all models, the same nonlinear thermoelastic energy potential incorporating elastic constants of up to order 3 is applied. Results: Solutions are compared for plate impact of highly symmetric orientations (all three methods) and low symmetry orientations (numerical methods only) of aluminum single crystals shocked to 5 GPa (weak shock regime) and 25 GPa (overdriven regime). Conclusions: For weak shocks, results of the two numerical methods are very similar, regardless of crystallographic orientation. For strong shocks, artificial viscosity affects the finite difference solution, and effects of transverse waves for the lower symmetry orientations not captured by the steady wave method become important. The analytical solution, which can only be applied to highly symmetric orientations, provides reasonable accuracy with regards to prediction of most variables in the final shocked state but, by construction, does not provide
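
    For reference, the Rankine-Hugoniot jump conditions solved by the analytical approach are the standard mass, momentum, and energy relations across a steady shock:

    ```latex
    \rho_0\,U_s = \rho_1\,(U_s - u_p),\qquad
    P_1 - P_0 = \rho_0\,U_s\,u_p,\qquad
    E_1 - E_0 = \tfrac{1}{2}\,(P_1 + P_0)\!\left(\frac{1}{\rho_0} - \frac{1}{\rho_1}\right),
    ```

    with U_s the shock velocity, u_p the particle velocity, and subscripts 0 and 1 denoting the unshocked and shocked states.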

  19. Methodology for modeling the devolatilization of refuse-derived fuel from thermogravimetric analysis of municipal solid waste components

    SciTech Connect (OSTI)

    Fritsky, K.J.; Miller, D.L.; Cernansky, N.P.

    1994-09-01

    A methodology was introduced for modeling the devolatilization characteristics of refuse-derived fuel (RDF) in terms of temperature-dependent weight loss. The basic premise of the methodology is that RDF is modeled as a combination of select municipal solid waste (MSW) components. Kinetic parameters are derived for each component from thermogravimetric analyzer (TGA) data measured at a specific set of conditions. These experimentally derived parameters, along with user-derived parameters, are input to the model equations for the purpose of calculating thermograms for the components. The component thermograms are summed to create a composite thermogram that is an estimate of the devolatilization of the as-modeled RDF. The methodology has several attractive features as a thermal analysis tool for waste fuels. 7 refs., 10 figs., 3 tabs.
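
    The sketch below illustrates the component-sum idea: integrate a first-order Arrhenius weight-loss law for each MSW component at a fixed heating rate, then weight and sum the component conversions into a composite thermogram. The kinetic parameters, mass fractions, and heating rate are placeholders, not values derived in the paper.

    ```python
    import numpy as np

    R = 8.314                      # gas constant, J/mol-K
    beta = 10.0 / 60.0             # heating rate, K/s (10 K/min)
    T = np.arange(400.0, 900.0, 1.0)

    components = {                 # name: (A [1/s], E [J/mol], mass fraction) -- placeholders
        "paper":    (1e10, 1.5e5, 0.5),
        "plastics": (1e12, 2.0e5, 0.3),
        "yard":     (1e9,  1.3e5, 0.2),
    }

    def conversion(A, E):
        """Integrate d(alpha)/dT = (A/beta) * exp(-E/RT) * (1 - alpha) over T."""
        alpha = np.zeros_like(T)
        for i in range(1, len(T)):
            dT = T[i] - T[i - 1]
            rate = (A / beta) * np.exp(-E / (R * T[i])) * (1.0 - alpha[i - 1])
            alpha[i] = min(1.0, alpha[i - 1] + rate * dT)
        return alpha

    # Composite residual-mass thermogram = 1 - weighted sum of component conversions.
    residual = 1.0 - sum(frac * conversion(A, E) for A, E, frac in components.values())
    print("composite residual mass fraction at 600, 700, 800 K:",
          np.round(residual[[200, 300, 400]], 3))
    ```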

  20. Comparison of limited measurements of the OTEC-1 plume with analytical-model predictions

    SciTech Connect (OSTI)

    Paddock, R.A.; Ditmars, J.D.

    1981-07-01

    Ocean Thermal Energy Conversion (OTEC) requires significant amounts of warm surface waters and cold deep waters for power production. Because these waters are returned to the ocean as effluents, their behavior may affect plant operation and impact the environment. The OTEC-1 facility tested 1-MWe heat exchangers aboard the vessel Ocean Energy Converter moored off the island of Hawaii. The warm and cold waters used by the OTEC-1 facility were combined prior to discharge from the vessel to create a mixed discharge condition. A limited field survey of the mixed discharge plume using fluorescent dye as a tracer was conducted on April 11, 1981, as part of the environmental studies at OTEC-1 coordinated by the Marine Sciences Group at Lawrence Berkeley Laboratory. Results of that survey were compared with analytical model predictions of plume behavior. Although the predictions were in general agreement with the results of the plume survey, inherent limitations in the field measurements precluded complete description of the plume or detailed evaluation of the models.

  1. A new time-dependent analytic model for radiation-induced photocurrent in finite 1D epitaxial diodes.

    SciTech Connect (OSTI)

    Verley, Jason C.; Axness, Carl L.; Hembree, Charles Edward; Keiter, Eric Richard; Kerr, Bert

    2012-04-01

    Photocurrent generated by ionizing radiation represents a threat to microelectronics in radiation environments. Circuit simulation tools such as SPICE [1] can be used to analyze these threats, and typically rely on compact models for individual electrical components such as transistors and diodes. Compact models consist of a handful of differential and/or algebraic equations, and are derived by making simplifying assumptions to any of the many semiconductor transport equations. Historically, many photocurrent compact models have suffered from accuracy issues due to the use of qualitative approximation, rather than mathematically correct solutions to the ambipolar diffusion equation. A practical consequence of this inaccuracy is that a given model calibration is trustworthy over only a narrow range of operating conditions. This report describes work to produce improved compact models for photocurrent. Specifically, an analytic model is developed for epitaxial diode structures that have a highly doped subcollector. The analytic model is compared with both numerical TCAD calculations, as well as the compact model described in reference [2]. The new analytic model compares well against TCAD over a wide range of operating conditions, and is shown to be superior to the compact model from reference [2].
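
    The governing equation referred to above is the ambipolar diffusion equation for the excess minority-carrier density, which in its generic one-dimensional form reads:

    ```latex
    \frac{\partial\,\Delta p}{\partial t}
    = D\,\frac{\partial^{2} \Delta p}{\partial x^{2}}
    - \frac{\Delta p}{\tau}
    + g(x, t),
    ```

    where D is the ambipolar diffusivity, τ the minority-carrier lifetime, and g(x, t) the radiation-induced generation rate; the photocurrent then follows from the carrier flux at the junction boundary.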

  2. Ion heating and energy partition at the heliospheric termination shock: hybrid simulations and analytical model

    SciTech Connect (OSTI)

    Gary, S Peter; Winske, Dan; Wu, Pin; Schwadron, N A; Lee, M

    2009-01-01

    The Los Alamos hybrid simulation code is used to examine heating and the partition of dissipation energy at the perpendicular heliospheric termination shock in the presence of pickup ions. The simulations are one-dimensional in space but three-dimensional in field and velocity components, and are carried out for a range of values of pickup ion relative density. Results from the simulations show that because the solar wind ions are relatively cold upstream, the temperature of these ions is raised by a relatively larger factor than the temperature of the pickup ions. An analytic model for energy partition is developed on the basis of the Rankine-Hugoniot relations and a polytropic energy equation. The polytropic index {gamma} used in the Rankine-Hugoniot relations is varied to improve agreement between the model and the simulations concerning the fraction of downstream heating in the pickup ions as well as the compression ratio at the shock. When the pickup ion density is less than 20%, the polytropic index is about 5/3, whereas for pickup ion densities greater than 20%, the polytropic index tends toward 2.2, suggesting a fundamental change in the character of the shock, as seen in the simulations, when the pickup ion density is large. The model and the simulations both indicate for the upstream parameters chosen for Voyager 2 conditions that the pickup ion density is about 25% and the pickup ions gain the larger share (approximately 90%) of the downstream thermal pressure, consistent with Voyager 2 observations near the shock.
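
    The Rankine-Hugoniot compression ratio whose polytropic index γ is tuned in the model above takes the standard form (hydrodynamic version shown; for a perpendicular MHD shock the upstream Mach number is the fast magnetosonic one):

    ```latex
    r \;=\; \frac{\rho_2}{\rho_1} \;=\; \frac{(\gamma + 1)\,M_1^{2}}{(\gamma - 1)\,M_1^{2} + 2},
    ```

    so that raising γ from 5/3 toward 2.2 lowers the strong-shock limit (γ+1)/(γ-1) from 4 to about 2.7.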

  3. Analytical modeling of a hydraulically-compensated compressed-air energy-storage system

    SciTech Connect (OSTI)

    McMonagle, C.A.; Rowe, D.S.

    1982-12-01

    A computer program was developed to calculate the dynamic response of a hydraulically-compensated compressed air energy storage (CAES) system, including the compressor, air pipe, cavern, and hydraulic compensation pipe. The model is theoretically based on the two-fluid model in which the dynamics of each phase are represented by its set of conservation equations for mass and momentum. The conservation equations define the space and time distribution of pressure, void fraction, air saturation, and phase velocities. The phases are coupled by two interface equations. The first defines the rate of generation (or dissolution) of gaseous air in water and can include the effects of supersaturation. The second defines the frictional shear coupling (drag) between the gaseous air and water as they move relative to each other. The relative motion of the air and water is, therefore, calculated and not specified by a slip or drift-velocity correlation. The total CAES system is represented by a nodal arrangement. The conservation equations are written for each nodal volume and are solved numerically. System boundary conditions include the air flow rate, atmospheric pressure at the top of the compensation pipe, and air saturation in the reservoir. Initial conditions are selected for velocity and air saturation. Uniform and constant temperature (60°F) is assumed. The analytical model was used to investigate the dynamic response of a proposed system. Investigative calculations considered high and low water levels, and a variety of charging and operating conditions. For all cases investigated, the cavern response to air charging was a damped oscillation of pressure and flow. Detailed results are presented. These calculations indicate that the Champagne Effect is unlikely to cause blowout for a properly designed CAES system.

  4. Human Factors Engineering Program Review Model (NUREG-0711), Revision 3: Update Methodology and Key Revisions

    SciTech Connect (OSTI)

    O'Hara, J. M.; Higgins, J.; Fleger, S.

    2012-07-22

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004, and 2002, respectively. The NRC is committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool. To this end, the NRC is updating its guidance to stay current with recent research on human performance, advances in HFE methods and tools, and new technology being employed in plant and control room design. NUREG-0711 is the first document to be addressed. We present the methodology used to update NUREG-0711 and summarize the main changes made. Finally, we discuss the current status of the update program and the future plans.

  5. ANALYTICAL MODELS OF EXOPLANETARY ATMOSPHERES. II. RADIATIVE TRANSFER VIA THE TWO-STREAM APPROXIMATION

    SciTech Connect (OSTI)

    Heng, Kevin; Mendonça, João M.; Lee, Jae-Min

    2014-11-01

    We present a comprehensive analytical study of radiative transfer using the method of moments and include the effects of non-isotropic scattering in the coherent limit. Within this unified formalism, we derive the governing equations and solutions describing two-stream radiative transfer (which approximates the passage of radiation as a pair of outgoing and incoming fluxes), flux-limited diffusion (which describes radiative transfer in the deep interior), and solutions for the temperature-pressure profiles. Generally, the problem is mathematically underdetermined unless a set of closures (Eddington coefficients) is specified. We demonstrate that the hemispheric (or hemi-isotropic) closure naturally derives from the radiative transfer equation if energy conservation is obeyed, while the Eddington closure produces spurious enhancements of both reflected light and thermal emission. We concoct recipes for implementing two-stream radiative transfer in stand-alone numerical calculations and general circulation models. We use our two-stream solutions to construct toy models of the runaway greenhouse effect. We present a new solution for temperature-pressure profiles with a non-constant optical opacity and elucidate the effects of non-isotropic scattering in the optical and infrared. We derive generalized expressions for the spherical and Bond albedos and the photon deposition depth. We demonstrate that the value of the optical depth corresponding to the photosphere is not always 2/3 (Milne's solution) and depends on a combination of stellar irradiation, internal heat, and the properties of scattering in both the optical and infrared. Finally, we derive generalized expressions for the total, net, outgoing, and incoming fluxes in the convective regime.

  6. ANALYTICAL MODELS OF EXOPLANETARY ATMOSPHERES. I. ATMOSPHERIC DYNAMICS VIA THE SHALLOW WATER SYSTEM

    SciTech Connect (OSTI)

    Heng, Kevin; Workman, Jared E-mail: jworkman@coloradomesa.edu

    2014-08-01

    Within the context of exoplanetary atmospheres, we present a comprehensive linear analysis of forced, damped, magnetized shallow water systems, exploring the effects of dimensionality, geometry (Cartesian, pseudo-spherical, and spherical), rotation, magnetic tension, and hydrodynamic and magnetic sources of friction. Across a broad range of conditions, we find that the key governing equation for atmospheres and quantum harmonic oscillators are identical, even when forcing (stellar irradiation), sources of friction (molecular viscosity, Rayleigh drag, and magnetic drag), and magnetic tension are included. The global atmospheric structure is largely controlled by a single key parameter that involves the Rossby and Prandtl numbers. This near-universality breaks down when either molecular viscosity or magnetic drag acts non-uniformly across latitude or a poloidal magnetic field is present, suggesting that these effects will introduce qualitative changes to the familiar chevron-shaped feature witnessed in simulations of atmospheric circulation. We also find that hydrodynamic and magnetic sources of friction have dissimilar phase signatures and affect the flow in fundamentally different ways, implying that using Rayleigh drag to mimic magnetic drag is inaccurate. We exhaustively lay down the theoretical formalism (dispersion relations, governing equations, and time-dependent wave solutions) for a broad suite of models. In all situations, we derive the steady state of an atmosphere, which is relevant to interpreting infrared phase and eclipse maps of exoplanetary atmospheres. We elucidate a pinching effect that confines the atmospheric structure to be near the equator. Our suite of analytical models may be used to develop decisively physical intuition and as a reference point for three-dimensional magnetohydrodynamic simulations of atmospheric circulation.

  7. Analytic model of electron self-injection in a plasma wakefield accelerator in the strongly nonlinear bubble regime

    SciTech Connect (OSTI)

    Yi, S. A.; Khudik, V.; Siemon, C.; Shvets, G.

    2012-12-21

    Self-injection of background electrons in plasma wakefield accelerators in the highly nonlinear bubble regime is analyzed using particle-in-cell and semi-analytic modeling. It is shown that the return current in the bubble sheath layer is crucial for accurate determination of the trapped particle trajectories.

  8. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    SciTech Connect (OSTI)

    Granderson, Jessica; Price, Phillip N.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
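
    The five baseline models evaluated in the paper are not specified in the abstract. The sketch below illustrates only the train-then-predict workflow and two accuracy metrics commonly used in M&V guidance, normalized mean bias error (NMBE) and CV(RMSE), using synthetic data and a generic change-point-style regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily data: outdoor temperature and a weekday flag drive the load.
days = 730
t = np.arange(days)
temp = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, days)
weekday = (t % 7) < 5
kwh = 800 + 25 * np.maximum(temp - 18, 0) + 150 * weekday + rng.normal(0, 40, days)

# Fit on year 1 (the "training period"), predict year 2 (the "prediction period").
train, test = t < 365, t >= 365
X = np.column_stack([np.ones(days), np.maximum(temp - 18, 0), weekday])
beta, *_ = np.linalg.lstsq(X[train], kwh[train], rcond=None)
pred = X[test] @ beta

# Accuracy metrics of the kind recommended in M&V guidance such as ASHRAE Guideline 14.
resid = kwh[test] - pred
nmbe = 100 * resid.sum() / (len(resid) * kwh[test].mean())
cvrmse = 100 * np.sqrt((resid ** 2).mean()) / kwh[test].mean()
print(f"NMBE = {nmbe:.2f}%   CV(RMSE) = {cvrmse:.2f}%")
```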

  9. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    SciTech Connect (OSTI)

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: shake table tests of the precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general, the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  10. An analytical model for studying effects of gas release from a failed fuel pin of a liquid-metal reactor

    SciTech Connect (OSTI)

    Shin, Y.W.

    1993-01-01

    An analytical model for describing the dynamics of a gas bubble in the liquid sodium of a liquid-metal reactor as the result of failed fuel pins is discussed. A model to describe the coupled response of the liquid sodium surrounding the gas bubble is also discussed. The analysis method is programmed in a computer code and used to analyze some available experimental data; the results are discussed.

  11. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    SciTech Connect (OSTI)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  12. HELIOSPHERIC PROPAGATION OF CORONAL MASS EJECTIONS: COMPARISON OF NUMERICAL WSA-ENLIL+CONE MODEL AND ANALYTICAL DRAG-BASED MODEL

    SciTech Connect (OSTI)

    Vršnak, B.; Žic, T.; Dumbović, M.; Temmer, M.; Möstl, C.; Veronig, A. M.; Taktakishvili, A.; Mays, M. L.; Odstrčil, D. E-mail: tzic@geof.hr E-mail: manuela.temmer@uni-graz.at E-mail: astrid.veronig@uni-graz.at E-mail: m.leila.mays@nasa.gov

    2014-08-01

    Real-time forecasting of the arrival of coronal mass ejections (CMEs) at Earth, based on remote solar observations, is one of the central issues of space-weather research. In this paper, we compare arrival-time predictions calculated applying the numerical "WSA-ENLIL+Cone model" and the analytical "drag-based model" (DBM). Both models use coronagraphic observations of CMEs as input data, thus providing an early space-weather forecast two to four days before the arrival of the disturbance at the Earth, depending on the CME speed. It is shown that both methods give very similar results if the drag parameter γ = 0.1 is used in DBM in combination with a background solar-wind speed of w = 400 km s⁻¹. For this combination, the mean difference between arrival times calculated by ENLIL and DBM is 0.09 ± 9.0 hr, with a mean absolute difference of 7.1 hr. Comparing the observed arrivals (O) with the calculated ones (C) gives O - C = 0.3 ± 16.9 hr for ENLIL and, analogously, O - C = +1.1 ± 19.1 hr for DBM. Applying γ = 0.2 with w = 450 km s⁻¹ in DBM, one finds O - C = 1.7 ± 18.3 hr, with a mean absolute difference of 14.8 hr, which is similar to that for ENLIL, 14.1 hr. Finally, we demonstrate that the prediction accuracy significantly degrades with increasing solar activity.
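
    The DBM referred to above integrates a simple kinematic law: the CME relaxes toward the ambient solar-wind speed according to dv/dt = -γ(v - w)|v - w|. The sketch below integrates that equation numerically for an illustrative event; the initial distance and speed are hypothetical, and γ is interpreted in the units (10⁻⁷ km⁻¹) customary in the DBM literature, which the abstract does not state.

```python
from scipy.integrate import solve_ivp

AU_KM = 1.496e8
R0_KM = 20.0 * 6.957e5          # hypothetical starting distance: 20 solar radii
GAMMA = 0.1e-7                  # drag parameter, assumed units of km^-1
W = 400.0                       # ambient solar-wind speed, km/s (as quoted above)
V0 = 1000.0                     # hypothetical initial CME speed, km/s

def rhs(t, y):
    r, v = y
    return [v, -GAMMA * (v - W) * abs(v - W)]    # drag-based equation of motion

reach_1au = lambda t, y: y[0] - AU_KM            # stop the integration at 1 AU
reach_1au.terminal = True
sol = solve_ivp(rhs, (0.0, 10 * 86400.0), [R0_KM, V0],
                events=reach_1au, max_step=600.0)
print(f"transit time ~ {sol.t_events[0][0] / 3600:.1f} h, "
      f"arrival speed ~ {sol.y[1, -1]:.0f} km/s")
```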

  13. SociAL Sensor Analytics: Measuring Phenomenology at Scale

    SciTech Connect (OSTI)

    Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.; McKenzie, Taylor K.

    2013-06-04

    The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).
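
    The specific time-series models and topic-clustering algorithms behind SALSA are not described in the abstract. As a stand-in, the sketch below shows the kind of baseline-then-detect logic a social "sensor" pipeline can use: flag hours whose post counts deviate strongly from a robust rolling baseline (synthetic data, hypothetical threshold).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly counts of posts matching a topic, with a synthetic burst injected.
hours = 24 * 28
counts = rng.poisson(50, hours).astype(float)
counts[500:506] += 400

window, flagged = 24 * 7, []                 # one-week rolling baseline
for i in range(window, hours):
    hist = counts[i - window:i]
    med = np.median(hist)
    mad = np.median(np.abs(hist - med)) + 1e-9
    z = 0.6745 * (counts[i] - med) / mad     # robust z-score against the baseline
    if z > 6:                                # hypothetical detection threshold
        flagged.append(i)
print("hours flagged as candidate events:", flagged)
```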

  14. Revenue Requirements Modeling System (RRMS) documentation. Volume I. Methodology description and user's guide. Appendix A: model abstract; Appendix B: technical appendix; Appendix C: sample input and output. [Compustat

    SciTech Connect (OSTI)

    Not Available

    1986-03-01

    The Revenue Requirements Modeling System (RRMS) is a utility specific financial modeling system used by the Energy Information Administration (EIA) to evaluate the impact on electric utilities of changes in the regulatory, economic, and tax environments. Included in the RRMS is a power plant life-cycle revenue requirements model designed to assess the comparative economic advantage of alternative generating plant. This report is Volume I of a 2-volume set and provides a methodology description and user's guide, a model abstract and technical appendix, and sample input and output for the models. Volume II provides an operator's manual and a program maintenance guide.

  15. World Energy Projection System Plus Model Documentation: Refinery Model

    Reports and Publications (EIA)

    2016-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  16. World Energy Projection System Plus Model Documentation: District Heat Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  17. World Energy Projection System Plus Model Documentation: Coal Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  18. World Energy Projection System Plus Model Documentation: Commercial Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Commercial Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  19. World Energy Projection System Plus Model Documentation: Natural Gas Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  20. World Energy Projection System Plus Model Documentation: Main Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  1. World Energy Projection System Plus Model Documentation: Industrial Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  2. World Energy Projection System Plus Model Documentation: Refinery Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  3. International Natural Gas Model 2011, Model Documentation Report

    Reports and Publications (EIA)

    2013-01-01

    This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  4. World Energy Projection System Plus Model Documentation: World Electricity Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  5. World Energy Projection System Plus Model Documentation: Transportation Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  6. World Energy Projection System Plus Model Documentation: Greenhouse Gases Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  7. World Energy Projection System Plus Model Documentation: Residential Model

    Reports and Publications (EIA)

    2011-01-01

    This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  8. AN ANALYTICAL MODEL OF INTERSTELLAR GAS IN THE HELIOSPHERE TAILORED TO INTERSTELLAR BOUNDARY EXPLORER OBSERVATIONS

    SciTech Connect (OSTI)

    Lee, Martin A.; Kucharek, Harald; Moebius, Eberhard; Wu Xian [Space Science Center and Department of Physics, University of New Hampshire, Durham, NH 03824 (United States); Bzowski, Maciej [Space Research Centre, Polish Academy of Sciences, 00-716 Warsaw (Poland); McComas, David, E-mail: marty.lee@unh.edu [Engineering and Space Science Division, Southwest Research Institute, San Antonio, TX 78228 (United States)

    2012-02-01

    The stationary distribution of interstellar neutral gas in the heliosphere subject to solar gravity, solar radiation pressure, photoionization, and charge exchange is investigated analytically assuming ionization rates and radiation pressure that are proportional to R{sup -2}, where R is the heliocentric radius. The collisionless hyperbolic trajectories of the individual atoms including ionization losses are combined with Liouville's Theorem to construct the heliospheric phase-space distribution function of an interstellar gas species in the solar reference frame under the assumption that the distribution is a drifting Maxwellian at large distances from the Sun. The distribution is transformed to the Earth (essentially Interstellar Boundary Explorer (IBEX)) frame as a function of solar longitude. The expression is then tailored to the latitudinal scan of IBEX as a function of longitude using the fact that IBEX detects each atom close to perihelion in its hyperbolic orbit. The distribution is further adapted to IBEX by integrating the differential intensity over the entrance aperture solid angle of the IBEX-Lo collimator, and over energy to predict the IBEX count rate of helium. The major features of the predicted count rate are described, including a peak in longitude, a peak in latitude at each longitude, and the widths of the major peak in both latitude and longitude. Analytical formulae for these features are derived for comparison with IBEX observations in order to determine the temperature and bulk velocity of the gas in interstellar space. Based in part on these formulae, the results for helium are presented in the companion paper by Moebius et al.

  9. An Update of the Analytical Groundwater Modeling to Assess Water Resource Impacts at the Afton Solar Energy Zone

    SciTech Connect (OSTI)

    Quinn, John J.; Greer, Christopher B.; Carr, Adrianne E.

    2014-10-01

    The purpose of this study is to update a one-dimensional analytical groundwater flow model to examine the influence of potential groundwater withdrawal in support of utility-scale solar energy development at the Afton Solar Energy Zone (SEZ) as a part of the Bureau of Land Management’s (BLM’s) Solar Energy Program. This report describes the modeling for assessing the drawdown associated with SEZ groundwater pumping rates for a 20-year duration considering three categories of water demand (high, medium, and low) based on technology-specific considerations. The 2012 modeling effort published in the Final Programmatic Environmental Impact Statement for Solar Energy Development in Six Southwestern States (Solar PEIS; BLM and DOE 2012) has been refined based on additional information described below in an expanded hydrogeologic discussion.

  10. Numerical modeling of the groundwater contaminant transport for the Lake Karachai Area: The methodological approach and the basic two- dimensional regional model

    SciTech Connect (OSTI)

    Petrov, A.V.; Samsonova, L.M.; Vasil'kova, N.A.; Zinin, A.I.; Zinina, G.A.

    1994-06-01

    Methodological aspects of the numerical modeling of groundwater contaminant transport for the Lake Karachay area are discussed. The main features of the problem are the highly non-uniform aquifer in the fractured rock massif, the high density of the waste solutions, and the large volume of input data, both for the aquifer parameters (numerous pump tests) and for the observed process histories (long-term observations from the monitoring well grid). The construction of the two-dimensional regional model is described; this model serves as the basis for subsequent full three-dimensional modeling in sub-areas of interest. Original, powerful mathematical apparatus and computer codes for finite-difference numerical modeling are used.

  11. Analytical model for CMB temperature angular power spectrum from cosmic (super-)strings

    SciTech Connect (OSTI)

    Yamauchi, Daisuke; Yoo, Chul-Moon; Sasaki, Misao; Takahashi, Keitaro; Sendouda, Yuuiti

    2010-09-15

    We present a new analytical method to calculate the small-angle cosmic microwave background (CMB) temperature angular power spectrum due to cosmic (super-)string segments. In particular, using our method, we clarify the dependence on the intercommuting probability P. We find that the power spectrum is dominated by Poisson-distributed string segments. The power spectrum for a general value of P has a plateau on large angular scales and shows a power-law decrease on small angular scales. The resulting spectrum in the case of conventional cosmic strings is in very good agreement with the numerical result obtained by Fraisse et al. We then estimate the upper bound on the dimensionless tension of the string for various values of P by assuming that the fraction of the CMB power spectrum due to cosmic (super-)strings is less than ten percent at various angular scales up to l=2000. We find that the amplitude of the spectrum increases as the intercommuting probability decreases. As a consequence, strings with smaller intercommuting probabilities are found to be more tightly constrained.

  12. An analytic model for the response of a CZT detector in diagnostic energy dispersive x-ray spectroscopy

    SciTech Connect (OSTI)

    LeClair, Robert J.; Wang Yinkun; Zhao Peiying; Boileau, Michel; Wang, Lilie; Fleurot, Fabrice [Department of Physics and Astronomy, Laurentian University, 935 Ramsey Lake Road, Sudbury, Ontario P3E 2C6 (Canada) and Biomolecular Sciences Program, Laurentian University, 935 Ramsey Lake Road, Sudbury, Ontario P3E 2C6 (Canada); Department of Physics and Astronomy, Laurentian University, 935 Ramsey Lake Road, Sudbury, Ontario P3E 2C6 (Canada)

    2006-05-15

    A CdZnTe detector (CZTD) can be very useful for measuring diagnostic x-ray spectra. The semiconductor detector does, however, exhibit poor hole transport properties and fluorescence generation upon atomic de-excitations. This article describes an analytic model to characterize these two phenomena that occur when a CZTD is exposed to diagnostic x rays. The analytical detector response functions compare well with those obtained via Monte Carlo calculations. The response functions were applied to 50, 80, and 110 kV x-ray spectra. Two 50 kV spectra were measured; one with no filtration and the other with 1.35 mm Al filtration. The unfiltered spectrum was numerically filtered with 1.35 mm of Al in order to see whether the recovered spectrum resembled the filtered spectrum actually measured. A deviation curve was obtained by subtracting one curve from the other on an energy bin-by-bin basis. The deviation pattern fluctuated around the zero line when corrections were applied to both spectra. Significant deviations from zero towards the lower energies were observed when the uncorrected spectra were used. Besides visual observations, the exposure obtained using the numerically attenuated unfiltered beam was compared to the exposure calculated with the actual filtered beam. The percent differences were 0.8% when corrections were applied and 25% for no corrections. The model can be used to correct diagnostic x-ray spectra measured with a CdZnTe detector.

  13. Analysis of Wind Turbine Simulation Models: Assessment of Simplified versus Complete Methodologies: Preprint

    SciTech Connect (OSTI)

    Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.; Fuentes-Moreno, J. A.; Muljadi, Eduard; Gomez-Lazaro, E.

    2015-09-14

    This paper presents the current status of simplified wind turbine models used for power system stability analysis. It is based on the ongoing development of IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, is focused on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models to conduct power system simulations.

  14. On the Inclusion of Energy-Shifting Demand Response in Production Cost Models: Methodology and a Case Study

    SciTech Connect (OSTI)

    O'Connell, Niamh; Hale, Elaine; Doebber, Ian; Jorgenson, Jennie

    2015-07-20

    In the context of future power system requirements for additional flexibility, demand response (DR) is an attractive potential resource. Its proponents widely laud its prospective benefits, which include enabling higher penetrations of variable renewable generation at lower cost than alternative storage technologies, and improving economic efficiency. In practice, DR from the commercial and residential sectors is largely an emerging, not a mature, resource, and its actual costs and benefits need to be studied to determine promising combinations of physical DR resource, enabling controls and communications, power system characteristics, regulatory environments, market structures, and business models. The work described in this report focuses on the enablement of such analysis from the production cost modeling perspective. In particular, we contribute a bottom-up methodology for modeling load-shifting DR in production cost models. The resulting model is sufficiently detailed to reflect the physical characteristics and constraints of the underlying flexible load, and includes the possibility of capturing diurnal and seasonal variations in the resource. Nonetheless, the model is of low complexity and thus suitable for inclusion in conventional unit commitment and market clearing algorithms. The ability to simulate DR as an operational resource on a power system over a year facilitates an assessment of its time-varying value to the power system.
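
    The report's bottom-up formulation is not reproduced in the abstract; the sketch below reduces the core idea it describes (an energy-neutral load-shifting resource inside a cost-minimizing dispatch) to a single flexible load facing hourly prices, with all numbers hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical hourly prices ($/MWh) and a flexible load that can shift energy in time.
price = np.array([20, 18, 15, 14, 16, 25, 40, 55, 50, 45, 35, 30,
                  28, 27, 26, 30, 45, 60, 65, 55, 40, 30, 25, 22], dtype=float)
base_load = np.full(24, 100.0)     # MW
flex = 20.0                        # MW that can be added to or shed from any hour

# Decision variable shift[h]; energy neutrality forces the shifted energy to net to zero.
# The base cost (price @ base_load) is fixed, so minimizing price @ shift is sufficient.
res = linprog(c=price,
              A_eq=np.ones((1, 24)), b_eq=[0.0],
              bounds=[(-flex, flex)] * 24, method="highs")
shift = res.x
print(f"daily savings: ${-(price @ shift):,.0f}")
print("reshaped load (MW):", np.round(base_load + shift, 1))
```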

  15. GREET 1.0 -- Transportation fuel cycles model: Methodology and use

    SciTech Connect (OSTI)

    Wang, M.Q.

    1996-06-01

    This report documents the development and use of the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model. The model, developed in a spreadsheet format, estimates the full fuel-cycle emissions and energy use associated with various transportation fuels for light-duty vehicles. The model calculates fuel-cycle emissions of five criteria pollutants (volatile organic compounds, CO, NOx, SOx, and particulate matter measuring 10 microns or less) and three greenhouse gases (carbon dioxide, methane, and nitrous oxide). The model also calculates the total fuel-cycle energy consumption, fossil fuel consumption, and petroleum consumption using various transportation fuels. The GREET model includes 17 fuel cycles: petroleum to conventional gasoline, reformulated gasoline, clean diesel, liquefied petroleum gas, and electricity via residual oil; natural gas to compressed natural gas, liquefied petroleum gas, methanol, hydrogen, and electricity; coal to electricity; uranium to electricity; renewable energy (hydropower, solar energy, and wind) to electricity; corn, woody biomass, and herbaceous biomass to ethanol; and landfill gases to methanol. This report presents fuel-cycle energy use and emissions for a 2000 model-year car powered by each of the fuels that are produced from the primary energy sources considered in the study.
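
    GREET itself is a detailed spreadsheet model; the sketch below only illustrates the accounting structure the abstract describes, summing upstream (well-to-pump) and vehicle-operation emissions per unit of fuel energy and scaling by vehicle energy use. The factor values are hypothetical placeholders, not GREET results.

```python
# Hypothetical well-to-pump (wtp) and pump-to-wheels (ptw) factors in g CO2-eq per MJ
# of fuel, and vehicle energy use in MJ per km; real GREET values come from detailed
# pathway data and cover more pollutants than this.
pathways = {
    "conventional gasoline":  {"wtp": 18.0, "ptw": 72.0, "mj_per_km": 2.6},
    "compressed natural gas": {"wtp": 20.0, "ptw": 56.0, "mj_per_km": 2.7},
    "corn ethanol (E85)":     {"wtp": 45.0, "ptw": 5.0,  "mj_per_km": 2.8},
}

for name, p in pathways.items():
    full_cycle = (p["wtp"] + p["ptw"]) * p["mj_per_km"]   # g CO2-eq per km, full fuel cycle
    print(f"{name:24s} {full_cycle:6.0f} g CO2-eq/km")
```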

  16. A new analytic-adaptive model for EGS assessment, development and management support

    Broader source: Energy.gov [DOE]

    This project will develop an in-depth model of EGS systems that will allow engineers, practitioners, and researchers to more accurately predict how new fluid technologies would work in a reservoir.

  17. A New Analytic-Adaptive Model for EGS Assessment, Development and Management Support

    SciTech Connect (OSTI)

    Danko, George L

    2014-05-29

    To increase understanding of the energy extraction capacity of Enhanced Geothermal Systems (EGS), a numerical model development and application project was completed. The general objective of the project is to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks were completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly reports, and two annual reports to DOE. Two US patents were also issued during the project period, with one patent application originating prior to the start of the project. The "Multiphase Physical Transport Modeling Method and Modeling System" (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The "Geothermal Energy Extraction System and Method" invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help in analyzing and designing new, efficient EGS systems.

  18. SU-C-9A-04: Alternative Analytic Solution to the Paralyzable Detector Model to Calculate Deadtime and Deadtime Loss

    SciTech Connect (OSTI)

    Siman, W; Kappadath, S

    2014-06-01

    Purpose: Some common methods to solve for deadtime are (1) the dual-source method, which assumes two equal activities; (2) model fitting, which requires multiple acquisitions as the source decays; and (3) the lossless model, which assumes no deadtime loss at low count rates. We propose a new analytic alternative solution to calculate deadtime for a paralyzable gamma camera. Methods: Deadtime T can be calculated analytically from two distinct observed count rates M1 and M2 when the ratio of the true count rates alpha=N2/N1 is known. Alpha can be measured as a ratio of two measured activities using dose calibrators or via radioactive decay. Knowledge of alpha creates a system with 2 equations and 2 unknowns, i.e., T and N1. To verify the validity of the proposed method, projections of a non-uniform phantom (4 GBq 99mTc) were acquired using a Siemens Symbia S multiple times over 48 hours. Each projection has >100 kcts. The deadtime for each projection was calculated by fitting the data to a paralyzable model and also by using the proposed 2-acquisition method. The two estimates of deadtime were compared using the Bland-Altman method. In addition, the dependency of uncertainty in T on uncertainty in alpha was investigated for several imaging conditions. Results: The results strongly suggest that the 2-acquisition method is equivalent to the fitting method. The Bland-Altman analysis yielded a mean difference in deadtime estimate of ~0.076 µs (95% CI: -0.049 µs, 0.103 µs) between the 2-acquisition and model fitting methods. The 95% limits of agreement were calculated to be -0.104 to 0.256 µs. The uncertainty in deadtime calculated using the proposed method is highly dependent on the uncertainty in the ratio alpha. Conclusion: The 2-acquisition method was found to be equivalent to the parameter fitting method. The proposed method offers a simpler and more practical way to analytically solve for a paralyzable detector deadtime, especially during physics testing.
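
    The abstract names the two-acquisition idea but not its closed form. Under the standard paralyzable relation m = n·exp(-n·T), knowing the true-rate ratio alpha = n2/n1 does allow an explicit solution, sketched below; whether this matches the authors' derivation cannot be confirmed from the abstract alone.

```python
import math

def deadtime_from_two_rates(m1, m2, alpha):
    """Solve the paralyzable model m = n*exp(-n*tau) for tau (and n1), given two
    observed count rates m1, m2 whose true-rate ratio alpha = n2/n1 is known."""
    x = math.log(alpha * m1 / m2) / (alpha - 1.0)   # x = n1*tau, from the ratio m2/m1
    n1 = m1 * math.exp(x)                           # invert m1 = n1*exp(-n1*tau)
    return x / n1, n1

# Round-trip check with synthetic numbers: tau = 0.5 us, n1 = 200 kcps, alpha = 2.
tau_true, n1_true = 0.5e-6, 2.0e5
m1 = n1_true * math.exp(-n1_true * tau_true)
m2 = 2.0 * n1_true * math.exp(-2.0 * n1_true * tau_true)
print(deadtime_from_two_rates(m1, m2, alpha=2.0))   # ~ (5e-07, 200000.0)
```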

  19. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    SciTech Connect (OSTI)

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.
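
    A full BMA implementation with informative priors on backgrounds and gases is beyond what the abstract specifies; the sketch below uses the common BIC approximation to posterior model weights over all subsets of a small synthetic signature library, which conveys the flavor of model averaging for gas selection without reproducing the report's method.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical library of candidate gas signatures (columns) and an observed
# spectrum containing gases 0 and 2 plus noise; not real absorption data.
n_chan, n_gas = 200, 5
S = rng.normal(0.0, 1.0, (n_chan, n_gas))
y = 2.0 * S[:, 0] + 1.5 * S[:, 2] + rng.normal(0.0, 0.5, n_chan)

def bic(subset):
    X = S[:, list(subset)]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n_chan * np.log(rss / n_chan) + len(subset) * np.log(n_chan)

subsets = [s for r in range(1, n_gas + 1)
           for s in itertools.combinations(range(n_gas), r)]
bics = np.array([bic(s) for s in subsets])
w = np.exp(-0.5 * (bics - bics.min()))      # BIC-based approximate model weights
w /= w.sum()
incl = [sum(wi for wi, s in zip(w, subsets) if g in s) for g in range(n_gas)]
print("approximate posterior inclusion probability per gas:", np.round(incl, 3))
```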

  20. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model while estimating impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time-series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
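
    The estimated models themselves are not given in the abstract; the toy loop below illustrates only the behavioral ingredient it mentions (an agent's intention formed from a fixed attitude plus a social norm diffusing over a network), with made-up weights, network, and threshold.

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents, steps = 500, 50
attitude = rng.uniform(0, 1, n_agents)               # fixed individual attitude
behavior = rng.uniform(0, 1, n_agents) < 0.2         # initial adopters of the behavior
neighbors = rng.integers(0, n_agents, (n_agents, 8)) # each agent observes 8 random peers

w_att, w_norm, threshold = 0.5, 0.5, 0.45            # made-up weights and threshold
for _ in range(steps):
    norm = behavior[neighbors].mean(axis=1)          # perceived social norm
    intention = w_att * attitude + w_norm * norm     # toy Theory-of-Planned-Behavior rule
    behavior = intention > threshold
print("final fraction adopting the behavior:", behavior.mean())
```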

  1. Data Analytics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    MANTISSA (Massive Acceleration of New Techniques In Science with Scalable Algorithms): Scalable statistics and machine learning algorithms are essential for extracting insights from Big Data. Our interdisciplinary team is trying to address a number of challenging analysis problems from a number of science domains at Lawrence Berkeley National

  2. INTERLINE 5. 0 -- An expanded railroad routing model: Program description, methodology, and revised user's manual

    SciTech Connect (OSTI)

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M. (Transportation Center)

    1993-03-01

    A rail routing model, INTERLINE, has been developed at the Oak Ridge National Laboratory to investigate potential routes for transporting radioactive materials. In Version 5.0, the INTERLINE routing algorithms have been enhanced to include the ability to predict alternative routes, barge routes, and population statistics for any route. The INTERLINE railroad network is essentially a computerized rail atlas describing the US railroad system. All rail lines, with the exception of industrial spurs, are included in the network. Inland waterways and deep water routes, along with their interchange points with the US railroad system, are also included. The network contains over 15,000 rail and barge segments (links) and over 13,000 stations, interchange points, ports, and other locations (nodes). The INTERLINE model has been converted to operate on an IBM-compatible personal computer. At least a 286 computer with a hard disk containing approximately 6 MB of free space is recommended. Enhanced program performance will be obtained by using a random-access memory drive on a 386 or 486 computer.
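
    INTERLINE's actual routing criteria (mainline classification, interchange penalties, alternative-route logic) are richer than a plain shortest-path search, but shortest paths over a link/node network are the computational core of any such routing model. A generic sketch on a made-up mini-network:

```python
import heapq

def shortest_route(graph, origin, dest):
    """Dijkstra over a dict {node: [(neighbor, miles), ...]}."""
    dist, prev = {origin: 0.0}, {}
    heap = [(0.0, origin)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dest:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dest], dest
    while node != origin:
        node = prev[node]
        path.append(node)
    return dist[dest], path[::-1]

# Hypothetical mini rail network; nodes and link mileages are made up.
net = {"A": [("B", 120), ("C", 90)], "B": [("D", 60)],
       "C": [("B", 40), ("D", 150)], "D": []}
print(shortest_route(net, "A", "D"))   # (180.0, ['A', 'B', 'D'])
```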

  3. Modeling threat assessments of water supply systems using markov latent effects methodology.

    SciTech Connect (OSTI)

    Silva, Consuelo Juanita

    2006-12-01

    Recent amendments to the Safe Drinking Water Act emphasize efforts toward safeguarding our nation's water supplies against attack and contamination. Specifically, the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 established requirements for each community water system serving more than 3300 people to conduct an assessment of the vulnerability of its system to a terrorist attack or other intentional acts. Integral to evaluating system vulnerability is the threat assessment, which is the process by which the credibility of a threat is quantified. Unfortunately, full probabilistic assessment is generally not feasible, as there is insufficient experience and/or data to quantify the associated probabilities. For this reason, an alternative approach is proposed based on Markov Latent Effects (MLE) modeling, which provides a framework for quantifying imprecise subjective metrics through possibilistic or fuzzy mathematics. Here, an MLE model for water systems is developed and demonstrated to determine threat assessments for different scenarios identified by the assailant, asset, and means. Scenario assailants include terrorists, insiders, and vandals. Assets include a water treatment plant, water storage tank, node, pipeline, well, and a pump station. Means used in attacks include contamination (onsite chemicals, biological and chemical), explosives and vandalism. Results demonstrated highest threats are vandalism events and least likely events are those performed by a terrorist.

  4. The Energy Interaction Model: A promising new methodology for projecting GPHS-RTG cladding failures, release amounts & respirable release fractions for postulated pre-launch, launch, and post-reentry earth impact accidents

    SciTech Connect (OSTI)

    Coleman, J.R.; Sholtis, J.A. Jr.; McCulloch, W.H.

    1998-01-01

    Safety analyses and evaluations must be scrutable, defensible, and credible. This is particularly true when nuclear systems are involved, with their attendant potential for releases of radioactive materials (source terms) to the unrestricted environment. Analytical projections of General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) source terms, for safety analyses conducted to date, have relied upon generic data correlations using a single parameter of cladding damage, termed "distortion." However, distortion is not an unequivocal measure of cladding insult, failure, or release. Furthermore, the analytical foundation, applicability, and broad use of distortion are argumentative and, thus, somewhat troublesome. In an attempt to avoid the complications associated with the use of distortion, a new methodology, referred to as the Energy Interaction Model (EIM), has been preliminarily developed. This new methodology is based upon the physical principles of energy and energy exchange during mechanical interactions. Specifically, the EIM considers the energy imparted to GPHS-RTG components (bare fueled clads, GPHS modules, and full GPHS-RTGs) when exposed to mechanical threats (blast/overpressure, shrapnel and fragment impacts, and Earth surface impacts) posed by the full range of potential accidents. Expected forms are developed for equations intended to project cladding failure probabilities, the number of cladding failures expected, release amounts, and the fraction released as respirable particles. The coefficients of the equations developed are then set to fit the GPHS-RTG test data, ensuring good agreement with the experimental database. This assured, fitted agreement with the test database, along with the foundation of the EIM in first principles, provides confidence in the model's projections beyond the available database. In summary, the newly developed EIM methodology is

  5. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    SciTech Connect (OSTI)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-06-15

    Purpose: Prediction of PET images on the basis of an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation codes. Methods: We performed two experiments to validate the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of the predicted β+ yields as a function of irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+ yields in the proximal and distal fall-off ranges. Results: First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered and MC-simulated β+ yields at the distal fall-off region are within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy in GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+ yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  6. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    SciTech Connect (OSTI)

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies.1 As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation

  7. Analytical Technology

    SciTech Connect (OSTI)

    Goheen, Steven C.

    2001-07-01

    Characterizing environmental samples has been exhaustively addressed in the literature for most analytes of environmental concern. One of the weak areas of environmental analytical chemistry is that of radionuclides and samples contaminated with radionuclides. The analysis of samples containing high levels of radionuclides can be far more complex than that of non-radioactive samples. This chapter addresses the analysis of samples with a wide range of radioactivity. The other areas of characterization examined in this chapter are the hazardous components of mixed waste, and special analytes often associated with radioactive materials. Characterizing mixed waste is often similar to characterizing waste components in non-radioactive materials. The largest differences are in associated safety precautions to minimize exposure to dangerous levels of radioactivity. One must attempt to keep radiological dose as low as reasonably achievable (ALARA). This chapter outlines recommended procedures to safely and accurately characterize regulated components of radioactive samples.

  8. Analytic derivation of an approximate SU(3) symmetry inside the symmetry triangle of the interacting boson approximation model

    SciTech Connect (OSTI)

    Bonatsos, Dennis; Karampagia, S.; Casten, R. F.

    2011-05-15

    Using a contraction of the SU(3) algebra to the algebra of the rigid rotator in the large-boson-number limit of the interacting boson approximation (IBA) model, a line is found inside the symmetry triangle of the IBA, along which the SU(3) symmetry is preserved. The line extends from the SU(3) vertex to near the critical line of the first-order shape/phase transition separating the spherical and prolate deformed phases, and it lies within the Alhassid-Whelan arc of regularity, the unique valley of regularity connecting the SU(3) and U(5) vertices in the midst of chaotic regions. In addition to providing an explanation for the existence of the arc of regularity, the present line represents an example of an analytically determined approximate symmetry in the interior of the symmetry triangle of the IBA. The method is applicable to algebraic models possessing subalgebras amenable to contraction. This condition is equivalent to algebras in which the equilibrium ground state and its rotational band become energetically isolated from intrinsic excitations, as typified by deformed solutions to the IBA for large numbers of valence nucleons.

  9. Bearing Analytics

    Broader source: Energy.gov [DOE]

    Bearing Analytics is a leading-edge equipment monitoring company aimed at pioneering a new era in industrial bearing condition monitoring. Our objective is to consolidate the needs of customers, environment, and manufacturers to improve asset management and energy efficiency capabilities one bearing at a time.

  10. Combined surface analytical methods to characterize degradative...

    Office of Scientific and Technical Information (OSTI)

    a silane monolayer, and local displacement of silane molecules from the Si surface. We have applied this analytical methodology at the Si coupon level up to MEMS devices. ...

  11. Magnetohydrodynamics and deep mixing in evolved stars. I. Two- and three-dimensional analytical models for the asymptotic giant branch

    SciTech Connect (OSTI)

    Nucci, M. C.; Busso, M. E-mail: busso@fisica.unipg.it

    2014-06-01

    The advection of thermonuclear ashes by magnetized domains emerging near the H shell was suggested to explain asymptotic giant branch (AGB) star abundances. Here we verify this idea quantitatively through exact MHD models. Starting with a simple two-dimensional (2D) geometry and in an inertia frame, we study plasma equilibria avoiding the complications of numerical simulations. We show that below the convective envelope of an AGB star, variable magnetic fields induce a natural expansion, permitted by the almost ideal MHD conditions, in which the radial velocity grows as the second power of the radius. We then study the convective envelope, where the complexity of macroturbulence allows only for a schematic analytical treatment. Here the radial velocity depends on the square root of the radius. We then verify the robustness of our results with 3D calculations for the velocity, showing that for both studied regions the solution previously found can be seen as a planar section of a more complex behavior, in which the average radial velocity retains the same dependency on the radius found in 2D. As a final check, we compare our results to approximate descriptions of buoyant magnetic structures. For realistic boundary conditions, the envelope crossing times are sufficient to disperse in the huge convective zone any material transported, suggesting magnetic advection as a promising mechanism for deep mixing. The mixing velocities are smaller than for convection but larger than for diffusion and adequate for extra mixing in red giants.

  12. A semi-analytic power balance model for low (L) to high (H) mode transition power threshold

    SciTech Connect (OSTI)

    Singh, R., E-mail: rsingh129@yahoo.co.in [WCI Center for Fusion Theory, National Fusion Research Institute, Daejeon 305-333 (Korea, Republic of); Institute for Plasma Research, Bhat Gandhinagar 2382 428 (India); Jhang, Hogun [WCI Center for Fusion Theory, National Fusion Research Institute, Daejeon 305-333 (Korea, Republic of); Kaw, P. K. [Institute for Plasma Research, Bhat Gandhinagar 2382 428 (India); Diamond, P. H. [WCI Center for Fusion Theory, National Fusion Research Institute, Daejeon 305-333 (Korea, Republic of); Center for Momentum Transport and Flow Organization, University of California, San Diego, California 92093 (United States); Center for Astrophysics and Space Sciences, University of California, San Diego, 9500 Gilman Dr., La Jolla, California 92093-0424 (United States); Nordman, H. [Department of Earth and Space Sciences, Chalmers University of Technology, SE-412 96 Gteborg (Sweden); Bourdelle, C. [Euratom-CEA Association, CEA/DSM/DRFC, CEA Cadarache F-13108 Saint-Paul-Lez-Durance (France); Loarte, A. [ITER Organization, Route de Vinon Sur Verdon, A. 13115 Saint Paul Lez Durance (France)

    2014-06-15

    We present a semi-analytic model for the low (L) to high (H) mode transition power threshold (P_th). Two main assumptions are made in our study. First, high poloidal mode number drift resistive ballooning modes (high-m DRBM) are assumed to be the dominant turbulence driver in a narrow edge region near the last closed flux surface. Second, the pre-transition edge profile and turbulent diffusivity in the narrow edge region pertain to turbulent equipartition. An edge power balance relation is derived by calculating the power flux dissipated through turbulent conduction and convection, and through radiation in the edge region. P_th is obtained by imposing the turbulence quench rule due to sheared E×B rotation. Evaluation of P_th shows good agreement with experimental results from existing machines. The increase of P_th at low density (i.e., the existence of a roll-over density in P_th vs. density) is shown to originate from the longer scale length of the density profile relative to that of the temperature profile.

  13. SASSI Analytical Methods Compared with SHAKE Free-Field Results

    Office of Environmental Management (EM)

    Structural Mechanics - SRS, October 4, 2011. Objective: This study presents a methodology for validating SASSI for use with a...

  14. Quantifying construction and demolition waste: An analytical review

    SciTech Connect (OSTI)

    Wu, Zezhou; Yu, Ann T.W.; Shen, Liyin; Liu, Guiwen

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attentions should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
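
    Of the six method categories identified above, the waste generation rate method is the simplest to state: multiply an activity's size (for example, gross floor area) by a published per-unit generation rate. The rates in the sketch below are placeholders for illustration, not values from the reviewed studies.

```python
# Waste generation rate method: waste mass = activity size x published unit rate.
# The rates below are illustrative placeholders, not values from the reviewed studies.
RATES_KG_PER_M2 = {"new construction": 30.0, "renovation": 80.0, "demolition": 1000.0}

def estimate_waste_tonnes(activity: str, gross_floor_area_m2: float) -> float:
    return gross_floor_area_m2 * RATES_KG_PER_M2[activity] / 1000.0

print(estimate_waste_tonnes("demolition", 5000))        # 5000.0 t for a 5,000 m2 building
print(estimate_waste_tonnes("new construction", 5000))  # 150.0 t
```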

  15. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect (OSTI)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  16. Analytical Services Management System

    Energy Science and Technology Software Center (OSTI)

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software when in operation contains business sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version, however the copy of the application does not contain business sensitive data from the associated Oracle tables such as contract information or price per line item code.

  17. Analytical Services Management System

    SciTech Connect (OSTI)

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software when in operation contains business sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version, however the copy of the application does not contain business sensitive data from the associated Oracle tables such as contract information or price per line item code.

  18. Re-appraisal and extension of the Gratton-Vargas two-dimensional analytical snowplow model of plasma focus evolution in the context of contemporary research

    SciTech Connect (OSTI)

    Auluck, S. K. H.

    2013-11-15

    Recent resurgence of interest in applications of the dense plasma focus and doubts about the conventional view of the dense plasma focus as a purely irrotational compressive flow have re-opened questions concerning device optimization. In this context, this paper re-appraises and extends the analytical snowplow model of plasma focus sheath evolution developed by F. Gratton and J. M. Vargas [Energy Storage, Compression and Switching, edited by V. Nardi, H. Sahlin, and W. H. Bostick (Plenum, New York, 1983), Vol. 2, p. 353] and shows its relevance to contemporary research. The Gratton-Vargas (GV) model enables construction of a special orthogonal coordinate system in which the plasma flow problem can be simplified and a model of sheath structure can be formulated. The Lawrenceville Plasma Physics (LPP) plasma focus facility, which reports neutron yields better than the global scaling law, is shown to be operating closer to an optimum operating point of the GV model than PF-1000.

  19. Climate Change Modeling and Downscaling Issues and Methodological Perspectives for the U.S. National Climate Assessment

    SciTech Connect (OSTI)

    Janetos, Anthony C.; Collins, William D.; Wuebbles, D.J.; Diffenbaugh, Noah; Hayhoe, Katharine; Hibbard, Kathleen A.; Hurtt, George

    2012-03-31

    This is the full workshop report for the modeling workshop we did for the National Climate Assessment, with DOE support.

  20. Oblique incidence effects in direct x-ray detectors: A first-order approximation using a physics-based analytical model

    SciTech Connect (OSTI)

    Badano, Aldo; Freed, Melanie; Fang, Yuan

    2011-04-15

    Purpose: The authors describe the modifications to a previously developed analytical model of indirect CsI:Tl-based detector response required for studying oblique x-ray incidence effects in direct semiconductor-based detectors. This first-order approximation analysis allows the authors to describe the associated degradation in resolution in direct detectors and compare the predictions to the published data for indirect detectors. Methods: The proposed model is based on a physics-based analytical description developed by Freed et al. [''A fast, angle-dependent, analytical model of CsI detector response for optimization of 3D x-ray breast imaging systems,'' Med. Phys. 37(6), 2593-2605 (2010)] that describes detector response functions for indirect detectors and oblique incident x rays. The model, modified in this work to address direct detector response, describes the dependence of the response with x-ray energy, thickness of the transducer layer, and the depth-dependent blur and collection efficiency. Results: The authors report the detector response functions for indirect and direct detector models for typical thicknesses utilized in clinical systems for full-field digital mammography (150 {mu}m for indirect CsI:Tl and 200 {mu}m for a-Se direct detectors). The results suggest that the oblique incidence effect in a semiconductor detector differs from that in indirect detectors in two ways: The direct detector model produces a sharper overall PRF compared to the response corresponding to the indirect detector model for normal x-ray incidence and a larger relative increase in blur along the x-ray incidence direction compared to that found in indirect detectors with respect to the response at normal incidence angles. Conclusions: Compared to the effect seen in indirect detectors, the direct detector model exhibits a sharper response at normal x-ray incidence and a larger relative increase in blur along the x-ray incidence direction with respect to the blur in the

  1. Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation

    SciTech Connect (OSTI)

    Endert, Alexander

    2014-07-01

    The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which they were created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?

  2. Data and Analytics Strategy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Data and Analytics Strategy --- Prabhat, Data and Analytics Group Lead, February 23, 2015. Topics include data software, big data users, and the Data and Analytics (DAS) team ...

  3. An analytical elastic plastic contact model with strain hardening and frictional effects for normal and oblique impacts

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Brake, M. R. W.

    2015-02-17

    Impact between metallic surfaces is a phenomenon that is ubiquitous in the design and analysis of mechanical systems. To model this phenomenon, a new formulation for frictional elastic–plastic contact between two surfaces is developed. The formulation considers both frictional, oblique contact (of which normal, frictionless contact is a limiting case) and strain hardening effects. The constitutive model for normal contact is developed as two contiguous loading domains: the elastic regime and a transitionary region in which the plastic response of the materials develops and the elastic response abates. For unloading, the constitutive model is based on an elastic process. Moreover, the normal contact model is assumed to couple only one-way with the frictional/tangential contact model, which results in the normal contact model being independent of the frictional effects. Frictional, tangential contact is modeled using a microslip model that is developed to consider the pressure distribution that develops from the elastic–plastic normal contact. This model is validated through comparisons with experimental results reported in the literature, and is demonstrated to be significantly more accurate than 10 other normal contact models and three other tangential contact models found in the literature.

  4. An analytical elastic plastic contact model with strain hardening and frictional effects for normal and oblique impacts

    SciTech Connect (OSTI)

    Brake, M. R. W.

    2015-02-17

    Impact between metallic surfaces is a phenomenon that is ubiquitous in the design and analysis of mechanical systems. To model this phenomenon, a new formulation for frictional elastic–plastic contact between two surfaces is developed. The formulation considers both frictional, oblique contact (of which normal, frictionless contact is a limiting case) and strain hardening effects. The constitutive model for normal contact is developed as two contiguous loading domains: the elastic regime and a transitionary region in which the plastic response of the materials develops and the elastic response abates. For unloading, the constitutive model is based on an elastic process. Moreover, the normal contact model is assumed to couple only one-way with the frictional/tangential contact model, which results in the normal contact model being independent of the frictional effects. Frictional, tangential contact is modeled using a microslip model that is developed to consider the pressure distribution that develops from the elastic–plastic normal contact. This model is validated through comparisons with experimental results reported in the literature, and is demonstrated to be significantly more accurate than 10 other normal contact models and three other tangential contact models found in the literature.
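
    The piecewise character of such a constitutive law can be sketched as a Hertzian elastic branch followed by a linearized branch once an assumed yield indentation is exceeded. The yield-onset expression and material constants below are generic textbook assumptions, not Brake's actual closed-form model.

        import numpy as np

        def normal_contact_force(delta, R=0.01, E_star=1.1e11, sigma_y=3.5e8):
            # Hertzian elastic branch: F = (4/3) * E* * sqrt(R) * delta**1.5.
            # Past an assumed yield indentation delta_y, continue linearly with
            # the contact stiffness at yield as a crude stand-in for the
            # transitional elastic-plastic regime described in the abstract.
            delta_y = (np.pi * 1.6 * sigma_y / (2.0 * E_star)) ** 2 * R
            F_y = (4.0 / 3.0) * E_star * np.sqrt(R) * delta_y ** 1.5
            if delta <= delta_y:
                return (4.0 / 3.0) * E_star * np.sqrt(R) * delta ** 1.5
            k_y = 2.0 * E_star * np.sqrt(R * delta_y)  # stiffness at yield
            return F_y + k_y * (delta - delta_y)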

  5. Analytical and computational study of the ideal full two-fluid plasma model and asymptotic approximations for Hall-magnetohydrodynamics

    SciTech Connect (OSTI)

    Srinivasan, B.; Shumlak, U.

    2011-09-15

    The 5-moment two-fluid plasma model uses Euler equations to describe the ion and electron fluids and Maxwell's equations to describe the electric and magnetic fields. Two-fluid physics becomes significant when the characteristic spatial scales are on the order of the ion skin depth and characteristic time scales are on the order of the ion cyclotron period. The full two-fluid plasma model has disparate characteristic speeds ranging from the ion and electron speeds of sound to the speed of light. Two asymptotic approximations are applied to the full two-fluid plasma to arrive at the Hall-MHD model, namely negligible electron inertia and infinite speed of light. The full two-fluid plasma model and the Hall-MHD model are studied for applications to an electromagnetic plasma shock, geospace environmental modeling (GEM challenge) magnetic reconnection, an axisymmetric Z-pinch, and an axisymmetric field reversed configuration (FRC).
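
    For reference, the Hall-MHD limit named in the abstract keeps the Hall and electron-pressure terms of the generalized Ohm's law while dropping electron inertia; in standard textbook notation (not the paper's exact form):

        \mathbf{E} + \mathbf{v}\times\mathbf{B}
          = \frac{1}{n e}\left(\mathbf{J}\times\mathbf{B} - \nabla p_{e}\right) + \eta\,\mathbf{J},
        \qquad
        \mathbf{J} = \frac{1}{\mu_{0}}\,\nabla\times\mathbf{B}
        \quad\text{(displacement current dropped in the infinite-speed-of-light limit).}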

  6. Nonlinear dynamics of the ion Weibel-filamentation instability: An analytical model for the evolution of the plasma and spectral properties

    SciTech Connect (OSTI)

    Ruyer, C.; Gremillet, L.; Debayle, A.; Bonnaud, G.

    2015-03-15

    We present a predictive model of the nonlinear phase of the Weibel instability induced by two symmetric, counter-streaming ion beams in the non-relativistic regime. This self-consistent model combines the quasilinear kinetic theory of Davidson et al. [Phys. Fluids 15, 317 (1972)] with a simple description of current filament coalescence. It allows us to follow the evolution of the ion parameters up to a stage close to complete isotropization, and is thus of prime interest to understand the dynamics of collisionless shock formation. Its predictions are supported by 2-D and 3-D particle-in-cell simulations of the ion Weibel instability. The derived approximate analytical solutions reveal the various dependencies of the ion relaxation to isotropy. In particular, it is found that the influence of the electron screening can affect the results of simulations using an unphysical electron mass.

  7. Analytical Services - Hanford Site

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Analytical Services contracting at the Hanford Site: ORP Contracts and Procurements, RL Contracts and Procurements, Wastren Advantage, Inc. Analytical Services, CH2M HILL Plateau Remediation Company, Mission Support Alliance, Washington Closure Hanford, HPM Corporation (HPMC), HASQARD Focus Group, Bechtel National, Inc., and Washington River Protection Solutions. Analytical laboratory analyses

  8. DOE Challenge Home Label Methodology

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DOE Challenge Home Label Methodology, October 2012. Contents: Background; Methodology; Comfort/Quiet

  9. Analytical Chemistry Division annual progress report for period ending December 31, 1979

    SciTech Connect (OSTI)

    Shults, W.D.; Lyon, W.S.

    1980-05-01

    The progress is reported in the following sections: analytical methodology, mass and emission spectrometry, technical support, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance. (DLC)

  10. INTERLINE 5.0 -- An expanded railroad routing model: Program description, methodology, and revised user's manual

    SciTech Connect (OSTI)

    Johnson, P.E.; Joy, D.S.; Clarke, D.B.; Jacobi, J.M.

    1993-03-01

    A rail routing model, INTERLINE, has been developed at the Oak Ridge National Laboratory to investigate potential routes for transporting radioactive materials. In Version 5.0, the INTERLINE routing algorithms have been enhanced to include the ability to predict alternative routes, barge routes, and population statistics for any route. The INTERLINE railroad network is essentially a computerized rail atlas describing the US railroad system. All rail lines, with the exception of industrial spurs, are included in the network. Inland waterways and deep water routes, along with their interchange points with the US railroad system, are also included. The network contains over 15,000 rail and barge segments (links) and over 13,000 stations, interchange points, ports, and other locations (nodes). The INTERLINE model has been converted to operate on an IBM-compatible personal computer. At least a 286 computer with a hard disk containing approximately 6 MB of free space is recommended. Enhanced program performance will be obtained by using a random-access memory drive on a 386 or 486 computer.
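
    The core routing step over such a link/node network is a shortest-path search. The sketch below runs a generic Dijkstra search over a toy network; it is only a stand-in for INTERLINE's actual routing algorithms, transfer penalties, and population post-processing.

        import heapq

        def shortest_route(links, origin, destination):
            # links: {node: [(neighbor, miles), ...]} for rail/barge segments.
            dist, prev = {origin: 0.0}, {}
            heap, seen = [(0.0, origin)], set()
            while heap:
                d, node = heapq.heappop(heap)
                if node in seen:
                    continue
                seen.add(node)
                if node == destination:
                    break
                for nbr, miles in links.get(node, []):
                    nd = d + miles
                    if nd < dist.get(nbr, float("inf")):
                        dist[nbr], prev[nbr] = nd, node
                        heapq.heappush(heap, (nd, nbr))
            path, node = [destination], destination
            while node != origin:  # reconstruct the node sequence
                node = prev[node]
                path.append(node)
            return list(reversed(path)), dist[destination]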

  11. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    SciTech Connect (OSTI)

    Frey, H. Christopher; Rhodes, David S.

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
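
    A minimal sketch of the bootstrap simulation idea underlying the methodology: resample the data with replacement and read a percentile confidence interval off the resampled statistics. The report's two-dimensional treatment separating variability from uncertainty is not reproduced here.

        import numpy as np

        def bootstrap_ci(sample, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
            # Percentile bootstrap confidence interval for an arbitrary statistic.
            rng = np.random.default_rng(seed)
            sample = np.asarray(sample, dtype=float)
            reps = np.array([stat(rng.choice(sample, size=sample.size, replace=True))
                             for _ in range(n_boot)])
            lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
            return stat(sample), (lo, hi)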

  12. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect (OSTI)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  13. Actinide Analytical Chemistry

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AAC Actinide Analytical Chemistry We do analyses that range from assay of the major and ... Group Office (505) 667-4087 The Actinide Analytical Chemistry (C-AAC) Group at Los Alamos ...

  14. Risk Assessment of Cascading Outages: Methodologies and Challenges

    SciTech Connect (OSTI)

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2012-05-31

    Abstract- This paper is a result of ongoing activity carried out by Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers are focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document to summarize the state of the art in the methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies. The second paper summarizes the state of the art in modeling tools for risk assessment of cascading outages.

  15. Analytical admittance characterization of high mobility channel

    SciTech Connect (OSTI)

    Mammeri, A. M.; Mahi, F. Z.; Varani, L.

    2015-03-30

    In this contribution, we use an analytical model to investigate the small-signal admittance of high electron mobility transistor field-effect channels, including the branching of the current between the channel and the gate. The analytical approach takes into account the linearization of the 2D Poisson equation and the drift current along the channel. The analytical equations describe how the admittance at the source and drain terminals depends on frequency and on the geometrical transistor parameters.

  16. Risk Assessment of Cascading Outages: Part I - Overview of Methodologies

    SciTech Connect (OSTI)

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2011-07-31

    This paper is a result of ongoing activity carried out by Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers are focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which will extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document to summarize the state of the art in the methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies.

  17. 2008 ASC Methodology Errata

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    BONNEVILLE POWER ADMINISTRATION'S ERRATA CORRECTIONS TO THE 2008 AVERAGE SYSTEM COST METHODOLOGY September 12, 2008 I. DESCRIPTION OF ERRATA CORRECTIONS A. Attachment A, ASC...

  18. Draft Tiered Rate Methodology

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    For Regional Dialogue Discussion Purposes Only Pre-Decisional Draft Tiered Rates Methodology March 7, 2008 Pre-decisional, Deliberative, For Discussion Purposes Only March 7,...

  19. Web Analytics and Statistics

    Office of Energy Efficiency and Renewable Energy (EERE)

    EERE uses Google Analytics to capture statistics on its websites. These statistics help website managers measure and report on users, sessions, most visited pages, and more.

  20. Graph Analytics for Signature Discovery

    SciTech Connect (OSTI)

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh; Lo, Chaomei

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
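
    A minimal sketch of searching for temporally ordered paths, the kind of signature described above. The edge representation (each edge carries a timestamp) and the length cap are illustrative assumptions.

        def temporal_paths(edges, start, max_len=4):
            # edges: {node: [(next_node, timestamp), ...]}; a signature is a
            # simple path whose edge timestamps strictly increase.
            results = []

            def walk(node, t_prev, path):
                if len(path) > 1:
                    results.append(list(path))
                if len(path) > max_len:  # bound the search depth
                    return
                for nxt, t in edges.get(node, []):
                    if t > t_prev and nxt not in path:
                        path.append(nxt)
                        walk(nxt, t, path)
                        path.pop()

            walk(start, float("-inf"), [start])
            return results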

  1. Vehicle Technologies Office: Transportation System Analytical Tools |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Modeling, Testing, Data & Results » Vehicle Technologies Office: Transportation System Analytical Tools. The Vehicle Technologies Office (VTO) has supported the development of a number of software packages and online tools to model individual vehicles and the overall transportation system. Most of these tools are available for free or a nominal charge. Modeling tools that simulate entire vehicles and

  2. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    SciTech Connect (OSTI)

    Not Available

    1994-04-08

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  3. Extreme Scale Visual Analytics

    SciTech Connect (OSTI)

    Steed, Chad A; Potok, Thomas E; Pullum, Laura L; Ramanathan, Arvind; Shipman, Galen M; Thornton, Peter E

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  4. Regional Shelter Analysis Methodology

    SciTech Connect (OSTI)

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
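
    The aggregation step can be sketched as a population-weighted combination of building protection factors within each grid cell. The building categories and protection factors in this illustration are placeholders, not the report's values.

        def regional_protection_factor(cells):
            # cells: [(population, {building_type: (fraction_of_people, PF)}), ...]
            # A protection factor PF reduces the outdoor dose by a factor 1/PF.
            total_pop, exposed_dose_fraction = 0.0, 0.0
            for pop, mix in cells:
                total_pop += pop
                for frac, pf in mix.values():
                    exposed_dose_fraction += pop * frac / pf
            return total_pop / exposed_dose_fraction  # effective regional PF

        # Example: 10,000 people, 70% in wood-frame homes (PF 3),
        # 30% in concrete mid-rises (PF 20); effective PF is about 4.
        pf = regional_protection_factor(
            [(10_000, {"wood": (0.7, 3.0), "concrete": (0.3, 20.0)})])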

  5. Integrating Heterogeneous Healthcare Datasets and Visual Analytics for Disease Bio-surveillance and Dynamics

    SciTech Connect (OSTI)

    Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A; Quinn, Shannon; Chennubhotla, Chakra; Parker, Tara L

    2013-01-01

    In this paper, we present an overview of the big data challenges in disease bio-surveillance and then discuss the use of visual analytics for integrating data and turning it into knowledge. We will explore two integration scenarios: (1) combining text and multimedia sources to improve situational awareness and (2) enhancing disease spread model data with real-time bio-surveillance data. Together, the proposed integration methodologies can improve awareness about when, where and how emerging diseases can affect wide geographic regions.

  6. Renewable Analytics | Open Energy Information

    Open Energy Info (EERE)

    Name: Renewable Analytics. Place: San Francisco, California. Zip: 94104. Product: San Francisco-based provider of public market trading...

  7. Web Applications for Data Analytics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Web Applications for Data Analytics. Description and Overview: NERSC is providing, on an experimental basis, web-based applications for data analytics. This ...

  8. DOE Systems Engineering Methodology

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Systems Engineering Methodology (SEM), Computer System Retirement Guidelines, Version 3, September 2002. U.S. Department of Energy, Office of the Chief Information Officer. Table of Contents: Purpose; Initiation and Distribution

  9. Waste Package Design Methodology Report

    SciTech Connect (OSTI)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  10. Analysis Methodologies | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Systems Analysis » Analysis Methodologies. A spectrum of analysis methodologies is used in combination to provide a sound understanding of hydrogen and fuel cell systems and developing markets, as follows: Resource Analysis; Technological Feasibility and Cost Analysis; Environmental Analysis; Delivery Analysis; Infrastructure Development and Financial Analysis; Energy Market Analysis. In general, each methodology builds on previous efforts to quantify the benefits, drawbacks,

  11. Methodologies for Reservoir Characterization Using Fluid Inclusion...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Methodologies for Reservoir Characterization Using Fluid Inclusion Gas Chemistry ...

  12. Methodology, status and plans for development and assessment of Cathare code

    SciTech Connect (OSTI)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status, and plans for the development, assessment, and uncertainty evaluation of the Cathare code. Cathare is a thermal-hydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for developing and assessing the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermal-hydraulics. Physical improvements are also required in the field of low pressure transients and in the 3-D model.

  13. Industrial Analytics Corporation

    SciTech Connect (OSTI)

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  14. Lifecycle Model

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-05-21

    This chapter describes the lifecycle model used for the Departmental software engineering methodology.

  15. Emergency exercise methodology

    SciTech Connect (OSTI)

    Klimczak, C.A.

    1993-03-01

    Competence for proper response to hazardous materials emergencies is enhanced and effectively measured by exercises which test plans and procedures and validate training. Emergency exercises are most effective when realistic criteria are used and a sequence of events is followed. The scenario is developed from pre-determined exercise objectives based on hazard analyses and actual plans and procedures. The scenario should address findings from previous exercises and actual emergencies. Exercise rules establish the extent of play and address contingencies during the exercise. All exercise personnel are assigned roles as players, controllers, or evaluators. These participants should receive specialized training in advance. A methodology for writing an emergency exercise plan will be detailed.

  16. Emergency exercise methodology

    SciTech Connect (OSTI)

    Klimczak, C.A.

    1993-01-01

    Competence for proper response to hazardous materials emergencies is enhanced and effectively measured by exercises which test plans and procedures and validate training. Emergency exercises are most effective when realistic criteria are used and a sequence of events is followed. The scenario is developed from pre-determined exercise objectives based on hazard analyses and actual plans and procedures. The scenario should address findings from previous exercises and actual emergencies. Exercise rules establish the extent of play and address contingencies during the exercise. All exercise personnel are assigned roles as players, controllers, or evaluators. These participants should receive specialized training in advance. A methodology for writing an emergency exercise plan will be detailed.

  17. Spark Distributed Analytic Framework

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Spark Distributed Analytic Framework. Description and Overview: Apache Spark(tm) is a fast and general engine for large-scale data processing. How to Use Spark: Because of its high memory and I/O bandwidth requirements, we recommend you run your Spark jobs on Cori. Follow the steps below to use Spark; note that the order of the commands matters. DO NOT load the spark module until you are inside a batch job. Interactive mode: submit an interactive batch job with at least 2 nodes: salloc
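
    Once the Spark cluster is running inside the batch job, analyses are typically driven through the PySpark API. A minimal sketch (the input file name is hypothetical):

        from pyspark.sql import SparkSession

        # Assumes the spark module has been loaded and the cluster started
        # inside an interactive batch allocation, as described above.
        spark = SparkSession.builder.appName("demo").getOrCreate()
        df = spark.read.csv("measurements.csv", header=True, inferSchema=True)
        df.groupBy("station").avg("value").show()
        spark.stop()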

  18. Sandia National Laboratories: Data Analytics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sandia National Laboratories: Data Analytics. Synthetic Aperture Radar (SAR) areas of expertise include capabilities, hardware, modes and frequency bands of operation, platforms, missions, and tasking, processing, exploitation and dissemination (TPED). PANTHER - Pattern ANalytics To support

  19. Workshop on Current Issues in Predictive Approaches to Intelligence and Security Analytics: Fostering the Creation of Decision Advantage through Model Integration and Evaluation

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.

    2010-05-23

    The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.

  20. Model documentation report: Industrial sector demand module of the national energy modeling system

    SciTech Connect (OSTI)

    1998-01-01

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Model. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code. This document serves three purposes. First, it is a reference document providing a detailed description of the NEMS Industrial Model for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its model. Third, it facilitates continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements as future projects.

  1. VERDE Analytic Modules

    Energy Science and Technology Software Center (OSTI)

    2008-01-15

    The VERDE Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and man-made power grid disruptions) and to forecast power outages, restoration times, customers outaged, and key facilities that will lose power. Damage areas are predicted using historic damage criteria for the affected area. The modules use a cellular automata approach to estimating the distribution circuits assigned to geo-located substations. Population estimates served within the service areas are located within 1 km grid cells and converted to customer counts through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility published prioritization calibrated by historic performance.

  2. Reflood completion report: Volume 1. A phenomenological thermal-hydraulic model of hot rod bundles experiencing simultaneous bottom and top quenching and an optimization methodology for closure development

    SciTech Connect (OSTI)

    Nelson, R.A. Jr.; Pimentel, D.A.; Jolly-Woodruff, S.; Spore, J.

    1998-04-01

    In this report, a phenomenological model of simultaneous bottom-up and top-down quenching is developed and discussed. The model was implemented in the TRAC-PF1/MOD2 computer code. Two sets of closure relationships were compared within the study, the Absolute set and the Conditional set. The Absolute set of correlations is frequently viewed as the pure set because the correlations utilize their original coefficients as suggested by the developer. The Conditional set is a modified set of correlations with changes to the correlation coefficients only. Results for the two sets are quite similar. This report also summarizes initial results of an effort to investigate nonlinear optimization techniques applied to closure model development. Results suggest that such techniques can provide advantages for future model development work, but that extensive expertise is required to utilize them (i.e., the model developer must fully understand both the physics of the process being represented and the computational techniques being employed). The computer may then be used to improve the correlation of computational results with experiments.

  3. Statistically qualified neuro-analytic failure detection method and system

    DOE Patents [OSTI]

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaption and stochastic model modification of the deterministic model adaptation. Deterministic model adaption involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
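
    The deterministic-adaptation stage can be sketched as an analytic model plus a small neural network trained on the analytic model's residual. The synthetic process, network size, and plain gradient-descent loop below are illustrative assumptions, not the patented scaled equation error minimization or the sequential probability ratio test qualification.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
        y_meas = 2.0 * x + 0.3 * np.sin(6.0 * x) + 0.02 * rng.standard_normal((200, 1))

        def analytic_model(x):
            # Known first-principles part of the process (assumed linear gain).
            return 2.0 * x

        # Train a one-hidden-layer network on the analytic-model residual.
        residual = y_meas - analytic_model(x)
        W1, b1 = 0.5 * rng.standard_normal((1, 8)), np.zeros(8)
        W2, b2 = 0.5 * rng.standard_normal((8, 1)), np.zeros(1)
        lr = 0.05
        for _ in range(5000):
            h = np.tanh(x @ W1 + b1)
            err = (h @ W2 + b2) - residual
            dW2, db2 = h.T @ err / len(x), err.mean(axis=0)
            dh = (err @ W2.T) * (1.0 - h ** 2)
            dW1, db1 = x.T @ dh / len(x), dh.mean(axis=0)
            W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

        def neuro_analytic(x):
            # Hybrid prediction: analytic part plus learned residual correction.
            return analytic_model(x) + np.tanh(x @ W1 + b1) @ W2 + b2

        # The remaining scatter would feed the stochastic qualification stage.
        print(float(np.std(y_meas - neuro_analytic(x))))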

  4. U.S. Energy-by-Rail Data Methodology

    U.S. Energy Information Administration (EIA) Indexed Site

    by-Rail Data Methodology June 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 U.S. Energy Information Administration | U.S. Energy-by-Rail Data Methodology i This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States

  5. Appendix C, Analytical Data | Department of Energy

    Energy Savers [EERE]

    Appendix C, Analytical Data. Docket No. EO-05-01: Appendix C, Analytical Data from Final Report: Particulate Emissions Testing, Unit 1, Potomac River Generating ...

  6. Hanford transuranic analytical capability

    SciTech Connect (OSTI)

    McVey, C.B.

    1995-02-24

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples that requires detailed analysis is forecasted. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/gm. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of 222S the interim waste analytical needs can be met for a period of four to five years or until a front end facility is erected at or near the 222-S facility. The yearly average of samples is projected to be approximately 600 samples. The figure has changed significantly due to budget changes and has been downgraded from 10,000 samples to the 600 level. Until these budget and sample projection changes become firmer, a long term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long term option and sample projections.

  7. Re-appraisal and extension of the Gratton-Vargas two-dimensional analytical snowplow model of plasma focus. II. Looking at the singularity

    SciTech Connect (OSTI)

    Auluck, S. K. H.

    2015-11-15

    The Gratton-Vargas snowplow model, recently revisited and expanded [S. K. H. Auluck, Phys. Plasmas 20, 112501 (2013)], has given rise to significant new insights into some aspects of the Dense Plasma Focus (DPF), in spite of being a purely kinematic description having no reference to plasma phenomena. It is able to provide a good fit to the experimental current waveforms in at least 4 large facilities. It has been used for construction of a local curvilinear frame of reference, in which conservation laws for mass, momentum, and energy can be reduced to effectively-one-dimensional hyperbolic conservation law equations. Its utility in global parameter optimization of device parameters has been demonstrated. These features suggest that the Gratton-Vargas model deserves a closer look at its supposed limitations near the singular phase of the DPF. This paper presents a discussion of its development near the device axis, based on the original work of Gratton and Vargas, with some differences. It is shown that the Gratton-Vargas partial differential equation has solutions for times after the current singularity, which exhibit an expanding bounded volume (which can serve as model of an expanding plasma column) and decreasing dynamic inductance of the discharge, in spite of having no built-in hydrodynamics. This enables the model to qualitatively reproduce the characteristic shape of the current derivative in DPF experiments without reference to any plasma phenomena, such as instabilities, anomalous resistance, or reflection of hydrodynamic shock wave from the axis. The axial propagation of the solution exhibits a power-law dependence on the dimensionless time starting from the time of singularity, which is similar to the power-law relations predicted by theory of point explosions in ideal gases and which has also been observed experimentally.

  8. Ecologic Analytics | Open Energy Information

    Open Energy Info (EERE)

    Name: Ecologic Analytics. Place: Bloomington, Minnesota. Zip: 55425. Product: Minnesota-based meter data management company. Coordinates: 42.883574, -90.926122

  9. Analytical Improvements in PV Degradation Rate Determination

    SciTech Connect (OSTI)

    Jordan, D. C.; Kurtz, S. R.

    2011-02-01

    As photovoltaic (PV) penetration of the power grid increases, it becomes vital to know how decreased power output may affect cost over time. In order to predict power delivery, the decline or degradation rates must be determined accurately. For non-spectrally corrected data several complete seasonal cycles (typically 3-5 years) are required to obtain reasonably accurate degradation rates. In a rapidly evolving industry such a time span is often unacceptable and the need exists to determine degradation rates accurately in a shorter period of time. Occurrence of outliers and data shifts are two examples of analytical problems leading to greater uncertainty and therefore to longer observation times. In this paper we compare three methodologies of data analysis for robustness in the presence of outliers, data shifts and shorter measurement time periods.
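
    One way to see the robustness issue is to compare an ordinary least-squares slope with an outlier-robust Theil-Sen slope on the same performance-index series; this is a generic comparison, not the specific analytical methods evaluated in the paper.

        import numpy as np
        from scipy import stats

        def degradation_rates(months, perf_index):
            # perf_index: monthly performance metric normalized to ~1.0 at start.
            ols = stats.linregress(months, perf_index)
            ts_slope, _, _, _ = stats.theilslopes(perf_index, months)
            to_pct_per_year = 12.0 * 100.0
            return ols.slope * to_pct_per_year, ts_slope * to_pct_per_year

        # Synthetic -0.8 %/yr series with one outlier month: the OLS slope is
        # pulled by the outlier, the Theil-Sen slope is not.
        m = np.arange(36.0)
        pi = 1.0 - 0.008 / 12.0 * m
        pi[18] = 0.7
        print(degradation_rates(m, pi))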

  10. Analytical evaluation of atomic form factors: Application to Rayleigh scattering

    SciTech Connect (OSTI)

    Safari, L.; Santos, J. P.; Amaro, P.; Jänkälä, K.; Fratini, F.

    2015-05-15

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.
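
    For context, the quantity being evaluated is the standard nonrelativistic atomic form factor, the Fourier transform of the electron density, which for a spherically symmetric independent-particle density reduces to the textbook expression (not the paper's final closed form):

        F(q) = \int \rho(\mathbf{r})\, e^{i\mathbf{q}\cdot\mathbf{r}}\, d^{3}r
             = \frac{4\pi}{q}\int_{0}^{\infty} \rho(r)\,\sin(qr)\, r\, dr .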

  11. Analytical laboratory quality audits

    SciTech Connect (OSTI)

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  12. SUSS PM 5 Analytic Probe

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    CAMD refurbished a SUSS microprobe station to perform resist adhesion tests. The apparatus is equipped with a 10 lb. linear motor, two microprobes, and a CCD camera for observation. Capabilities: removing PMMA bonded sheets from Si; fine probing of microstructures.

  13. An Integrated Safety Assessment Methodology for Generation IV Nuclear Systems

    SciTech Connect (OSTI)

    Timothy J. Leahy

    2010-06-01

    The Generation IV International Forum (GIF) Risk and Safety Working Group (RSWG) was created to develop an effective approach for the safety of Generation IV advanced nuclear energy systems. Early work of the RSWG focused on defining a safety philosophy founded on lessons learned from current and prior generations of nuclear technologies, and on identifying technology characteristics that may help achieve Generation IV safety goals. More recent RSWG work has focused on the definition of an integrated safety assessment methodology for evaluating the safety of Generation IV systems. The methodology, tentatively called ISAM, is an integrated toolkit consisting of analytical techniques that are available and matched to appropriate stages of Generation IV system concept development. The integrated methodology is intended to yield safety-related insights that help actively drive the evolving design throughout the technology development cycle, potentially resulting in enhanced safety, reduced costs, and shortened development time.

  14. Technosocial Predictive Analytics for Illicit Nuclear Trafficking

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Butner, R. Scott; Cowell, Andrew J.; Dalton, Angela C.; Haack, Jereme N.; Kreyling, Sean J.; Riensche, Roderick M.; White, Amanda M.; Whitney, Paul D.

    2011-03-29

    Illicit nuclear trafficking networks are a national security threat. These networks can directly lead to nuclear proliferation, as state or non-state actors attempt to identify and acquire nuclear weapons-related expertise, technologies, components, and materials. The ability to characterize and anticipate the key nodes, transit routes, and exchange mechanisms associated with these networks is essential to influence, disrupt, interdict or destroy the function of the networks and their processes. The complexities inherent to the characterization and anticipation of illicit nuclear trafficking networks requires that a variety of modeling and knowledge technologies be jointly harnessed to construct an effective analytical and decision making workflow in which specific case studies can be built in reasonable time and with realistic effort. In this paper, we explore a solution to this challenge that integrates evidentiary and dynamic modeling with knowledge management and analytical gaming, and demonstrate its application to a geopolitical region at risk.

  15. Measurement of laminar burning speeds and Markstein lengths using a novel methodology

    SciTech Connect (OSTI)

    Tahtouh, Toni; Halter, Fabien; Mounaim-Rousselle, Christine [Institut PRISME, Universite d'Orleans, 8 rue Leonard de Vinci - 45072, Orleans Cedex 2 (France)]

    2009-09-15

    Three different methodologies used for the extraction of laminar information are compared and discussed. Starting from an asymptotic analysis assuming a linear relation between the propagation speed and the stretch acting on the flame front, temporal radius evolutions of spherically expanding laminar flames are postprocessed to obtain laminar burning velocities and Markstein lengths. The first methodology fits the temporal radius evolution with a polynomial function, while the new methodology proposed uses the exact solution of the linear relation linking the flame speed and the stretch as a fit. The last methodology consists in an analytical resolution of the problem. To test the different methodologies, experiments were carried out in a stainless steel combustion chamber with methane/air mixtures at atmospheric pressure and ambient temperature. The equivalence ratio was varied from 0.55 to 1.3. The classical shadowgraph technique was used to detect the reaction zone. The new methodology has proven to be the most robust and provides the most accurate results, while the polynomial methodology induces some errors due to the differentiation process. As original radii are used in the analytical methodology, it is more affected by the experimental radius determination. Finally, laminar burning velocity and Markstein length values determined with the new methodology are compared with results reported in the literature. (author)
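
    To make the quantities concrete, the classical linear post-processing route computes the flame speed and stretch from the radius history and extrapolates to zero stretch. This is a sketch of that textbook route, not the paper's exact-solution fit, which avoids the error-prone numerical differentiation shown here.

        import numpy as np

        def linear_stretch_fit(t, r):
            # Spherically expanding flame: speed V = dr/dt, stretch K = (2/r) dr/dt,
            # linear relation V = V0 - Lb * K.
            v = np.gradient(r, t)            # propagation speed
            k = 2.0 * v / r                  # flame stretch rate
            slope, v0 = np.polyfit(k, v, 1)  # slope = -Lb, intercept = V0
            return v0, -slope                # unstretched speed, Markstein length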

  16. Selecting the best defect reduction methodology

    SciTech Connect (OSTI)

    Hinckley, C.M.; Barkan, P.

    1994-04-01

    Defect rates less than 10 parts per million, unimaginable a few years ago, have become the standard of world-class quality. To reduce defects, companies are aggressively implementing various quality methodologies, such as Statistical Quality Control, Motorola's Six Sigma, or Shingo's poka-yoke. Although each quality methodology reduces defects, selection has been based on an intuitive sense without understanding their relative effectiveness in each application. A missing link in developing superior defect reduction strategies has been the lack of a general defect model that clarifies the unique focus of each method. Toward the goal of efficient defect reduction, we have developed an event tree which addresses a broad spectrum of quality factors and two defect sources, namely, error and variation. The Quality Control Tree (QCT) predictions are more consistent with production experience than those obtained by the other methodologies considered independently. The QCT demonstrates that world-class defect rates cannot be achieved by focusing on a single defect source or quality control factor, a common weakness of many methodologies. We have shown that the most efficient defect reduction strategy depends on the relative strengths and weaknesses of each organization. The QCT can help each organization identify the most promising defect reduction opportunities for achieving its goals.

  17. eGallon-methodology-final

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    traditional gallon of unleaded fuel -- the dominant fuel choice for vehicles in the U.S. eGallon Methodology The eGallon is measured as an "implicit" cost of a gallon of gasoline. ...

  18. Weekly Coal Production Estimation Methodology

    Gasoline and Diesel Fuel Update (EIA)

    Weekly Coal Production Estimation Methodology Step 1 (Estimate total amount of weekly U.S. coal production) U.S. coal production for the current week is estimated using a ratio ...

  19. Culture, and a Metrics Methodology for Biological Countermeasure Scenarios

    SciTech Connect (OSTI)

    Simpson, Mary J.

    2007-03-15

    With uncertain data and limited common units, the aggregation of results is not inherently obvious. Candidate methodologies discussed include statistical, analytical, and expert-based numerical approaches. Most statistical methods require large amounts of data with a random distribution of values for validity. Analytical methods predominate wherein structured data or patterns are evident and randomness is low. The analytical hierarchy process is shown to satisfy all requirements and provide a detailed method for measurement that depends on expert judgment by decision makers.
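
    The analytic hierarchy process (AHP) mentioned above can be sketched in a few lines: priority weights are taken from the principal eigenvector of an expert pairwise comparison matrix and checked with Saaty's consistency ratio. The 3x3 comparison matrix below is hypothetical.

```python
# Minimal AHP sketch: pairwise comparisons -> priority weights and consistency check.
import numpy as np

A = np.array([[1.0,  3.0, 5.0],      # criterion 1 judged vs criteria 1, 2, 3
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                             # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index for n criteria
print("weights:", np.round(w, 3), " consistency ratio:", round(ci / ri, 3))
```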

  20. Functionalized magnetic nanoparticle analyte sensor

    DOE Patents [OSTI]

    Yantasee, Wassana; Warner, Maryin G; Warner, Cynthia L; Addleman, Raymond S; Fryxell, Glen E; Timchalk, Charles; Toloczko, Mychailo B

    2014-03-25

    A method and system for simply and efficiently determining quantities of a preselected material in a particular solution by the placement of at least one superparamagnetic nanoparticle having a specified functionalized organic material connected thereto into a particular sample solution, wherein preselected analytes attach to the functionalized organic groups. These superparamagnetic nanoparticles are then collected at a collection site and analyzed for the presence of a particular analyte.

  1. Analytical Tools | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Information Resources » Analytical Tools Analytical Tools The Bioenergy Technologies Office and its national lab partners provide a variety of online tools to help analyze data and facilitate decision making. This page links to several of them and includes a widget that calculates the potential volume of ethanol produced from biomass feedstocks. Knowledge Discovery Framework (KDF): The Bioenergy Knowledge Discovery Framework (KDF) facilitates informed decision making by providing a means to

  2. Laboratory Analytical Procedures | Bioenergy | NREL

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Laboratory Analytical Procedures NREL develops laboratory analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis These lab procedures provide tested and accepted methods for performing analyses commonly used in biofuels research. Bio-Oil Analysis These lab procedures allow for the analysis of raw and upgraded pyrolysis bio-oils. Microalgae Compositional Analysis These lab procedures help scientists and researchers

  3. Analytical Modeling and Simulation of Thermoelectric Devices...

    Broader source: Energy.gov (indexed) [DOE]

    and Technologies Micro- & Nano-Technologies Enabling More Compact, Lightweight Thermoelectric Power Generation & Cooling Systems Automotive Thermoelectric Generators and HVAC

  4. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    SciTech Connect (OSTI)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
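
    A toy Monte Carlo in the spirit of this approach (not the TORMIS code) is sketched below: tornado occurrence and a heavily simplified one-dimensional missile trajectory are sampled, and the impact probability is estimated from the hit fraction. All distributions, rates, and the target geometry are assumptions.

```python
# Illustrative Monte Carlo sketch of a missile-impact risk estimate.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
p_tornado_per_year = 1e-3                   # annual tornado strike probability at the site (assumed)
target_x = (40.0, 60.0)                     # target footprint along x (m), hypothetical

hits = 0
for _ in range(n_trials):
    launch_x = rng.uniform(-200.0, 200.0)   # missile pickup point (m)
    speed = rng.weibull(2.0) * 40.0         # horizontal injection speed (m/s)
    heading = rng.choice([-1.0, 1.0])       # toward or away from the target
    flight = rng.exponential(2.0)           # flight time (s)
    x_final = launch_x + heading * speed * flight
    if target_x[0] <= x_final <= target_x[1]:
        hits += 1

p_impact_given_tornado = hits / n_trials
print("P(impact | tornado) ~", p_impact_given_tornado)
print("annual impact probability ~", p_impact_given_tornado * p_tornado_per_year)
```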

  5. Analytical Study on Thermal and Mechanical Design of Printed Circuit Heat Exchanger

    SciTech Connect (OSTI)

    Su-Jong Yoon; Piyush Sabharwall; Eung-Soo Kim

    2013-09-01

    The analytical methodologies for the thermal design, mechanical design, and cost estimation of a printed circuit heat exchanger are presented in this study. Three flow arrangements are considered: parallel flow, countercurrent flow, and crossflow. For each flow arrangement, the analytical solution for the temperature profile of the heat exchanger is introduced. The size and cost of printed circuit heat exchangers for advanced small modular reactors, which employ various coolants such as sodium, molten salts, helium, and water, are also presented.
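
    For readers who want a concrete starting point, the sketch below evaluates standard textbook effectiveness-NTU closed forms for the three flow arrangements named in the abstract; it is not the report's PCHE sizing procedure, and the NTU and capacity-rate ratio values are hypothetical.

```python
# Standard effectiveness-NTU relations for parallel, counter, and cross flow
# (both streams unmixed); a generic sketch, not the report's derivation.
import math

def effectiveness(ntu: float, cr: float, arrangement: str) -> float:
    if arrangement == "parallel":
        return (1 - math.exp(-ntu * (1 + cr))) / (1 + cr)
    if arrangement == "counter":
        if abs(cr - 1.0) < 1e-9:
            return ntu / (1 + ntu)
        e = math.exp(-ntu * (1 - cr))
        return (1 - e) / (1 - cr * e)
    if arrangement == "cross":  # both streams unmixed (approximate correlation)
        return 1 - math.exp(ntu**0.22 / cr * (math.exp(-cr * ntu**0.78) - 1))
    raise ValueError(arrangement)

ntu, cr = 3.0, 0.8   # hypothetical number of transfer units and capacity-rate ratio
for arr in ("parallel", "counter", "cross"):
    print(f"{arr:8s} effectiveness = {effectiveness(ntu, cr, arr):.3f}")
```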

  6. Measuring the Impact of Benchmarking & Transparency - Methodologies...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Measuring the Impact of Benchmarking & Transparency - Methodologies and the NYC Example Measuring the Impact of Benchmarking & Transparency - Methodologies and the NYC Example ...

  7. Chemical incident economic impact analysis methodology. (Technical...

    Office of Scientific and Technical Information (OSTI)

    Chemical incident economic impact analysis methodology. Citation Details In-Document Search Title: Chemical incident economic impact analysis methodology. You are accessing a ...

  8. Superhydrophobic analyte concentration utilizing colloid-pillar...

    Office of Scientific and Technical Information (OSTI)

    Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates Citation Details In-Document Search Title: Superhydrophobic analyte concentration utilizing ...

  9. Hydrogen Fuel Quality - Focus: Analytical Methods Development...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Fuel Quality - Focus: Analytical Methods Development & Hydrogen Fuel Quality Results Hydrogen Fuel Quality - Focus: Analytical Methods Development & Hydrogen Fuel Quality Results ...

  10. Analytical Chemistry Laboratory | Argonne National Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Chemistry Laboratory provides a broad range of analytical chemistry support services to the scientific and engineering programs. AnalyticalChemistryLaboratoryfactsheet...

  11. Savannah River Analytical Laboratories Achieve International...

    National Nuclear Security Administration (NNSA)

    Savannah River Analytical Laboratories Achieve International Standard Accreditation Tuesday, September 8, 2015 - 12:55pm Savannah River National Laboratory's FH Analytical ...

  12. Hot Water Distribution System Model Enhancements

    SciTech Connect (OSTI)

    Hoeschele, M.; Weitzel, E.

    2012-11-01

    This project involves enhancement of the HWSIM distribution system model to more accurately model pipe heat transfer. Recent laboratory testing efforts have indicated that the modeling of radiant heat transfer effects is needed to accurately characterize piping heat loss. An analytical methodology for integrating radiant heat transfer was implemented with HWSIM. Laboratory test data collected in another project was then used to validate the model for a variety of uninsulated and insulated pipe cases (copper, PEX, and CPVC). Results appear favorable, with typical deviations from lab results less than 8%.
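
    The kind of combined convective-plus-radiative pipe heat loss term described above can be illustrated as follows; this is a generic sketch, not the HWSIM formulation, and the surface coefficient, emissivity, and geometry are assumed values.

```python
# Convective plus gray-body radiative heat loss from an uninsulated pipe segment.
import math

SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W/m^2-K^4

def pipe_heat_loss(T_surf, T_air, T_surr, d_out, length, h_conv=8.0, emissivity=0.9):
    """Heat loss (W) from a pipe of outer diameter d_out (m) and given length (m)."""
    area = math.pi * d_out * length
    q_conv = h_conv * area * (T_surf - T_air)                    # convection to room air
    q_rad = emissivity * SIGMA * area * (T_surf**4 - T_surr**4)  # radiation to surroundings
    return q_conv + q_rad

# 3 m of 22 mm pipe at 50 C in a 20 C room (all temperatures in kelvin)
print(f"{pipe_heat_loss(323.15, 293.15, 293.15, 0.022, 3.0):.1f} W")
```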

  13. Reports of the AAAI 2009 Spring Symposia: Technosocial Predictive Analytics.

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.

    2009-10-01

    The Technosocial Predictive Analytics AAAI symposium was held at Stanford University, Stanford, CA, March 23-25, 2009. The goal of this symposium was to explore new methods for anticipatory analytical thinking that provide decision advantage through the integration of human and physical models. Special attention was also placed on how to leverage supporting disciplines to (a) facilitate the achievement of knowledge inputs, (b) improve the user experience, and (c) foster social intelligence through collaborative/competitive work.

  14. Analytical Tool Development for Aftertreatment Sub-Systems Integration |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Analytical Tool Development for Aftertreatment Sub-Systems Integration Analytical Tool Development for Aftertreatment Sub-Systems Integration 2003 DEER Conference Presentation: Detroit Diesel Corporation 2003_deer_bolton2.pdf (847.55 KB) More Documents & Publications Advanced Diesel Engine and Aftertreatment Technology Development for Tier 2 Emissions Update on Modeling for Effective Diesel Engine Aftertreatment Implementation - Master Plan, Status and Critical Needs

  15. Analytical Spectroscopy - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Industrial Technologies Industrial Technologies Find More Like This Return to Search Analytical Spectroscopy Idaho National Laboratory Contact INL About This Technology Technology Marketing Summary The use of lasers has become increasingly widespread, especially for manufacturing products and material analysis. Recently, laser desorption (LD) techniques for mass spectrometry have attracted attention because it produces intact molecular ions, avoids surface charging issues, and allows tuning of

  16. Improved methodology to assess modification and completion of landfill gas management in the aftercare period

    SciTech Connect (OSTI)

    Morris, Jeremy W.F.; Crest, Marion; Barlaz, Morton A.; Spokas, Kurt A.; Akerman, Anna; Yuan, Lei

    2012-12-15

    Highlights: Performance-based evaluation of landfill gas control system. Analytical framework to evaluate transition from active to passive gas control. Focus on cover oxidation as an alternative means of passive gas control. Integrates research on long-term landfill behavior with practical guidance. Abstract: Municipal solid waste landfills represent the dominant option for waste disposal in many parts of the world. While some countries have greatly reduced their reliance on landfills, there remain thousands of landfills that require aftercare. The development of cost-effective strategies for landfill aftercare is in society's interest to protect human health and the environment and to prevent the emergence of landfills with exhausted aftercare funding. The Evaluation of Post-Closure Care (EPCC) methodology is a performance-based approach in which landfill performance is assessed in four modules including leachate, gas, groundwater, and final cover. In the methodology, the objective is to evaluate landfill performance to determine when aftercare monitoring and maintenance can be reduced or possibly eliminated. This study presents an improved gas module for the methodology. While the original version of the module focused narrowly on regulatory requirements for control of methane migration, the improved gas module also considers best available control technology for landfill gas in terms of greenhouse gas emissions, air quality, and emissions of odoriferous compounds. The improved module emphasizes the reduction or elimination of fugitive methane by considering the methane oxidation capacity of the cover system. The module also allows for the installation of biologically active covers or other features designed to enhance methane oxidation. A methane emissions model, CALMIM, was used to assist with an assessment of the methane oxidation capacity of

  17. Model documentation report: Transportation sector model of the National Energy Modeling System

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  18. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    SciTech Connect (OSTI)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    2015-09-01

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain- relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into

  19. Analytic Challenges to Valuing Energy Storage Workshop Report

    Office of Energy Efficiency and Renewable Energy (EERE)

    The U.S. Department of Energy (DOE) has coordinated energy storage efforts from a research and development (R&D) perspective – identifying technology needs, metrics, and goals – but DOE and the research and analytic community have struggled with valuing storage at a systems level. Sixteen stakeholders and experts from across the electric power industry, research universities, national laboratories, and federal agencies were invited to join 8 DOE staff members in a workshop on September 19-20, 2011, in Washington, D.C. to discuss the current state of knowledge for grid-scale energy storage and, in particular, the methodologies to assess its value on the grid.

  20. Analytical Chemistry Division annual progress report for period ending December 31, 1984

    SciTech Connect (OSTI)

    Lyon, W.S.

    1985-04-01

    Progress reports are presented for the following sections: analytical methodology; mass and emission spectroscopy; radioactive materials analysis; bio/organic analysis; general and environmental analysis; and quality assurance, safety, and tabulation of analyses. In addition, a list of publications, oral presentations, and supplemental activities is included.

  1. Energy Efficiency Indicators Methodology Booklet

    SciTech Connect (OSTI)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers assess changes in energy efficiency over time. Building on past OECD experience and best practices, and on the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. For each sector of activity (industry, commercial, residential, agriculture, and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The Methodology Booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  2. Spent fuel management fee methodology and computer code user's manual.

    SciTech Connect (OSTI)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery, assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each phase constitutes a computer module: SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively.
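
    A minimal sketch of the full-cost-recovery fee logic (not the SPADE or FEAN codes) is shown below: the fee is set so that the present value of fee revenues matches the present value of expenditures. All cash flows, fuel quantities, and the discount rate are hypothetical.

```python
# Levelized fee that recovers discounted program costs from discounted fuel receipts.
costs = [200e6, 250e6, 300e6, 1500e6]        # annual program expenditures ($), assumed
fuel_kg = [2.0e6, 2.1e6, 2.2e6, 2.3e6]       # spent fuel accepted each year (kg), assumed
r = 0.03                                     # real discount rate, assumed

pv_costs = sum(c / (1 + r) ** t for t, c in enumerate(costs))
pv_fuel = sum(q / (1 + r) ** t for t, q in enumerate(fuel_kg))
fee_per_kg = pv_costs / pv_fuel              # full-cost-recovery fee ($/kg)
print(f"full cost recovery fee ~ ${fee_per_kg:.2f}/kg")
```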

  3. Visual Analytics for Power Grid Contingency Analysis

    SciTech Connect (OSTI)

    Wong, Pak C.; Huang, Zhenyu; Chen, Yousu; Mackey, Patrick S.; Jin, Shuangshuang

    2014-01-20

    Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and the society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.

  4. Cognitive Foundations for Visual Analytics

    SciTech Connect (OSTI)

    Greitzer, Frank L.; Noonan, Christine F.; Franklin, Lyndsey

    2011-02-25

    In this report, we provide an overview of scientific/technical literature on information visualization and VA. Topics discussed include an update and overview of the extensive literature search conducted for this study, the nature and purpose of the field, major research thrusts, and scientific foundations. We review methodologies for evaluating and measuring the impact of VA technologies as well as taxonomies that have been proposed for various purposes to support the VA community. A cognitive science perspective underlies each of these discussions.

  5. Analyte detection using an active assay

    DOE Patents [OSTI]

    Morozov, Victor; Bailey, Charles L.; Evanskey, Melissa R.

    2010-11-02

    Analytes may be detected using an active assay by introducing an analyte solution containing a plurality of analytes to a lacquered membrane. The lacquered membrane may be a membrane having at least one surface treated with a layer of polymers. The lacquered membrane may be semi-permeable to nonanalytes. The layer of polymers may include cross-linked polymers. A plurality of probe molecules may be arrayed and immobilized on the lacquered membrane. An external force may be applied to the analyte solution to move the analytes towards the lacquered membrane. Movement may cause some or all of the analytes to bind to the lacquered membrane. In cases where probe molecules are present, some or all of the analytes may bind to probe molecules. The direction of the external force may be reversed to remove unbound or weakly bound analytes. Bound analytes may be detected using known detection methods.

  6. Estimating Fuel Cycle Externalities: Analytical Methods and Issues, Report 2

    SciTech Connect (OSTI)

    Barnthouse, L.W.; Cada, G.F.; Cheng, M.-D.; Easterly, C.E.; Kroodsma, R.L.; Lee, R.; Shriner, D.S.; Tolbert, V.R.; Turner, R.S.

    1994-07-01

    The activities that produce electric power typically range from extracting and transporting a fuel, to its conversion into electric power, and finally to the disposition of residual by-products. This chain of activities is called a fuel cycle. A fuel cycle has emissions and other effects that result in unintended consequences. When these consequences affect third parties (i.e., those other than the producers and consumers of the fuel-cycle activity) in a way that is not reflected in the price of electricity, they are termed ''hidden'' social costs or externalities. They are the economic value of environmental, health and any other impacts, that the price of electricity does not reflect. How do you estimate the externalities of fuel cycles? Our previous report describes a methodological framework for doing so--called the damage function approach. This approach consists of five steps: (1) characterize the most important fuel cycle activities and their discharges, where importance is based on the expected magnitude of their externalities, (2) estimate the changes in pollutant concentrations or other effects of those activities, by modeling the dispersion and transformation of each pollutant, (3) calculate the impacts on ecosystems, human health, and any other resources of value (such as man-made structures), (4) translate the estimates of impacts into economic terms to estimate damages and benefits, and (5) assess the extent to which these damages and benefits are externalities, not reflected in the price of electricity. Each step requires a different set of equations, models and analysis. Analysts generally believe this to be the best approach for estimating externalities, but it has hardly been used! The reason is that it requires considerable analysis and calculation, and to this point in time, the necessary equations and models have not been assembled. Equally important, the process of identifying and estimating externalities leads to a number of complex issues
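
    The five-step damage function chain can be illustrated with a toy calculation; every number below is invented, since real assessments rely on dispersion models, dose-response functions, and valuation studies at each step.

```python
# Toy walk-through of the five-step damage function approach.
emissions_tons = 1000.0                        # (1) characterize activity discharges (assumed)
conc_per_ton = 0.002                           # (2) ambient ug/m3 per ton emitted (assumed)
exposed_population = 500_000                   #     people exposed (assumed)
cases_per_person_per_ugm3 = 1e-6               # (3) dose-response slope (assumed)
dollars_per_case = 50_000.0                    # (4) economic valuation per case (assumed)
externality_share = 0.8                        # (5) fraction not already reflected in prices (assumed)

concentration = emissions_tons * conc_per_ton
cases = concentration * cases_per_person_per_ugm3 * exposed_population
damages = cases * dollars_per_case
print(f"estimated external damages ~ ${damages * externality_share:,.0f}")
```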

  7. A Simplified Methodology for Estimating the Pressure Buildup and Hydrogen Concentration Within a 2R/6M Container

    SciTech Connect (OSTI)

    Sanchez, Lawrence C.; Ottinger, Cathy A.; Polansky, Gary F.

    2001-08-01

    A simplified and bounding methodology for analyzing the pressure buildup and hydrogen concentration within an unvented 2R container was developed (the 2R is a sealed container within a 6M package). The specific case studied was the gas buildup due to alpha radiolysis of water moisture sorbed on small quantities (less than 20 Ci per package) of plutonium oxide. Analytical solutions for gas pressure buildup and hydrogen concentration within the unvented 2R container were developed. Key results indicated that internal pressure buildup would not be significant for a wide range of conditions. Hydrogen concentrations should also be minimal but are difficult to quantify due to a large variation/uncertainty in model parameters. Additional assurance of non-flammability can be obtained by the use of an inert backfill gas in the 2R container.
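
    A bounding-style estimate of this kind can be sketched as follows: hydrogen generation from the alpha energy absorbed in sorbed water via a G-value, followed by an ideal-gas pressure rise in the free container volume. This is not the report's derivation, and every parameter value below is an assumption.

```python
# Rough radiolytic gas buildup estimate: G-value times absorbed energy, then ideal gas law.
activity_ci = 20.0                    # plutonium oxide activity (Ci), per the 20 Ci case
e_alpha_mev = 5.2                     # mean alpha energy (MeV), assumed
f_absorbed_by_water = 0.01            # fraction of alpha energy absorbed in sorbed water, assumed
g_h2 = 1.3                            # molecules H2 per 100 eV absorbed, assumed
t_seconds = 365 * 24 * 3600.0         # one year
free_volume_m3 = 0.005                # free gas volume in the 2R container (m3), assumed
T = 298.15                            # gas temperature (K)

decays_per_s = activity_ci * 3.7e10
ev_absorbed = decays_per_s * e_alpha_mev * 1e6 * f_absorbed_by_water * t_seconds
molecules_h2 = g_h2 * ev_absorbed / 100.0
moles_h2 = molecules_h2 / 6.022e23
delta_p = moles_h2 * 8.314 * T / free_volume_m3      # pressure rise (Pa), ideal gas
print(f"H2 generated ~ {moles_h2:.3f} mol, pressure rise ~ {delta_p/1000:.1f} kPa")
```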

  8. Analytical effective tensor for flow-through composites

    DOE Patents [OSTI]

    Sviercoski, Rosangela De Fatima

    2012-06-19

    A machine, method, and computer-usable medium for modeling the average flow of a substance through a composite material. Such modeling includes an analytical calculation of an effective tensor K.sup.a suitable for use with a variety of media. The analytical calculation corresponds to an approximation of the tensor K and proceeds by first computing the diagonal values and then identifying symmetries of the heterogeneity distribution. Additional calculations include determining the center of mass of the heterogeneous cell and its angle in a defined Cartesian system, then using this angle in a rotation formula to compute the off-diagonal values and determine their sign.
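
    The off-diagonal construction described in the abstract can be illustrated numerically: rotate the diagonal (principal) values by the center-of-mass angle to recover the full symmetric tensor, K = R(theta) diag(k1, k2) R(theta)^T. The values and angle below are illustrative, not taken from the patent.

```python
# Rotating principal values into a full 2x2 effective tensor.
import numpy as np

k1, k2 = 3.0, 1.0                      # diagonal (principal) effective values, assumed
theta = np.deg2rad(30.0)               # angle from the center-of-mass step, assumed

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
K_eff = R @ np.diag([k1, k2]) @ R.T    # symmetric tensor with signed off-diagonals
print(np.round(K_eff, 3))
```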

  9. eGallon Methodology | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    eGallon Methodology eGallon Methodology The average American measures the day-to-day cost of driving by the price of a gallon of gasoline. In other words, as the price of gasoline ...
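
    A hedged sketch of the eGallon idea follows: the cost of the electricity needed to drive an electric vehicle as far as a comparable gasoline car travels on one gallon. The fuel economy, EV consumption, and electricity price below are placeholders, not DOE's official inputs.

```python
# Illustrative eGallon-style calculation with assumed inputs.
mpg_comparable_gasoline_car = 28.5     # miles per gallon of comparable gasoline cars (assumed)
ev_kwh_per_mile = 0.33                 # EV electricity use per mile (assumed)
electricity_price_per_kwh = 0.13       # residential electricity price, $/kWh (assumed)

egallon = mpg_comparable_gasoline_car * ev_kwh_per_mile * electricity_price_per_kwh
print(f"eGallon ~ ${egallon:.2f}")
```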

  10. Advanced Analytics | GE Global Research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    GE Predictivity(tm) Industrial Internet Solutions: As a key player in GE's commitment to advance the Industrial Internet, the GE Software Center is at work helping industrial organizations use data, analytics, data

  11. Energy Intensity Indicators: Methodology | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Methodology Energy Intensity Indicators: Methodology The files listed below contain methodology documentation and related studies that support the information presented on this website. The files are available to view and/or download as Adobe Acrobat PDF files. 2003. Energy Indicators System: Index Construction Methodology 2004. Changing the Base Year for the Index Boyd GA, and JM Roop. 2004. "A Note on the Fisher Ideal Index Decomposition for Structural Change in Energy Intensity."
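
    The Fisher ideal index decomposition referenced in the Boyd and Roop note can be sketched with a two-sector example: the change in aggregate energy intensity factors exactly into a structure (activity-mix) effect and an efficiency (intensity) effect. The activity and energy figures below are hypothetical.

```python
# Two-factor Fisher ideal decomposition of aggregate energy intensity.
import math

activity_0 = {"industry": 100.0, "services": 100.0}   # base-year activity (assumed)
activity_t = {"industry":  90.0, "services": 130.0}   # current-year activity (assumed)
energy_0   = {"industry": 500.0, "services": 200.0}   # base-year energy use (assumed)
energy_t   = {"industry": 430.0, "services": 240.0}   # current-year energy use (assumed)

def shares(act):
    tot = sum(act.values())
    return {k: v / tot for k, v in act.items()}

def agg(s, i):
    """Aggregate intensity given activity shares s and sectoral intensities i."""
    return sum(s[k] * i[k] for k in s)

s0, st = shares(activity_0), shares(activity_t)
i0 = {k: energy_0[k] / activity_0[k] for k in energy_0}
it = {k: energy_t[k] / activity_t[k] for k in energy_t}

ratio = agg(st, it) / agg(s0, i0)                                  # change in aggregate intensity
fisher_intensity = math.sqrt((agg(s0, it) / agg(s0, i0)) * (agg(st, it) / agg(st, i0)))
fisher_structure = math.sqrt((agg(st, i0) / agg(s0, i0)) * (agg(st, it) / agg(s0, it)))
print(f"aggregate intensity ratio {ratio:.3f} = "
      f"intensity effect {fisher_intensity:.3f} x structure effect {fisher_structure:.3f}")
```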

  12. Siting Methodologies for Hydrokinetics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Siting Methodologies for Hydrokinetics Siting Methodologies for Hydrokinetics Report that provides an overview of the federal and state regulatory framework for hydrokinetic projects. siting_handbook_2009.pdf (2.43 MB) More Documents & Publications Siting Methodologies for Hydrokinetics EIS-0488: Final Environmental Impact Statement EIS-0493: Draft Environmental Impact Statement

  13. NREL: Measurements and Characterization - Analytical Microscopy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Analytical Microscopy Analytical microscopy uses various high-resolution techniques to obtain information about materials on the atomic scale. It is one of the most powerful tools available for understanding a material's basic structure, chemistry, and morphology. We use two complementary types of analytical microscopy - electron microscopy and scanning probe microscopy - together with a variety of state-of-the-art imaging and analytical tools to capture data about photovoltaic (PV) materials

  14. Analytical Resources > Research > The Energy Materials Center...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Analytical Resources: Differential Electrochemical Mass Spectroscopy (DEMS), Electron Microscopy, X-Ray Diffraction...

  15. Web-based Visual Analytics for Extreme Scale Climate Science

    SciTech Connect (OSTI)

    Steed, Chad A; Evans, Katherine J; Harney, John F; Jewell, Brian C; Shipman, Galen M; Smith, Brian E; Thornton, Peter E; Williams, Dean N.

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  16. Methodology and a preliminary data base for examining the health risks of electricity generation from uranium and coal fuels

    SciTech Connect (OSTI)

    El-Bassioni, A.A.

    1980-08-01

    An analytical model was developed to assess and examine the health effects associated with the production of electricity from uranium and coal fuels. The model is based on a systematic methodology that is both simple and easy to check, and provides details about the various components of health risk. A preliminary set of data that is needed to calculate the health risks was gathered, normalized to the model facilities, and presented in a concise manner. Additional data will become available as a result of other evaluations of both fuel cycles, and they should be included in the data base. An iterative approach involving only a few steps is recommended for validating the model. After each validation step, the model is improved in the areas where new information or increased interest justifies such upgrading. Sensitivity analysis is proposed as the best method of using the model to its full potential. Detailed quantification of the risks associated with the two fuel cycles is not presented in this report. The evaluation of risks from producing electricity by these two methods can be completed only after several steps that address difficult social and technical questions. Preliminary quantitative assessment showed that several factors not considered in detail in previous studies are potentially important. 255 refs., 21 figs., 179 tabs.

  17. Methodology for flammable gas evaluations

    SciTech Connect (OSTI)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: steady-state flammable-gas concentration resulting from continuous release, and concentration resulting from an episodic gas release.
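
    The first area of concern, the steady-state concentration from a continuous release, can be illustrated with a simple well-mixed headspace balance; the release and ventilation rates below are assumptions, and the document's actual evaluation criteria are not reproduced here.

```python
# Well-mixed steady-state flammable gas concentration compared to the LFL.
release_cfm = 0.5            # continuous flammable gas release into the headspace (ft3/min), assumed
ventilation_cfm = 100.0      # headspace ventilation flow (ft3/min), assumed
lfl_fraction = 0.04          # lower flammability limit of hydrogen in air (4 vol%)

c_ss = release_cfm / (release_cfm + ventilation_cfm)   # steady-state volume fraction
print(f"steady-state concentration ~ {100*c_ss:.2f} vol%"
      f" = {100*c_ss/lfl_fraction:.0f}% of the LFL")
```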

  18. Simulation Enabled Safeguards Assessment Methodology

    SciTech Connect (OSTI)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  19. Simulation enabled safeguards assessment methodology

    SciTech Connect (OSTI)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-07-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  20. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    SciTech Connect (OSTI)

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-12-15

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on these resources be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions are not right skewed, as expected for a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
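
    The stochastic approach described above can be sketched as a Monte Carlo aggregation with correlated, wide (lognormal) inputs rather than narrow triangular ones; the cell counts, per-cell recoveries, and correlation below are hypothetical and not taken from the Uinta-Piceance assessment.

```python
# Monte Carlo resource aggregation with correlated lognormal inputs.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# correlated standard normals for (number of productive cells, EUR per cell)
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

cells = np.exp(np.log(5000) + 0.5 * z[:, 0])      # lognormal cell count (median 5000), assumed
eur_bcf = np.exp(np.log(0.4) + 0.7 * z[:, 1])     # lognormal EUR per cell (median 0.4 BCF), assumed
resource_tcf = cells * eur_bcf / 1000.0           # total potential resource (TCF)

low, med, high = np.percentile(resource_tcf, [10, 50, 90])
print(f"10th/50th/90th percentile resource ~ {low:.1f} / {med:.1f} / {high:.1f} TCF")
```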

  1. Appendix D: Statistical Methodology of Estimating Petroleum Exports Using Data from U.S. Customs and Border Protection

    U.S. Energy Information Administration (EIA) Indexed Site

    Statistical Methodology of Estimating Petroleum Exports Using Data from U.S. Customs and Border Protection, August 31, 2016 (www.eia.gov). This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S.

  2. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    SciTech Connect (OSTI)

    Tarifeo-Saldivia, Ariel E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo; Center for Research and Applications in Plasma Physics and Pulsed Power, P4, Santiago; Departamento de Ciencias Fisicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Republica 220, Santiago ; Mayer, Roberto E.

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
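
    The core idea, calibrating the mean charge per detected neutron in pulse mode and then inferring the number of detected events in a burst from the accumulated charge, can be sketched as follows; the charge statistics and detection efficiency are illustrative, and the paper's full statistical model is not reproduced.

```python
# Burst yield from accumulated charge and a pulse-mode charge calibration.
import math

q_mean = 2.0e-12          # mean charge per single detected neutron (C), from calibration (assumed)
q_sigma = 0.6e-12         # spread of the single-event charge distribution (C), assumed
Q_burst = 1.1e-8          # accumulated charge from the burst (C), assumed
efficiency = 2.5e-4       # absolute detection efficiency (counts per emitted neutron), assumed

n_detected = Q_burst / q_mean
# relative uncertainty: Poisson counting term plus single-event charge spread
rel_unc = math.sqrt((1.0 + (q_sigma / q_mean) ** 2) / n_detected)
yield_neutrons = n_detected / efficiency
print(f"detected ~ {n_detected:.0f} +/- {100*rel_unc:.1f}% events, "
      f"total yield ~ {yield_neutrons:.2e} neutrons")
```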

  3. SASSI Analytical Methods Compared with SHAKE Results | Department...

    Office of Environmental Management (EM)

    Analytical Methods Compared with SHAKE Results SASSI Analytical Methods Compared with SHAKE Results SASSI Analytical Methods Compared with SHAKE Results Structural Mechanics - SRS...

  4. Hydrologic evaluation methodology for estimating water movement through the unsaturated zone at commercial low-level radioactive waste disposal sites

    SciTech Connect (OSTI)

    Meyer, P.D.; Rockhold, M.L.; Nichols, W.E.; Gee, G.W. [Pacific Northwest Lab., Richland, WA (United States)

    1996-01-01

    This report identifies key technical issues related to hydrologic assessment of water flow in the unsaturated zone at low-level radioactive waste (LLW) disposal facilities. In addition, a methodology for incorporating these issues in the performance assessment of proposed LLW disposal facilities is identified and evaluated. The issues discussed fall into four areas: estimating the water balance at a site (i.e., infiltration, runoff, water storage, evapotranspiration, and recharge); analyzing the hydrologic performance of engineered components of a facility; evaluating the application of models to the prediction of facility performance; and estimating the uncertainty in predicted facility performance. To illustrate the application of the methodology, two examples are presented. The first example is of a below ground vault located in a humid environment. The second example looks at a shallow land burial facility located in an arid environment. The examples utilize actual site-specific data and realistic facility designs. The two examples illustrate the issues unique to humid and arid sites as well as the issues common to all LLW sites. Strategies for addressing the analytical difficulties arising in any complex hydrologic evaluation of the unsaturated zone are demonstrated.
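
    The water-balance bookkeeping at the heart of such an assessment can be stated in one line, as sketched below with hypothetical annual values.

```python
# Net recharge through the unsaturated zone from an annual water balance.
precipitation, runoff, evapotranspiration, delta_storage = 1100.0, 150.0, 820.0, 20.0  # mm/yr, assumed
recharge = precipitation - runoff - evapotranspiration - delta_storage
print(f"estimated recharge ~ {recharge:.0f} mm/yr")
```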

  5. Experimental and Analytical Research on Fracture Processes in Rock

    SciTech Connect (OSTI)

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted, which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were taken, as were initial investigations of the effect of pressurized water on fracture coalescence.

  6. New Methodology for Estimating Fuel Economy by Vehicle Class

    SciTech Connect (OSTI)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, the preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year s Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumptions for different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
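
    The reconciliation step can be sketched as a small constrained optimization: adjust the preliminary MPG estimates as little as possible while forcing class-level fuel use to sum to the reported total. The VMT, preliminary MPG, and total-fuel figures below are hypothetical, and the published model's exact objective is not reproduced.

```python
# Reconciling class-level MPG estimates to a reported national fuel total.
import numpy as np
from scipy.optimize import minimize

vmt = np.array([1600.0, 1100.0, 300.0])          # billion miles by vehicle class, assumed
mpg0 = np.array([23.0, 17.5, 6.8])               # preliminary MPG from vehicle stock models, assumed
total_fuel = 185.0                               # reported total fuel use (billion gallons), assumed

objective = lambda mpg: np.sum((mpg - mpg0) ** 2)                          # stay close to priors
constraint = {"type": "eq", "fun": lambda mpg: np.sum(vmt / mpg) - total_fuel}

res = minimize(objective, mpg0, method="SLSQP", constraints=[constraint],
               bounds=[(1.0, None)] * len(mpg0))
print("reconciled MPG by class:", np.round(res.x, 2))
print("implied total fuel:", round(float(np.sum(vmt / res.x)), 1))
```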

  7. Seismic Fracture Characterization Methodologies for Enhanced Geothermal Systems

    Office of Scientific and Technical Information (OSTI)

    Executive Summary: The overall objective of this work was the development of surface and borehole seismic methodologies using both compressional and shear waves for characterizing faults and fractures in Enhanced Geothermal Systems. We used both

  8. ANALYTICAL SOLUTIONS OF SINGULAR ISOTHERMAL QUADRUPOLE LENS

    SciTech Connect (OSTI)

    Chu Zhe; Lin, W. P.; Yang Xiaofeng E-mail: linwp@shao.ac.cn

    2013-06-20

    Using an analytical method, we study the singular isothermal quadrupole (SIQ) lens system, which is the simplest lens model that can produce four images. In this case, the radial mass distribution is in accord with the profile of the singular isothermal sphere lens, and the tangential distribution is given by adding a quadrupole to the monopole component. The basic properties of the SIQ lens have been studied in this Letter, including the deflection potential, deflection angle, magnification, critical curve, caustic, pseudo-caustic, and transition locus. Analytical solutions of the image positions and magnifications for the source on axes are derived. We find that naked cusps will appear when the relative intensity k of quadrupole to monopole is larger than 0.6. According to the magnification invariant theory of the SIQ lens, the sum of the signed magnifications of the four images should be equal to unity, as found by Dalal. However, if a source lies in the naked cusp, the summed magnification of the remaining three images is smaller than the invariant 1. With this simple lens system, we study the situations where a point source approaches arbitrarily close to a cusp or a fold. The sum of the magnifications of the cusp image triplet is usually not equal to 0, and it is usually positive for major cusps while negative for minor cusps. Similarly, the sum of magnifications of the fold image pair is usually not equal to 0 either. Nevertheless, the cusp and fold relations still approach 0, because by definition the summed values are divided by absolute magnifications that become infinite.

  9. Siting Methodologies for Hydrokinetics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Siting Methodologies for Hydrokinetics Report that provides an overview of the federal and state regulatory framework for hydrokinetic projects. PDF icon sitinghandbook2009.pdf ...

  10. Solutia: Massachusetts Chemical Manufacturer Uses SECURE Methodology...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Consumption Solutia: Massachusetts Chemical Manufacturer Uses SECURE Methodology to Identify Potential Reductions in Utility and Process Energy Consumption This case ...