Nuclear Forensic Inferences Using Iterative Multidimensional Statistics
Robel, M; Kristo, M J; Heller, M A
2009-06-09
Nuclear forensics involves the analysis of interdicted nuclear material for specific material characteristics (referred to as 'signatures') that imply specific geographical locations, production processes, culprit intentions, etc. Predictive signatures rely on expert knowledge of physics, chemistry, and engineering to develop inferences from these material characteristics. Comparative signatures, on the other hand, rely on comparison of the material characteristics of the interdicted sample (the 'questioned sample' in FBI parlance) with those of a set of known samples. In the ideal case, the set of known samples would be a comprehensive nuclear forensics database, a database which does not currently exist. In fact, our ability to analyze interdicted samples and produce an extensive list of precise material characteristics far exceeds our ability to interpret the results. Therefore, as we seek to develop the extensive databases necessary for nuclear forensics, we must also develop the methods needed to draw inferences from comparison of our analytical results with these large, multidimensional datasets. In the work reported here, we used a large, multidimensional dataset of results from quality control analyses of uranium ore concentrate (UOC, sometimes called 'yellowcake'). We have found that traditional multidimensional techniques, such as principal component analysis (PCA), are especially useful for understanding such datasets and drawing relevant conclusions. In particular, we have developed an iterative partial least squares-discriminant analysis (PLS-DA) procedure that has proven especially adept at identifying the production location of unknown UOC samples. By removing classes which fell far outside the initial decision boundary, and then rebuilding the PLS-DA model, we have consistently produced better and more definitive attributions than with a single-pass classification approach.
Performance of the iterative PLS-DA method compared favorably to that of classification and regression tree (CART) and k-nearest neighbor (KNN) algorithms, offering the best combination of accuracy and robustness, as tested by classifying samples measured independently in our laboratories against the vendor-QC-based reference set.
Statistical Inference for Big Data Problems in Molecular Biophysics
Ramanathan, Arvind; Savol, Andrej; Burger, Virginia; Quinn, Shannon; Agarwal, Pratul K; Chennubhotla, Chakra
2012-01-01
We highlight the role of statistical inference techniques in providing biological insights from the analysis of long time-scale molecular simulation data. Technological and algorithmic improvements in computation have brought molecular simulations to the forefront of techniques applied to investigating the basis of living systems. While these longer simulations, now increasingly complex and reaching petabyte scales, promise a detailed view into microscopic behavior, teasing out the important information has become a true challenge in its own right. Mining these data for important patterns is critical to automating the discovery of therapeutic interventions, improving protein design, and fundamentally understanding the mechanistic basis of cellular homeostasis.
A Statistical Perspective on Highly Accelerated Testing.
Thomas, Edward V.
2015-02-01
Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. 
The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the assumed relationship between the stress level and performance. In addition, this document contains recommendations for conducting more informative accelerated tests.
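The "demonstration" claim criticized above has a simple binomial form: if n units are tested at the accelerated stress and none fail, the one-sided lower confidence bound on reliability at that stress solves R^n = 1 - C. A minimal sketch (the 22-unit example is hypothetical); note the bound applies only at the tested stress level, and transferring it to normal use conditions requires exactly the assumed stress-performance relationship the report questions:

```python
def demonstrated_reliability(n_units, confidence):
    """Lower confidence bound on reliability at the tested stress when
    n_units are tested and zero failures occur: solves R**n = 1 - C."""
    return (1.0 - confidence) ** (1.0 / n_units)

# e.g. 22 units with no failures at the accelerated stress, 90% confidence
bound = demonstrated_reliability(22, 0.90)   # roughly 0.90 at that stress
```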
n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator
Energy Science and Technology Software Center (OSTI)
2012-09-12
nSIGHTS (n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator) is a comprehensive well test analysis software package. It provides a user-interface, a well test analysis model and many tools to analyze both field and simulated data. The well test analysis model simulates a single-phase, one-dimensional, radial/non-radial flow regime, with a borehole at the center of the modeled flow system. nSIGHTS solves the radially symmetric n-dimensional forward flow problem using a solver based on a graph-theoretic approach. The results of the forward simulation are pressure and flow rate, given all the input parameters. The parameter estimation portion of nSIGHTS uses a perturbation-based approach to interpret the best-fit well and reservoir parameters, given an observed dataset of pressure and flow rate.
Quantum Statistical Testing of a Quantum Random Number Generator
Humble, Travis S
2014-01-01
The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.
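One classical ingredient of testing for device bias is a frequency test on the output bits; a minimal sketch follows (the quantum statistical testing discussed in the paper goes beyond this classical check, which is included only to ground the idea):

```python
import math
from collections import Counter

def frequency_bias_test(bits):
    """Classical chi-square frequency test for 0/1 bias (1 degree of
    freedom); the p-value for 1 dof is P(X > x) = erfc(sqrt(x / 2))."""
    n = len(bits)
    counts = Counter(bits)
    expected = n / 2.0
    chi2 = sum((counts.get(b, 0) - expected) ** 2 / expected for b in (0, 1))
    return chi2, math.erfc(math.sqrt(chi2 / 2.0))

chi2, p = frequency_bias_test([0, 1] * 500)   # perfectly balanced sample
```

A strongly biased device would yield a large chi-square statistic and a p-value near zero.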
Blanc, Guillermo A.; Kewley, Lisa; Vogt, Frédéric P. A.; Dopita, Michael A.
2015-01-10
We present a new method for inferring the metallicity (Z) and ionization parameter (q) of H II regions and star-forming galaxies using strong nebular emission lines (SELs). We use Bayesian inference to derive the joint and marginalized posterior probability density functions for Z and q given a set of observed line fluxes and an input photoionization model. Our approach allows the use of arbitrary sets of SELs and the inclusion of flux upper limits. The method provides a self-consistent way of determining the physical conditions of ionized nebulae that is not tied to the arbitrary choice of a particular SEL diagnostic and uses all the available information. Unlike theoretically calibrated SEL diagnostics, the method is flexible and not tied to a particular photoionization model. We describe our algorithm, validate it against other methods, and present a tool that implements it called IZI. Using a sample of nearby extragalactic H II regions, we assess the performance of commonly used SEL abundance diagnostics. We also use a sample of 22 local H II regions having both direct and recombination line (RL) oxygen abundance measurements in the literature to study discrepancies in the abundance scale between different methods. We find that oxygen abundances derived through Bayesian inference using currently available photoionization models in the literature can be in good (∼30%) agreement with RL abundances, although some models perform significantly better than others. We also confirm that abundances measured using the direct method are typically ∼0.2 dex lower than both RL and photoionization-model-based abundances.
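The grid-based Bayesian machinery behind a tool like IZI can be sketched as follows. The "photoionization model" here is a purely illustrative analytic stand-in, not a real model grid, and the flux value, uncertainty, and flat priors are assumptions:

```python
import numpy as np

# Illustrative stand-in for a photoionization model grid: predicted line
# flux as a function of metallicity Z and ionization parameter q.
Z_grid = np.linspace(0.1, 2.0, 100)
q_grid = np.linspace(1e7, 1e8, 100)
ZZ, QQ = np.meshgrid(Z_grid, q_grid, indexing="ij")
model_flux = ZZ * np.log10(QQ)

obs_flux, sigma = 8.0, 0.5          # one observed line flux and its error

# Gaussian likelihood on the (Z, q) grid with flat priors
log_like = -0.5 * ((obs_flux - model_flux) / sigma) ** 2
post = np.exp(log_like - log_like.max())
post /= post.sum()

# marginalized posteriors, as in the joint/marginal PDFs the method returns
pZ = post.sum(axis=1)
pq = post.sum(axis=0)
```

In practice several lines (and upper limits) multiply into the joint likelihood, which is what breaks the degeneracy a single line leaves between Z and q.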
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. 
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. A major issue in wind power generation is sudden, large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
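The Nadaraya-Watson estimator named above is, at heart, a kernel-weighted local average. A minimal sketch on a synthetic wind power curve (the cubic power model, noise level, and bandwidth are assumptions for illustration, not the report's actual configuration):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Kernel regression: Gaussian-weighted local average of y_train."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
speed = np.linspace(0.0, 20.0, 200)                         # wind speed, m/s
power = np.clip((speed / 12.0) ** 3, 0.0, 1.0) + 0.05 * rng.normal(size=200)
est = nadaraya_watson(speed, power, np.array([6.0, 12.0]), bandwidth=1.0)
```

The same weighting idea, applied to forecast errors rather than the power curve itself, yields a full conditional density rather than a single point forecast.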
Statistical Analysis of Transient Cycle Test Results in a 40...
Broader source: Energy.gov (indexed) [DOE]
Effects of "new" engine testing procedures (40 CFR Part 1065) with respect to repeatability of transient engine dynamometer tests were examined, as well as the effects of calibration and measurement methods.
Steffen, Jason H.; Ford, Eric B.; Rowe, Jason F.; Fabrycky, Daniel C.; Holman, Matthew J.; Welsh, William F.; Borucki, William J.; Batalha, Natalie M.; Bryson, Steve; Caldwell, Douglas A.; Ciardi, David R.; /Caltech /NASA, Ames /SETI Inst., Mtn. View
2012-01-01
We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through Quarter six (Q6) of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.
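The basic TTV statistic is the deviation of observed transit times from a best-fit linear ephemeris (the "O-C" residual). A minimal sketch with synthetic transit times; the period, reference epoch, and the injected 0.01-day perturbation are assumed values:

```python
import numpy as np

def ttv_residuals(epochs, transit_times):
    """Fit a linear ephemeris t = t0 + P * n and return O-C residuals."""
    coeffs = np.polyfit(epochs, transit_times, 1)   # slope = P, intercept = t0
    return transit_times - np.polyval(coeffs, epochs)

epochs = np.arange(10)
times = 100.0 + 3.5 * epochs            # strictly periodic transits (days)
times[5] += 0.01                         # one perturbed transit: a TTV signal
resid = ttv_residuals(epochs, times)
flagged = int(np.argmax(np.abs(resid)))  # epoch with the largest deviation
```

Statistical tests like those in the paper then ask whether such residuals are larger, or more correlated, than measurement noise allows.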
Penarrubia, Jorge; Walker, Matthew G.
2012-11-20
We introduce the Minimum Entropy Method, a simple statistical technique for constraining the Milky Way gravitational potential and simultaneously testing different gravity theories directly from 6D phase-space surveys and without adopting dynamical models. We demonstrate that orbital energy distributions that are separable (i.e., independent of position) have an associated entropy that increases under wrong assumptions about the gravitational potential and/or gravity theory. Of known objects, 'cold' tidal streams from low-mass progenitors follow orbital distributions that most nearly satisfy the condition of separability. Although the orbits of tidally stripped stars are perturbed by the progenitor's self-gravity, systematic variations of the energy distribution can be quantified in terms of the cross-entropy of individual tails, giving further sensitivity to theoretical biases in the host potential. The feasibility of using the Minimum Entropy Method to test a wide range of gravity theories is illustrated by evolving restricted N-body models in a Newtonian potential and examining the changes in entropy introduced by Dirac, MONDian, and f(R) gravity modifications.
Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.
2014-12-15
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of 0.5 and 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations.
Results: The precision of the MLC performance monitoring QC test and the MLC itself was within 0.22 mm for most MLC leaves, and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units, and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. Conclusions: An MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performance for two commercially available MLC models has been assessed, and the results support a monthly test frequency for the widely accepted 1 mm specification. A higher QC test frequency is, however, required to maintain tighter specifications and in-control behavior.
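The per-leaf individuals control charts used above can be sketched as follows; the 2.66 factor is the standard Shewhart constant for individuals charts with n=2 moving ranges, and the simulated 0.1 mm leaf noise is an assumption for illustration:

```python
import numpy as np

def individuals_control_limits(x):
    """Shewhart individuals (I) chart: center line +/- 2.66 * mean moving
    range, where 2.66 is the standard constant for n=2 moving ranges."""
    center = x.mean()
    width = 2.66 * np.abs(np.diff(x)).mean()
    return center - width, center + width

rng = np.random.default_rng(3)
leaf_error = rng.normal(0.0, 0.1, size=50)    # simulated leaf errors (mm)
lcl, ucl = individuals_control_limits(leaf_error)
n_out = int(((leaf_error < lcl) | (leaf_error > ucl)).sum())
```

For an in-control leaf, almost no points fall outside the limits; a drifting leaf produces a run of out-of-control points that the monitoring system would flag.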
Hybrid Statistical Testing for Nuclear Material Accounting Data and/or Process Monitoring Data
Ticknor, Lawrence O.; Hamada, Michael Scott; Sprinkle, James K.; Burr, Thomas Lee
2015-04-14
The two tests employed in the hybrid testing scheme are Page’s cumulative sums for all streams within a Balance Period (maximum of the maximums and average of the maximums) and Crosier’s multivariate cumulative sum applied to incremental cumulative sums across Balance Periods. The role of residuals for both kinds of data is discussed.
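Page's one-sided cumulative sum, the first ingredient of the hybrid scheme, can be sketched as follows (the reference value k, decision threshold h, and the residual stream are illustrative assumptions):

```python
def page_cusum(residuals, k, h):
    """One-sided Page CUSUM: S_t = max(0, S_{t-1} + x_t - k); indices
    where S_t exceeds the decision threshold h are returned as alarms."""
    s, alarms = 0.0, []
    for t, x in enumerate(residuals):
        s = max(0.0, s + x - k)
        if s > h:
            alarms.append(t)
    return alarms

# ten in-control residuals, then a sustained one-unit loss signal
alarms = page_cusum([0.0] * 10 + [1.0] * 6, k=0.5, h=2.0)
```

The sum ignores noise below the reference value k but accumulates a sustained shift, which is why CUSUM-type tests detect slow, protracted losses that a single material balance test would miss.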
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Burr, Tom; Hamada, Michael S.; Ticknor, Larry; Sprinkle, James
2015-01-01
The aim of nuclear safeguards is to ensure that special nuclear material is used for peaceful purposes. Historically, nuclear material accounting (NMA) has provided the quantitative basis for monitoring for nuclear material loss or diversion, and process monitoring (PM) data is collected by the operator to monitor the process. PM data typically support NMA in various ways, often by providing a basis to estimate some of the in-process nuclear material inventory. We develop options for combining PM residuals and NMA residuals (residual = measurement - prediction), using a hybrid of period-driven and data-driven hypothesis testing. The modified statistical tests can be used on time series of NMA residuals (the NMA residual is the familiar material balance), or on a combination of PM and NMA residuals. The PM residuals can be generated on a fixed time schedule or as events occur.
Methods for Bayesian power spectrum inference with galaxy surveys
Jasche, Jens; Wandelt, Benjamin D.
2013-12-10
We derive and implement a full Bayesian large scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves upon previous Bayesian methods by performing a joint inference of the three-dimensional density field, the cosmological power spectrum, luminosity dependent galaxy biases, and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate subsamples. This method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a deterministic reversible jump algorithm. This approach reduces the correlation length of the sampler by several orders of magnitude, turning the otherwise numerically unfeasible problem of joint parameter exploration into a numerically manageable task. We test our method on an artificial mock galaxy survey, emulating characteristic features of the Sloan Digital Sky Survey data release 7, such as its survey geometry and luminosity-dependent biases. These tests demonstrate the numerical feasibility of our large scale Bayesian inference framework when the parameter space has millions of dimensions. This method reveals and correctly treats the anti-correlation between bias amplitudes and power spectrum, which is not taken into account in current approaches to power spectrum estimation and amounts to a 20% effect across large ranges in k space.
In addition, this method results in constrained realizations of density fields obtained without assuming the power spectrum or bias parameters in advance.
Inferences On The Hydrothermal System Beneath The Resurgent Dome In Long Valley Caldera, East-Central California, USA, From Recent Pumping Tests And Geochemical Sampling
Model-Based Sampling and Inference
U.S. Energy Information Administration (EIA) Indexed Site
Model-Based Sampling, Inference and Imputation James R. Knaub, Jr., Energy Information Administration, EI-53.1 James.Knaub@eia.doe.gov Key Words: Survey statistics, Randomization, Conditionality, Random sampling, Cutoff sampling Abstract: Picking a sample through some randomization mechanism, such as random sampling within groups (stratified random sampling), or, say, sampling every fifth item (systematic random sampling), may be familiar to a lot of people. These are design-based samples.
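Model-based inference under cutoff sampling, one of the sampling schemes named in the abstract, can be sketched with a simple ratio estimator: only the largest units are observed, a proportional model y ≈ b·x is fit to them, and the unobserved tail is predicted from known auxiliary sizes. All numbers below are invented for illustration:

```python
import numpy as np

# known auxiliary sizes for all 5 units; y observed only above the cutoff
x = np.array([100.0, 80.0, 60.0, 10.0, 5.0])
y_obs = np.array([210.0, 155.0, 123.0])          # the 3 largest units only

b = y_obs.sum() / x[:3].sum()                    # proportional model fit
total_estimate = y_obs.sum() + b * x[3:].sum()   # observed + predicted tail
```

The design choice here is conditionality: the estimator's justification rests on the model linking y to x, not on a randomization mechanism, which is the contrast the paper draws with design-based samples.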
Data free inference with processed data products
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Chowdhary, K.; Najm, H. N.
2014-07-12
Here, we consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the data is unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate consistent data with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors, to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.
Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance
Ahn, Tae-Hyuk; Crosskey, JJ; Pan, Chongle
2015-01-01
Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.
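Capability (ii), assigning each metagenomic read to its most likely reference genome, can be sketched with a simple per-base error model; the error rate, read length, and mismatch counts below are illustrative assumptions, not Sigma's actual model parameters:

```python
import math

def read_log_likelihood(mismatches, read_len, err=0.01):
    """log P(read | genome) under a uniform per-base error model:
    probability (1 - err) per matching base, err per mismatching base."""
    return (read_len - mismatches) * math.log(1.0 - err) \
        + mismatches * math.log(err)

# mismatch counts of one 100 bp read against two candidate strains
mismatches = {"strainA": 0, "strainB": 3}
best = max(mismatches, key=lambda g: read_log_likelihood(mismatches[g], 100))
```

Reads assigned this way to a single strain are then the basis for strain variant calling.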
Computing contingency statistics in parallel.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-09-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
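The map-reduce structure described above can be sketched in a few lines: each process builds a local contingency table, and tables merge by addition, which is why communication grows with table size rather than data size (the toy chunks below stand in for per-process data):

```python
from collections import Counter
from functools import reduce

def local_contingency(pairs):
    """Map step: each process counts (x, y) co-occurrences locally."""
    return Counter(pairs)

def merge_tables(tables):
    """Reduce step: contingency tables merge by simple addition, so the
    communication cost depends on table size, not on data size."""
    return reduce(lambda a, b: a + b, tables)

chunks = [[("a", 0), ("a", 1)], [("a", 0), ("b", 1)], [("b", 1)]]
table = merge_tables(local_contingency(c) for c in chunks)
```

When the data are quasi-diffuse (nearly every pair unique), the merged table approaches the data in size and the communication advantage vanishes, matching the paper's observed scalability limit.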
Pekney, Natalie J.; Cheng, Hanqi; Small, Mitchell J.
2015-11-05
Abstract: The objective of the current work was to develop a statistical method and associated tool to evaluate the impact of oil and natural gas exploration and production activities on local air quality.
DOE Science Showcase - Bayesian Inference
Office of Scientific and Technical Information (OSTI)
For 250 years, the use of Bayesian inference methods has consistently been an important tool in estimating probabilities, given knowledge of certain related probabilities. These methods essentially provide a mathematical framework for rationally and coherently propagating uncertainty. The use of Bayesian statistical methods has increased in recent years due to the availability of simulation-based computational tools for
Candidate Assembly Statistical Evaluation
Energy Science and Technology Software Center (OSTI)
1998-07-15
The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Ahn, Tae-Hyuk; Crosskey, JJ; Pan, Chongle
2015-01-01
Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.
UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE
Sanders, N. E.; Soderberg, A. M.; Betancourt, M.
2015-02-10
Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.
Assessment of Inferred Geothermal Resource: Longavi Project, Chile
OpenEI Reference Library Report. Organization: Hot Rock ...
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Statistical assessment of Monte Carlo distributional tallies
Kiedrowski, Brian C; Solomon, Clell J
2010-12-09
Four tests are developed to assess the statistical reliability of distributional or mesh tallies. To this end, the relative variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality.
Quantum chaos and statistical nuclear physics
Not Available
1986-01-01
This book contains 33 selections. Some of the titles are: Chaotic motion and statistical nuclear theory; Test of spectrum and strength fluctuations with proton resonances; Nuclear level densities and level spacing distributions; Spectral statistics of scale invariant systems; and Antiunitary symmetries and energy level statistics.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ARM Facility Statistics: 2015 Quarterly Reports (First through Fourth Quarter, PDF); Historical Statistics; Field Campaigns; Operational ...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
The tools used for this evaluation included statistical software to track Internet usage, ... * 123LogAnalyzer and Google Analytics statistical software packages * The LM National ...
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
Spring 2008 meeting of the American Statistical Association Committee on Energy Statistics with the Energy Information Administration, Washington, D.C., Wednesday, April 9, 2008. Participants (Committee on Energy Statistics): Nagaraj K. Neerchal (Department of Mathematics and Statistics, University of Maryland), Edward A. Blair (University of Houston), Barbara Forsyth (University of Maryland), Derek R. Bingham (University of Michigan), Calvin A. Kent, ...
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
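The dice example above can be made concrete. The following sketch (illustrative only, not taken from the lectures) solves the forward probability problem a priori, then the inverse statistical problem by estimating face probabilities from observed rolls (the maximum-likelihood estimate):

```python
from itertools import product
from fractions import Fraction

# Forward problem: the a priori probability that two fair dice total 7.
outcomes = list(product(range(1, 7), repeat=2))       # all 36 equally likely pairs
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
# p_seven == Fraction(1, 6)

# Inverse problem (statistics): from a set of actual rolls, estimate the
# probability of each face by its relative frequency.
rolls = [1, 3, 3, 5, 6, 3, 2, 4]                      # invented sample data
estimate = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
# estimate[3] == 0.375
```

The forward answer is exact; the inverse answer is only an estimate whose quality depends on the sample size, which is precisely the difficulty the lectures describe.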
Step-Stress Accelerated Degradation Testing for Solar Reflectors: Preprint
Jones, W.; Elmore, R.; Lee, J.; Kennedy, C.
2011-09-01
To meet the challenge of reducing the cost of electricity generated with concentrating solar power (CSP), new low-cost reflector materials, including metalized polymer reflectors, are being developed and must be tested and validated against appropriate failure mechanisms. We explore the application of testing methods and statistical inference techniques for quantifying estimates and improving lifetimes of CSP reflectors associated with failure mechanisms initiated by exposure to the ultraviolet (UV) part of the solar spectrum. In general, a suite of durability and reliability tests is available for testing a variety of failure mechanisms, where the results of the set are required to understand the overall lifetime of a CSP reflector. We focus on the use of the Ultra-Accelerated Weathering System (UAWS) as a testing device for assessing various degradation patterns attributable to accelerated UV exposure. Depending on the number of samples, test conditions, and degradation and failure patterns, test results may be used to derive insight into failure mechanisms, associated physical parameters, lifetimes, and uncertainties. In the most complicated case, warranting advanced planning and statistical inference, step-stress accelerated degradation testing (SSADT) methods may be applied.
Notes on power of normality tests of error terms in regression models
Střelec, Luboš
2015-03-10
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to detect non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to make inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
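As an illustration of the kind of residual normality check discussed above, here is a sketch of the classical moment-based Jarque-Bera statistic (chosen for simplicity; it is not the RT class the contribution introduces) applied to two invented sets of simulated regression residuals:

```python
import random

def jarque_bera(x):
    """Jarque-Bera statistic from sample skewness and excess kurtosis.
    Large values suggest non-normal residuals; under normality the statistic
    is approximately chi-squared with 2 degrees of freedom."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n   # second central moment
    m3 = sum((v - mean) ** 3 for v in x) / n   # third central moment
    m4 = sum((v - mean) ** 4 for v in x) / n   # fourth central moment
    skew = m3 / m2 ** 1.5
    excess_kurt = m4 / m2 ** 2 - 3.0
    return n / 6.0 * (skew ** 2 + excess_kurt ** 2 / 4.0)

random.seed(0)
normal_resid = [random.gauss(0, 1) for _ in range(500)]       # well-behaved errors
skewed_resid = [random.expovariate(1.0) for _ in range(500)]  # skewed errors
# jarque_bera(normal_resid) should be small; jarque_bera(skewed_resid) large.
```

Robust variants of such tests exist precisely because moment-based statistics like this one are themselves sensitive to outliers in the tails.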
Qai, Qiang; Rushton, Gerald; Bhaduri, Budhendra L; Bright, Eddie A; Coleman, Phil R
2006-01-01
The objective of this research is to compute population estimates by age and sex for small areas whose boundaries are different from those for which the population counts were made. In our approach, population surfaces and age-sex proportion surfaces are separately estimated. Age-sex population estimates for small areas and their confidence intervals are then computed using a binomial model with the two surfaces as inputs. The approach was implemented for Iowa using a 90 m resolution population grid (LandScan USA) and U.S. Census 2000 population. Three spatial interpolation methods, the areal weighting (AW) method, the ordinary kriging (OK) method, and a modification of the pycnophylactic method, were used on Census Tract populations to estimate the age-sex proportion surfaces. To verify the model, age-sex population estimates were computed for paired Block Groups that straddled Census Tracts and therefore were spatially misaligned with them. The pycnophylactic method and the OK method were more accurate than the AW method. The approach is general and can be used to estimate subgroup-count types of variables from information in existing administrative areas for custom-defined areas used as the spatial basis of support in other applications.
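Of the three interpolation methods compared above, areal weighting (AW) is the simplest to sketch: a target zone's population is the sum, over the source zones it overlaps, of each source population weighted by the fraction of that source's area falling inside the target. The zone populations and areas below are invented for illustration, not taken from the Iowa study:

```python
def areal_weighting(source_pops, source_areas, overlap_areas):
    """Areal weighting estimate of a target zone's population.
    source_pops[i]: population of source zone i;
    source_areas[i]: total area of source zone i;
    overlap_areas[i]: area of its intersection with the target zone."""
    return sum(pop * (overlap / area)
               for pop, area, overlap in zip(source_pops, source_areas, overlap_areas))

# A target area straddling two tracts: 25% of tract A, 50% of tract B.
pop = areal_weighting([4000, 2000], [10.0, 8.0], [2.5, 4.0])
# pop == 4000 * 0.25 + 2000 * 0.5 == 2000.0
```

The method's implicit assumption, uniform population density within each source zone, is what the pycnophylactic and kriging approaches relax, which is consistent with their better accuracy in the verification above.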
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): table of contents covering summary, mandatory funding, energy supply, and non-defense site acceleration.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report April - June 2014 This report was...
International Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Eia.gov beta data portal: independent statistics and analysis on sources and uses, including petroleum ...
ARM - Historical Visitor Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Operational Visitors and Accounts, Data Archive and Usage (October 1995 - Present). As a national user facility, ARM is required to report ...
Statistical Fault Detection & Diagnosis Expert System
Energy Science and Technology Software Center (OSTI)
1996-12-18
STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
International Energy Statistics - EIA
Gasoline and Diesel Fuel Update (EIA)
Navigation page for international energy statistics: Petroleum (production, consumption, capacity, bunker fuels, stocks, reserves, imports, exports, CO2 emissions, heat content; annual and monthly/quarterly), Natural Gas (all flows, production, consumption, reserves, imports, exports, carbon dioxide emissions, heat content), and Coal (all flows, production, consumption, reserves, imports).
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
March 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report October - December 2014 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
Committee on Energy Statistics fall meeting, Friday, October 17, 2003. The Committee met in Room 8E089 in the Forrestal Building, 1000 Independence Avenue, S.W., Washington, D.C., at 8:30 a.m., Jay Breidt, Chair, presiding. Present: F. Jay Breidt (Chair), Nicolas Hengartner (Vice Chair), Johnny Blair, Mark Burton, Jae Edmonds, Moshe Feder, James K. Hammitt, ...
AMERICAN STATISTICAL ASSOCIATION (ASA)
U.S. Energy Information Administration (EIA) Indexed Site
Meeting of the Committee on Energy Statistics with the Energy Information Administration (EIA), Washington, D.C., Friday, April 29, 2005. Committee members: Nicolas Hengartner (Chair, Los Alamos National Laboratory), Mark Bernstein (RAND Corporation), Cutler Cleveland (Center for Energy and Environmental Studies), Jae Edmonds (Pacific Northwest National Laboratory), Moshe Feder (Research Triangle Institute), Barbara Forsyth (Westat), Walter Hill (St. Mary's College of Maryland), ...
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-04
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-25
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2010-07-13
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
MISR-Derived Statistics of Cumulus Geometry at TWP Site
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... derived from satellite (Figure 4a) and surface (Figure 5a) observations. Summary: To test the potential for deriving the basic statistics (mean, standard deviation, and ...
Statistical methods for environmental pollution monitoring
Gilbert, R.O.
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
Utilizing the sequential probability ratio test for building joint monitoring
Allen, D. W.; Sohn, H.; Worden, K.; Farrar, C. R.
2002-01-01
In this application of the statistical pattern recognition paradigm, a prediction model of a chosen feature is developed from the time domain response of a baseline structure. After the model is developed, subsequent feature sets are tested against the model to determine if a change in the feature has occurred. In the proposed statistical inference for damage identification there are two basic hypotheses: (1) the model can predict the feature, in which case the structure is undamaged, or (2) the model cannot accurately predict the feature, suggesting that the structure is damaged. The Sequential Probability Ratio Test (SPRT) provides a statistical method that quickly arrives at a decision between these two hypotheses and is applicable to continuous monitoring. In the original formulation of the SPRT algorithm, the feature is assumed to be Gaussian and thresholds are set accordingly. It is likely, however, that the feature used for damage identification is sensitive to the tails of the distribution and that the tails may not necessarily be governed by Gaussian characteristics. By modeling the tails using the technique of Extreme Value Statistics, the hypothesis decision thresholds for the SPRT algorithm may be set avoiding the normality assumption. The SPRT algorithm is utilized to decide if the test structure is undamaged or damaged and which joint is exhibiting the change.
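Wald's SPRT, as described above, accumulates a log-likelihood ratio observation by observation and stops as soon as it crosses either decision threshold. A minimal sketch for a Gaussian feature with known variance follows (this is the textbook formulation with the normality assumption, not the Extreme Value Statistics variant the paper proposes; the data and parameters are invented):

```python
import math

def sprt(data, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's Sequential Probability Ratio Test for a Gaussian mean.
    H0: mean = mu0 (undamaged); H1: mean = mu1 (damaged).
    Returns 'H0', 'H1', or 'continue' if no decision is reached yet."""
    lower = math.log(beta / (1.0 - alpha))   # accept H0 at or below this
    upper = math.log((1.0 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for x in data:
        # Log-likelihood ratio of one observation under H1 versus H0.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr <= lower:
            return "H0"
        if llr >= upper:
            return "H1"
    return "continue"

# Feature values drifting toward mu1 = 1 trigger a 'damaged' decision quickly.
decision = sprt([0.9, 1.1, 1.0, 0.8, 1.2, 1.0], mu0=0.0, mu1=1.0, sigma=0.5)
```

The appeal for continuous monitoring is visible in the loop: the test can terminate after only a handful of samples, rather than waiting for a fixed-size batch.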
Deformation of the Long Valley Caldera, California: Inferences from Measurements from 1988 to 2001
OpenEI Reference Library Journal Article.
Active Fault Segments As Potential Earthquake Sources: Inferences From Integrated Geophysical Mapping Of The Magadi Fault System, Southern Kenya Rift
OpenEI Reference Library.
A new method for multinomial inference using Dempster-Shafer...
Office of Scientific and Technical Information (OSTI)
A new method for multinomial inference is proposed by representing the cell probabilities as unordered segments on the unit interval and following Dempster-Shafer (DS) theory. The ...
Comparison of line-imaging VISAR inferences of spalled sample distension with metallographic analysis of spalled samples
Office of Scientific and Technical Information (OSTI)
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development, FY 2004 through FY 2006.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2005 through FY 2007.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2006 through FY 2008.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2011 through FY 2013.
Implementing Bayesian Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Implementing Bayesian Statistics and a Full Systematic Uncertainty Propagation with the Soft X-Ray Tomography Diagnostic on the Madison Symmetric Torus, by Jay Johnson. A thesis submitted in partial fulfillment of the requirements for the degree of Bachelor of Science (Physics) at the University of Wisconsin - Madison, 2013. Abstract: The Madison Symmetric Torus uses multiple diagnostics to measure electron temperature (T_e). The soft x-ray (SXR) diagnostic measures T_e from x-ray emission in ...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2007 through FY 2009.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2008 through FY 2010, including FY 2009 Recovery Act funding.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): discretionary summary by appropriation for Energy and Water Development and Related Agencies, FY 2010 through FY 2012.
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Lectures on probability and statistics. Revision
Yost, G.P.
1985-06-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.
International petroleum statistics report
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
Dorn, Sebastian; Enßlin, Torsten A.; Ramirez, Erandy; Kunze, Kerstin E.
2014-06-01
We present a generic inference method for inflation models from observational data by the usage of higher-order statistics of the curvature perturbation on uniform density hypersurfaces. This method is based on the calculation of the posterior for the primordial non-Gaussianity parameters f_NL and g_NL, which in general depend on specific parameters of inflation and reheating models, and makes it possible to discriminate among the still viable inflation models. To preserve analyticity as far as possible and dispense with numerically expensive sampling techniques, a saddle-point approximation is introduced whose precision is validated for a numerical toy example. The mathematical formulation is done in a generic way so that the approach remains applicable to cosmic microwave background data as well as to large scale structure data. Additionally, we review a few currently interesting inflation models and present numerical toy examples thereof in two and three dimensions to demonstrate the efficiency of the higher-order statistics method. A second quantity of interest is the primordial power spectrum. Here, we present two Bayesian methods to infer it from observational data, the so-called critical filter and an extension thereof with a smoothness prior, both allowing for a non-parametric spectrum reconstruction. These methods are able to reconstruct the spectra of the observed perturbations, and the primordial spectrum of curvature perturbations, even in the case of non-Gaussianity and partial sky coverage. We argue that observables like T- and B-modes permit measuring both spectra. This also allows one to infer the level of non-Gaussianity generated since inflation.
International petroleum statistics report
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
A Statistical Framework for Microbial Source Attribution
Velsko, S P; Allen, J E; Cunningham, C T
2009-04-28
This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node, thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks, combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them, possess certain compelling general properties in the context of microbial forensics.
These include the ability to quantify the significance of a sequence 'match' or 'mismatch' between two isolates; the ability to capture non-intuitive effects of network structure on inferential power, including the 'small world' effect; the insensitivity of inferences to uncertainties in the underlying distributions; and the concept of rescaling, i.e. ability to collapse sub-networks into single nodes and examine transmission inferences on the rescaled network.
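A toy illustration of combining the two distributions described above: if mutations per transmission step are modeled as Poisson (a simplifying assumption; the rate and the prior over separations below are hypothetical values, not from the report), Bayes' rule yields a posterior over the network separation M:

```python
import math

def p_mutations_given_steps(n_mut, m_steps, rate_per_step):
    """P(observed mutational differences | M transmission steps), taking
    mutations per step as Poisson so the total over M steps is Poisson
    with mean M * rate (an illustrative modeling assumption)."""
    lam = m_steps * rate_per_step
    return math.exp(-lam) * lam**n_mut / math.factorial(n_mut)

def posterior_steps(n_mut, prior_m, rate_per_step):
    """Posterior P(M | data) over the separation M, via Bayes' rule."""
    joint = {m: p * p_mutations_given_steps(n_mut, m, rate_per_step)
             for m, p in prior_m.items()}
    z = sum(joint.values())
    return {m: v / z for m, v in joint.items()}

# Hypothetical prior over separations and an observed mutation count
prior = {1: 0.5, 2: 0.3, 3: 0.2}
post = posterior_steps(n_mut=4, prior_m=prior, rate_per_step=2.0)
```

With four observed differences and two expected mutations per step, the posterior concentrates on a separation of two transmission events.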
Statistical physics "Beyond equilibrium"
Ecke, Robert E
2009-01-01
The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.
Brittleness and Bayesian Inference
Wallstrom, Timothy C.
2013-08-15
The Bayesian inference engine, an outsider, computer scientist's perspective
Carroll, James Lamond
2011-10-07
Metallic conductance below T{sub c} inferred by quantum interference effects in layered ...
International petroleum statistics report
1996-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.
International petroleum statistics report
1996-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report
1995-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1997-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report
1995-07-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
Angular-momentum nonclassicality by breaking classical bounds on statistics
Luis, Alfredo; Rivas, Angel
2011-10-15
We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimal, and definite conclusions are obtained without evaluating moments or any other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior, such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.
ARM - Historical Field Campaign Statistics
Inductive inference model of anomaly and misuse detection
Helman, P.
1997-01-01
Further consequences of the inductive inference model of anomaly and misuse detection are presented. The results apply to the design of both probability models for the inductive inference framework and to the design of W&S rule bases. The issues considered include: the role of misuse models M{sub A}, the selection of relevant sets of attributes and the aggregation of their values, the effect on a rule base of nonmaximal rules, and the partitioning of a set of attributes into a left hand and right hand side.
Design and performance of a scalable, parallel statistics toolkit.
Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre
2010-11-01
Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last three years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics clearer. Furthermore, it employs a design pattern specifically targeted at distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question are continuous or quasi-diffuse, but do scale well when the data are discrete and compact. We confirm and extend our earlier results by now establishing near-optimal scalability with up to 10,000 processes.
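The map-reduce pattern behind the moment-based engines can be sketched with the standard pairwise update formulas for combining partial aggregates. This is a serial simulation of the distributed computation; the chunking and function names below are illustrative, not the toolkit's API:

```python
from functools import reduce

def partial(xs):
    """Map step: per-partition count, mean, and sum of squared deviations."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs)
    return n, mean, m2

def combine(a, b):
    """Reduce step: merge two partial aggregates with the numerically
    robust pairwise update (no raw sums of squares)."""
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2

data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
chunks = [data[:2], data[2:4], data[4:]]          # stand-ins for processes
n, mean, m2 = reduce(combine, (partial(c) for c in chunks))
variance = m2 / (n - 1)
```

Because `combine` needs only three numbers per partition regardless of data size, inter-processor communication stays constant, which is what gives these statistics their near-linear scaling.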
Gregory, D.L.; Hansche, B.D.
1996-06-01
In order to support advanced manufacturing, Sandia has acquired the capability to produce plastic prototypes using stereolithography. Currently, these prototypes are used mainly to verify part geometry and for "fit and form" checks. This project investigates methods for rapidly testing these plastic prototypes, and inferring actual metal part performance and behavior from prototype test data. The performance measures examined include static load/stress response, and structural dynamic (modal) and vibration behavior. The integration of advanced non-contacting measurement techniques, including scanning laser velocimetry, laser holography, and thermoelasticity, into testing of these prototypes is described. Photoelastic properties of the epoxy prototypes, used to reveal full-field stress/strain fields, are also explored.
Web Analytics and Statistics | Department of Energy
Inference and learning in sparse systems with multiple states
Braunstein, A.; Ramezanpour, A.; Zhang, P.; Zecchina, R.
2011-05-15
We discuss how inference can be performed when data are sampled from the nonergodic phase of systems with multiple attractors. We take as a model system the finite-connectivity Hopfield model in the memory phase and suggest a cavity method approach to reconstruct the couplings when the data are separately sampled from a few attractor states. We also show how the inference results can be converted into a learning protocol for neural networks in which patterns are presented through weak external fields. The protocol is simple and fully local, and it is able to store patterns with a finite overlap with the input patterns without ever reaching a spin-glass phase where all memories are lost.
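For context, a minimal forward Hopfield construction and retrieval sketch. Note that the paper's contribution is the inverse problem (inferring the couplings from sampled attractor states via a cavity method); the standard Hebbian construction below only shows the model being inverted:

```python
def hebbian_couplings(patterns):
    """Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu for a
    Hopfield network over +/-1 patterns (standard construction)."""
    n = len(patterns[0])
    j = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for k in range(n):
                if i != k:
                    j[i][k] += p[i] * p[k] / n
    return j

def recall(j, state, sweeps=5):
    """Zero-temperature asynchronous dynamics toward an attractor."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            field = sum(j[i][k] * s[k] for k in range(len(s)))
            s[i] = 1 if field >= 0 else -1
    return s

pattern = [1, 1, 1, 1, -1, -1, -1, -1]
corrupted = [-1] + pattern[1:]          # flip one spin
recovered = recall(hebbian_couplings([pattern]), corrupted)
```

Running the dynamics from the corrupted state returns the stored pattern, illustrating the attractor structure from which the paper's inference procedure samples.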
Statistical analysis of random duration times
Engelhardt, M.E.
1996-04-01
This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, the reliability function, and the cumulative hazard function. These results can be applied with either complete or censored data. Several models commonly used with time data are presented, and methods for model checking and goodness-of-fit tests are discussed. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations with repeated durations, such as repairable systems, are also discussed.
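A minimal sketch of a nonparametric reliability estimate of the kind described above, here the Kaplan-Meier product-limit estimator, under the simplifying assumption of distinct observation times:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the reliability (survival) function from
    right-censored durations; events[i] is 1 for an observed failure and
    0 for a censored observation. Assumes distinct times for simplicity."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    for i in order:
        if events[i]:                     # failure: multiply by surviving fraction
            s *= (at_risk - 1) / at_risk
            curve.append((times[i], s))
        at_risk -= 1                      # failed or censored items leave the risk set
    return curve

# Four units; the third duration is censored (still running when observed)
curve = kaplan_meier([1.0, 2.0, 3.0, 4.0], [1, 1, 0, 1])
```

The censored observation at time 3 contributes to the risk set before it is withdrawn but causes no step in the curve, which is exactly how censoring enters these estimates.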
Testing Statistical Cloud Scheme Ideas in the GFDL Climate Model
Key China Energy Statistics 2012
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-05-01
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Key China Energy Statistics 2011
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-01-15
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-06-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
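The statistics derived from a contingency table can be sketched serially. This hedged example builds the table and reads off the {chi}{sup 2} independence statistic and point-wise mutual information; it is a plain single-process analogue, not the paper's parallel implementation (note that zero-count cells still contribute their expected count to {chi}{sup 2}):

```python
import math
from collections import Counter

def contingency_stats(pairs):
    """Contingency table over (x, y) pairs, plus the chi^2 independence
    statistic and point-wise mutual information derived from it."""
    table = Counter(pairs)
    n = len(pairs)
    rows = Counter(x for x, _ in pairs)
    cols = Counter(y for _, y in pairs)
    chi2 = 0.0
    pmi = {}
    for x in rows:
        for y in cols:
            expected = rows[x] * cols[y] / n
            observed = table.get((x, y), 0)
            chi2 += (observed - expected) ** 2 / expected
            if observed:                      # PMI defined only for observed cells
                pmi[(x, y)] = math.log(observed * n / (rows[x] * cols[y]))
    return chi2, pmi

# Perfectly dependent toy data: chi^2 attains its maximum, the sample size n
chi2, pmi = contingency_stats([("a", "a")] * 5 + [("b", "b")] * 5)
```

Because the table itself grows with the number of distinct (x, y) values, the communication cost discussed in the abstract scales with data diversity, unlike the fixed-size aggregates of moment-based statistics.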
Quantum Mechanics Without Statistical Postulates
Geiger, G.; et al.
2000-11-01
The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way without a reduction postulate. Due to the chaotic motion of the hidden classical particle, all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single-system theory.
Ideas for Effective Communication of Statistical Results
Anderson-Cook, Christine M.
2015-03-01
Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.
VTPI-Transportation Statistics | Open Energy Information
IEA Energy Statistics | Open Energy Information
ORISE: Statistical Analyses of Worker Health
Statistical Performance Evaluation of Spring Operated Pressure Relief Valve Reliability Improvements 2004 to 2014
Statistically significant relational data mining
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs, by developing better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model of the kind statisticians favor for graph analysis; this new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Inferring the gravitational potential of the Milky Way with a few precisely measured stars
Price-Whelan, Adrian M.; Johnston, Kathryn V.; Hendel, David; Hogg, David W.
2014-10-10
The dark matter halo of the Milky Way is expected to be triaxial and filled with substructure. It is hoped that streams or shells of stars produced by tidal disruption of stellar systems will provide precise measures of the gravitational potential to test these predictions. We develop a method for inferring the Galactic potential with tidal streams based on the idea that the stream stars were once close in phase space. Our method can flexibly adapt to any form for the Galactic potential: it works in phase-space rather than action-space and hence relies neither on our ability to derive actions nor on the integrability of the potential. Our model is probabilistic, with a likelihood function and priors on the parameters. The method can properly account for finite observational uncertainties and missing data dimensions. We test our method on synthetic data sets generated from N-body simulations of satellite disruption in a static, multi-component Milky Way, including a triaxial dark matter halo with observational uncertainties chosen to mimic current and near-future surveys of various stars. We find that with just eight well-measured stream stars, we can infer properties of a triaxial potential with precisions of the order of 5%-7%. Without proper motions, we obtain 10% constraints on most potential parameters and precisions around 5%-10% for recovering missing phase-space coordinates. These results are encouraging for the goal of using flexible, time-dependent potential models combined with larger data sets to unravel the detailed shape of the dark matter distribution around the Milky Way.
Moore honored with American Statistical Association award
Lisa Moore is the recipient of the 2013 Don Owen Award, presented by the San Antonio Chapter of the American Statistical Association (ASA), May 24, 2013. The ASA, founded in Massachusetts in 1839, is the world's largest community of statisticians.
Moore named an American Statistical Society Fellow
The ASA inducted Leslie (Lisa) Moore as a Fellow at the 2014 Joint Statistical Meetings, October 8, 2014. The ASA cited Moore for "seminal and creative research on the design of computer experiments; for statistical collaboration on a wide range of problems of scientific and national importance; and for mentoring statisticians and statistical ..."
Office of Survey Development and Statistical Integration
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system: its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system's performance from the user's perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation's residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people's access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns?
How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
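The auto-correlative statistic at the heart of that engine is, in essence, the lag-k sample autocorrelation of a series. As a language-neutral sketch of the computation (illustrative only, not the VTK C++ API the report describes):

```python
def autocorrelation(x, lag):
    """Lag-k sample autocorrelation of a sequence x.

    A serial sketch of the statistic such an engine computes;
    the actual VTK engines operate on vtkTable columns in C++.
    """
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[i] - mean) * (x[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# A signal of period 4 correlates perfectly with itself at lag 4.
signal = [0.0, 1.0, 0.0, -1.0] * 8
print(autocorrelation(signal, 4))  # → 1.0
```

A parallel version would compute the per-process sums and merge them, which is the design concern the report addresses.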
A new method for multinomial inference using Dempster-Shafer theory
Office of Scientific and Technical Information (OSTI)
A new method for multinomial inference is proposed by representing the cell probabilities as unordered segments on the unit interval and following Dempster-Shafer (DS) theory. The resulting DS posterior is then strengthened to improve symmetry and learning properties.
Tomography and weak lensing statistics
Munshi, Dipak; Coles, Peter (E-mail: peter.coles@astro.cf.ac.uk); Kilbinger, Martin
2014-04-01
We provide generic predictions for the lower order cumulants of weak lensing maps, and their correlators for tomographic bins as well as in three dimensions (3D). Using the small-angle approximation, we derive the corresponding one- and two-point probability distribution functions for the tomographic maps from different bins and for 3D convergence maps. The modelling of weak lensing statistics is obtained by adopting a detailed prescription for the underlying density contrast that involves the hierarchical ansatz and the lognormal distribution. We study the dependence of our results on cosmological parameters and source distributions corresponding to realistic surveys such as LSST and DES. We briefly outline how photometric redshift information can be incorporated in our results. We also show how topological properties of convergence maps can be quantified using our results.
Statistics and Discoveries at the LHC (3/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (1/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
STORM: A STatistical Object Representation Model
Rafanelli, M. ); Shoshani, A. )
1989-11-01
In this paper we explore the structure and semantic properties of the entities stored in statistical databases. We call such entities 'statistical objects' (SOs) and propose a new 'statistical object representation model,' based on a graph representation. We identify a number of SO representational problems in current models and propose a methodology for their solution. 11 refs.
Energy Science and Technology Software Center (OSTI)
1992-02-20
SENSIT, MUSIG, and COMSEN are a set of three related programs for sensitivity test analysis. SENSIT conducts sensitivity tests. These tests are also known as threshold tests, LD50 tests, gap tests, drop weight tests, etc. SENSIT interactively instructs the experimenter on the proper level at which to stress the next specimen, based on the results of previous responses. MUSIG analyzes the results of a sensitivity test to determine the mean and standard deviation of the underlying population by computing maximum likelihood estimates of these parameters. MUSIG also computes likelihood ratio joint confidence regions and individual confidence intervals. COMSEN compares the results of two sensitivity tests to see if the underlying populations are significantly different. COMSEN provides an unbiased method of distinguishing between statistical variation of the estimates of the parameters of the population and true population difference.
FRAMES Software System: Linking to the Statistical Package R
Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.
2006-12-11
This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.
Inferring Magnetospheric Heavy Ion Density using EMIC Waves
Kim, Eun-Hwa; Johnson, Jay R.; Kim, Hyomin; Lee, Dong-Hun
2014-05-01
We present a method to infer heavy ion concentration ratios from EMIC wave observations that result from ion-ion hybrid (IIH) resonance. A key feature of the ion-ion hybrid resonance is the concentration of wave energy in a field-aligned resonant mode that exhibits linear polarization. This mode-converted wave is localized at the location where the frequency of a compressional wave driver matches the IIH resonance condition, which depends sensitively on the heavy ion concentration. This dependence makes it possible to estimate the heavy ion concentration ratio. In this letter, we evaluate the absorption coefficients at the IIH resonance at Earth's geosynchronous orbit for variable concentrations of He+ and field-aligned wave numbers using a dipole magnetic field. Although wave absorption occurs for a wide range of heavy ion concentrations, it only occurs for a limited range of field-aligned wave numbers such that the IIH resonance frequency is close to, but not exactly the same as, the crossover frequency. Using the wave absorption and observed EMIC waves from the GOES-12 satellite, we demonstrate how this technique can be used to estimate that the He+ concentration is around 4% near L = 6.6.
Predicting weak lensing statistics from halo mass reconstructions - Final Paper
Everett, Spencer
2015-08-20
As dark matter does not absorb or emit light, its distribution in the universe must be inferred through indirect effects such as the gravitational lensing of distant galaxies. While most sources are only weakly lensed, the systematic alignment of background galaxies around a foreground lens can constrain the mass of the lens, which is largely in the form of dark matter. In this paper, I have implemented a framework to reconstruct all of the mass along lines of sight using a best-case dark matter halo model in which the halo mass is known. This framework is then used to make predictions of the weak lensing of 3,240 generated source galaxies through a 324 arcmin field of the Millennium Simulation. The lensed source ellipticities are characterized by the ellipticity-ellipticity and galaxy-mass correlation functions and compared to the same statistic for the intrinsic and ray-traced ellipticities. In the ellipticity-ellipticity correlation function, I find that the framework systematically underpredicts the shear power by an average factor of 2.2 and fails to capture correlation from dark matter structure at scales larger than 1 arcminute. The model-predicted galaxy-mass correlation function is in agreement with the ray-traced statistic from scales 0.2 to 0.7 arcminutes, but systematically underpredicts shear power at scales larger than 0.7 arcminutes by an average factor of 1.2. Optimization of the framework code has reduced the mean CPU time per lensing prediction by 70% to 24 ± 5 ms. Physical and computational shortcomings of the framework are discussed, as well as potential improvements for upcoming work.
Statistics for characterizing data on the periphery
Theiler, James P; Hush, Donald R
2010-01-01
We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
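For context, the conventional detector that such peripheral statistics improve on is the Mahalanobis distance under the sample covariance: points far outside the fitted ellipsoid are flagged as anomalous. A minimal NumPy sketch of that baseline (function and variable names are mine, not the authors'):

```python
import numpy as np

def mahalanobis_sq(samples, x):
    """Squared Mahalanobis distance of point x from the sample distribution.

    Baseline ellipsoidal detector built on the ordinary sample covariance;
    the paper's point is that alternative covariance estimates model the
    periphery of the distribution more effectively for target detection.
    """
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    d = x - mu
    return float(d @ np.linalg.inv(cov) @ d)

rng = np.random.default_rng(0)
background = rng.normal(size=(1000, 3))               # nominal data cloud
print(mahalanobis_sq(background, np.zeros(3)))        # small: near the center
print(mahalanobis_sq(background, np.full(3, 5.0)))    # large: peripheral point
```

Thresholding this distance gives the classic ellipsoidal anomaly detector; the alternatives the authors recommend replace the sample covariance while keeping this overall structure.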
Structure Learning and Statistical Estimation in Distribution...
Office of Scientific and Technical Information (OSTI)
Part I of this paper discusses the problem of learning the operational structure of the ...
Statistical methods for nuclear material management
Bowen, W.M.; Bennett, C.A.
1988-12-01
This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.
Moore honored with American Statistical Association award
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Before his death in 1991, Professor Owen was the Distinguished Professor of Statistics at Southern Methodist University in Dallas, Texas. His illustrious career serves as the ...
ANNUAL FEDERAL EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT OF DISCRIMINATION COMPLAINTS
National Nuclear Security Administration (NNSA)
Statistical report of discrimination complaints (EEOC Form 462); the reporting period begins October 1st.
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014
Office of Scientific and Technical Information (OSTI)
Statistics for Industry Groups and Industries, 2003
2009-01-18
Statistics for the U.S. Department of Commerce including types of manufacturing, employees, and products as outlined in the Annual Survey of Manufacturers (ASM).
Bayesian Inference in Probabilistic Risk Assessment -- The Current State of the Art
Dana L. Kelly; Curtis L. Smith
2009-02-01
Markov chain Monte Carlo approaches to sampling directly from the joint posterior distribution of aleatory model parameters have led to tremendous advances in Bayesian inference capability in a wide variety of fields, including probabilistic risk analysis. The advent of freely available software coupled with inexpensive computing power has catalyzed this advance. This paper examines where the risk assessment community is with respect to implementing modern computational-based Bayesian approaches to inference. Through a series of examples in different topical areas, it introduces salient concepts and illustrates the practical application of Bayesian inference via Markov chain Monte Carlo sampling to a variety of important problems.
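As a toy illustration of the approach the paper surveys (not one of its examples), here is a random-walk Metropolis sampler for the posterior of a binomial failure probability with a uniform prior, so the exact Beta(4, 18) posterior is available as a check; the data values are hypothetical:

```python
import math
import random

def metropolis_binomial(failures, demands, steps=50_000, seed=1):
    """Random-walk Metropolis sampling of p from the posterior for
    binomial data under a uniform prior.

    Toy sketch of MCMC-based Bayesian inference; real PRA studies use
    richer aleatory models and dedicated tools.
    """
    rng = random.Random(seed)

    def log_post(p):
        # Log posterior up to a constant: binomial likelihood, flat prior.
        if not 0.0 < p < 1.0:
            return float("-inf")
        return failures * math.log(p) + (demands - failures) * math.log(1.0 - p)

    p, samples = 0.5, []
    for _ in range(steps):
        prop = p + rng.gauss(0.0, 0.1)      # random-walk proposal
        if math.log(rng.random()) < log_post(prop) - log_post(p):
            p = prop                         # accept
        samples.append(p)
    return samples[steps // 10:]             # discard burn-in

# Hypothetical data: 3 failures in 20 demands.
samples = metropolis_binomial(3, 20)
print(sum(samples) / len(samples))  # near the Beta(4, 18) mean, 4/22 ≈ 0.18
```

The agreement of the sampled mean with the conjugate analytic answer is the usual sanity check before applying the same machinery to models with no closed-form posterior.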
Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials
J. J. Einerson
2005-05-01
Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
Detailed Monthly and Annual LNG Import Statistics (2004-2012)
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Federal offshore statistics: leasing - exploration - production - revenue
Essertier, E.P.
1984-01-01
Federal Offshore Statistics is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the Federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Statistics are presented under the following topics: (1) highlights, (2) leasing, (3) exploration and development, (4) production and revenue, (5) federal offshore production by ranking operator, 1983, (6) reserves and undiscovered recoverable resources, and (7) oil pollution in the world's oceans.
Marzouk, Youssef; Fast P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M.; Ray, J. P.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error: situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
In the OSTI Collections: Bayesian Inference
Office of Scientific and Technical Information (OSTI)
An overview covering specific uses of Bayes' theorem, conceptual ramifications of Bayes' theorem, and the meaning of Bayesian inference, with references to research organizations' reports available through OSTI's SciTech Connect and E-print Network and lectures available through OSTI's ScienceCinema. The piece opens with an example: Mr. Smith walks with his dog every morning; those neighbors who can see the street where they walk are used to seeing them around 7:30 am.
Is Bayesian inference "brittle"?
Wallstrom, Timothy C.; Higdon, David M. (Los Alamos National Laboratory)
2013-08-15
Technical report LA-UR-13-26482, Los Alamos National Laboratory (LANL); OSTI Identifier 1090693; DOE Contract AC52-06NA25396; sponsoring organization DOE/LANL.
Comparison of line-imaging VISAR inferences of spalled sample distension with metallographic analysis of spalled samples
Furnish, Michael David; Bingert, John F.; Gray, George T., III
2010-06-01
No abstract prepared.
Topology for statistical modeling of petascale data.
Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice
2011-07-01
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
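The abstract does not define the three statistics; purely as an illustration of comparing a simulated irradiance series against an observed one by summary criteria (the particular choices below are mine, not the report's), one might compare the mean level, the variability, and the step-to-step ramping:

```python
import statistics

def series_criteria(series):
    """Simple summary criteria for an irradiance-like time series:
    mean level, variability, and mean absolute ramp (step-to-step change).

    Illustrative choices only; the report defines and justifies its own
    three characterizing statistics.
    """
    ramps = [b - a for a, b in zip(series, series[1:])]
    return {
        "mean": statistics.fmean(series),
        "stdev": statistics.pstdev(series),
        "mean_abs_ramp": statistics.fmean(abs(r) for r in ramps),
    }

observed = [0.0, 250.0, 600.0, 850.0, 900.0, 700.0, 300.0, 50.0]   # made-up W/m^2
simulated = [0.0, 300.0, 550.0, 800.0, 950.0, 650.0, 350.0, 0.0]   # made-up W/m^2
print(series_criteria(observed))
print(series_criteria(simulated))
```

Validating a simulation then amounts to checking that such criteria, chosen for the analysis at hand, agree between the simulated and observed series.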
FY 2015 Statistical Table by Appropriation
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical table by appropriation (dollars in thousands, OMB scoring), covering FY 2013 current, FY 2014 enacted, adjusted, and current, and FY 2015 congressional request appropriations for Energy and Water Development and Related Agencies; for example, Energy Efficiency and Renewable Energy: $1,691,757 (FY 2013 current) and $1,900,641 (FY 2014 enacted).
FY 2015 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical table by organization (dollars in thousands, OMB scoring), covering FY 2013 current, FY 2014 enacted, adjusted, and current, and FY 2015 congressional request appropriations; for example, NNSA Weapons Activities: $6,966,855 (FY 2013 current), $7,781,000 (FY 2014 enacted), and $8,314,902 (FY 2015 request).
ORISE: Statistical Analyses of Worker Health
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Analyses Statistical analyses at the Oak Ridge Institute for Science and Education (ORISE) support ongoing programs involving medical surveillance of workers and other populations, as well as occupational epidemiology and research. ORISE emphasizes insightful and accurate analysis, practical interpretation of results and clear, easily read reports. All analyses are preceded by extensive data scrubbing and verification. ORISE's approach relies on applying appropriate methods of
Combined statistical and dynamical assessment of simulated vegetation-rainfall in North Africa during the mid-Holocene
Office of Scientific and Technical Information (OSTI)
A negative feedback of vegetation cover on subsequent annual precipitation is simulated for the mid-Holocene over North Africa.
USING CORONAL CELLS TO INFER THE MAGNETIC FIELD STRUCTURE AND CHIRALITY OF FILAMENT CHANNELS
Sheeley, N. R. Jr.; Warren, H. P.; Martin, S. F.; Panasenco, O.
2013-08-01
Coronal cells are visible at temperatures of ~1.2 MK in Fe XII coronal images obtained from the Solar Dynamics Observatory and Solar Terrestrial Relations Observatory spacecraft. We show that near a filament channel, the plumelike tails of these cells bend horizontally in opposite directions on the two sides of the channel like fibrils in the chromosphere. Because the cells are rooted in magnetic flux concentrations of majority polarity, these observations can be used with photospheric magnetograms to infer the direction of the horizontal field in filament channels and the chirality of the associated magnetic field. This method is similar to the procedure for inferring the direction of the magnetic field and the chirality of the fibril pattern in filament channels from Hα observations. However, the coronal cell observations are easier to use and provide clear inferences of the horizontal field direction for heights up to ~50 Mm into the corona.
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1983-01-01
The statistics in this update of the Outer Continental Shelf Statistics publication document what has happened since federal leasing began on the Outer Continental Shelf (OCS) in 1954. Highlights note that of the 29.8 million acres actually leased from 175.6 million acres offered for leasing, 20.1% were in frontier areas. Total revenues for the 1954-1982 period were $58.9 billion with about 13% received in 1982. The book is divided into six parts covering highlights, leasing, exploration and development, production and revenue, reserves and undiscovered recoverable resources, and pollution problems from well and tanker accidents. 5 figures, 59 tables.
al-Saffar, Sinan; Joslyn, Cliff A.; Chappell, Alan R.
2011-07-18
As semantic datasets grow to be very large and divergent, there is a need to identify and exploit their inherent semantic structure for discovery and optimization. Towards that end, we present here a novel methodology to identify the semantic structures inherent in an arbitrary semantic graph dataset. We first present the concept of an extant ontology as a statistical description of the semantic relations present amongst the typed entities modeled in the graph. This serves as a model of the underlying semantic structure to aid in discovery and visualization. We then describe a method of ontological scaling in which the ontology is employed as a hierarchical scaling filter to infer different resolution levels at which the graph structures are to be viewed or analyzed. We illustrate these methods on three large and publicly available semantic datasets containing more than one billion edges each. Keywords: Semantic Web; Visualization; Ontology; Multi-resolution Data Mining.
Statistical modeling support for calibration of a multiphysics model of subcooled boiling flows
Bui, A. V.; Dinh, N. T.; Nourgaliev, R. R.; Williams, B. J.
2013-07-01
Nuclear reactor system analyses rely on multiple complex models which describe the physics of reactor neutronics, thermal hydraulics, structural mechanics, coolant physico-chemistry, etc. Such coupled multiphysics models require extensive calibration and validation before they can be used in practical system safety study and/or design/technology optimization. This paper presents an application of statistical modeling and Bayesian inference in calibrating an example multiphysics model of subcooled boiling flows which is widely used in reactor thermal hydraulic analysis. The presence of complex coupling of physics in such a model together with the large number of model inputs, parameters and multidimensional outputs poses significant challenge to the model calibration method. However, the method proposed in this work is shown to be able to overcome these difficulties while allowing data (observation) uncertainty and model inadequacy to be taken into consideration. (authors)
Picard, R.R.
1987-01-01
Many aspects of the MUF-D statistic, used for verification of accountability data, have been examined in the safeguards literature. In this paper, basic MUF-D results are extended to more general environments than are usually considered. These environments include arbitrary measurement error structures, various sampling regimes that could be imposed by the inspectorate, and the attributes/variables framework.
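The accounting quantities behind the MUF-D statistic can be illustrated as follows. All inventory values, sample sizes, and the simple extrapolation of operator-inspector differences are hypothetical, and the sketch ignores the measurement-error covariance structures the paper generalizes:

```python
# Illustrative book-keeping values (kg, all hypothetical)
beginning_inventory = 100.0
receipts = 40.0
shipments = 30.0
ending_inventory = 109.2

# Material unaccounted for: book inventory minus physical inventory
muf = beginning_inventory + receipts - shipments - ending_inventory

# Paired operator/inspector measurements on a verification sample
operator  = [10.0, 9.8, 10.1, 10.3]
inspector = [ 9.9, 9.9, 10.0, 10.1]
n_sample, n_total = len(operator), 20   # inspector re-measured 4 of 20 items

# D statistic: operator-minus-inspector difference, extrapolated to the stratum
d = (n_total / n_sample) * sum(o - i for o, i in zip(operator, inspector))

muf_d = muf - d   # the MUF-D statistic, tested for significance against its variance
```

In practice the variance of MUF-D under the assumed error structure determines the alarm threshold; the paper's contribution is extending that variance analysis to arbitrary error structures and sampling regimes.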
Statistics of dislocation pinning at localized obstacles
Dutta, A.; Bhattacharya, M.; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at a fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process, which comprises a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach in which the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron-irradiated type 316 stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
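Assuming an ideal Poisson field of monodisperse spherical obstacles (a far simpler setting than the generalized size distributions treated above), the chance that a gliding dislocation is pinned at least once can be estimated geometrically:

```python
import math

def pinning_probability(number_density, radius, swept_area):
    # Obstacles whose centres lie within one radius of the glide plane
    # intersect it, giving an areal density of 2 * radius * number_density;
    # for a Poisson field, P(at least one pin) = 1 - exp(-mean count).
    mean_pins = 2.0 * radius * number_density * swept_area
    return 1.0 - math.exp(-mean_pins)

# Hypothetical irradiated-steel numbers: 1e23 voids/m^3 of 1 nm radius,
# a dislocation sweeping a 100 nm x 100 nm glide-plane patch
p = pinning_probability(1e23, 1e-9, (100e-9) ** 2)
```

Here the mean pin count is 2, so the pinning probability is 1 - e^-2; the paper's framework replaces the single radius with a full obstacle size distribution.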
Baseballs and Barrels: World Statistics Day
Broader source: Energy.gov [DOE]
Statistics don’t just help us answer trivia questions – they also help us make intelligent decisions. For example, if I heat my home with natural gas, I’m probably interested in what natural gas prices are likely to be this winter.
Multifragmentation: New dynamics or old statistics?
Moretto, L.G.; Delis, D.N.; Wozniak, G.J.
1993-10-01
The understanding of the fission process as it has developed over the last fifty years has been applied to multifragmentation. Two salient aspects have been discovered: (1) a strong decoupling of the entrance and exit channels, with the formation of well-characterized sources; and (2) a statistical competition among two-, three-, four-, five-, ..., n-body decays.
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles
Prohaska, Robert; Duran, Adam; Ragatz, Adam; Kelly, Kenneth
2015-05-03
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy's (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed.
Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to medium duty EVs.
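The kind of duty-cycle statistics described above can be illustrated on a toy speed trace. The 1 Hz trace and the handful of metrics below are hypothetical stand-ins for the NREL analysis, which covers many more kinematic parameters:

```python
# Hypothetical 1 Hz speed trace (m/s) for one short delivery trip
speed = [0, 2, 5, 9, 12, 12, 12, 8, 4, 0, 0, 3, 7, 7, 2, 0]
dt = 1.0

distance = sum(v * dt for v in speed)                      # metres (rectangle rule)
accel = [(b - a) / dt for a, b in zip(speed, speed[1:])]   # m/s^2
max_accel, max_decel = max(accel), min(accel)
moving = [v for v in speed if v > 0]
avg_moving_speed = sum(moving) / len(moving)               # excludes idle samples
stops = sum(1 for a, b in zip(speed, speed[1:]) if a > 0 and b == 0)
```

Statistics like these, aggregated over thousands of operating days, are what allow representative chassis dynamometer cycles to be selected or synthesized.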
Energy Science and Technology Software Center (OSTI)
2007-08-31
Lustre-tests is a package of regression tests for the Lustre file system containing I/O workloads representative of problems discovered on production systems.
Bayesian Inference for Time Trends in Parameter Values using Weighted Evidence Sets
D. L. Kelly; A. Malkhasyan
2010-09-01
There is a nearly ubiquitous assumption in PSA that parameter values are at least piecewise-constant in time. As a result, Bayesian inference tends to incorporate many years of plant operation, over which there have been significant changes in plant operational and maintenance practices, plant management, etc. These changes can cause significant changes in parameter values over time; however, failure to perform Bayesian inference in the proper time-dependent framework can mask these changes. Failure to question the assumption of constant parameter values, and failure to perform Bayesian inference in the proper time-dependent framework, were noted as important issues in NUREG/CR-6813, performed for the U.S. Nuclear Regulatory Commission's Advisory Committee on Reactor Safeguards in 2003. That report noted that industry lacks tools to perform time-trend analysis with Bayesian updating. This paper describes an application of time-dependent Bayesian inference methods developed for the European Commission Ageing PSA Network. These methods utilize open-source software, implementing Markov chain Monte Carlo sampling. The paper also illustrates an approach to incorporating multiple sources of data via applicability weighting factors that address differences in key influences, such as vendor, component boundaries, conditions of the operating environment, etc.
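A minimal sketch of time-trend Bayesian inference of the kind described: a Poisson event rate with a log-linear trend, sampled with random-walk Metropolis. The paper's application uses open-source MCMC software; the data, flat priors, and tuning constants below are illustrative assumptions only:

```python
import math, random

random.seed(1)

# Hypothetical yearly event counts with equal exposure, trending upward
years = list(range(6))
counts = [1, 2, 4, 8, 16, 30]

def log_post(a, b):
    # Poisson likelihood with log-linear rate exp(a + b*t); flat priors on a, b
    lp = 0.0
    for t, k in zip(years, counts):
        lam = math.exp(a + b * t)
        lp += k * (a + b * t) - lam
    return lp

a, b, b_chain = 0.0, 0.0, []
for _ in range(20000):
    pa, pb = a + random.gauss(0.0, 0.1), b + random.gauss(0.0, 0.05)
    if math.log(random.random()) < log_post(pa, pb) - log_post(a, b):
        a, b = pa, pb
    b_chain.append(b)

trend = sum(b_chain[5000:]) / len(b_chain[5000:])   # posterior mean slope
```

A posterior for the slope concentrated away from zero is the Bayesian evidence against the piecewise-constant-parameter assumption criticized above.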
Topological Cacti: Visualizing Contour-based Statistics
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2011-05-26
Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology-often represented in form of a contour tree-have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as width of the cactus trunk and length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
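The per-contour statistics idea can be illustrated in one dimension, where each superlevel-set component is a single 'contour'. This sketch handles only a 1-D field; the paper works with full contour trees of higher-dimensional fields:

```python
def contour_stats(field, level):
    # Each maximal run of samples with value >= level is one 'contour'
    # (superlevel-set component); record its size and maximum value.
    stats, run = [], []
    for v in field + [float("-inf")]:   # sentinel flushes the final run
        if v >= level:
            run.append(v)
        elif run:
            stats.append({"size": len(run), "max": max(run)})
            run = []
    return stats

field = [0, 1, 3, 2, 0, 0, 4, 5, 1, 0]   # hypothetical 1-D scalar field
per_contour = contour_stats(field, level=2)
```

In the topological cacti metaphor, quantities like `size` (a stand-in for volume) would map to trunk width, and other per-contour measures to spike length.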
Statistical approach to nuclear level density
Sen'kov, R. A.; Horoi, M.; Zelevinsky, V. G.
2014-10-15
We discuss the level density in a finite many-body system with strong interaction between the constituents. Our primary object of applications is the atomic nucleus, but the same techniques can be applied to other mesoscopic systems. We calculate and compare nuclear level densities for given quantum numbers obtained by different methods: the nuclear shell model (the most successful microscopic approach), our main instrument, the moments method (a statistical approach), and the Fermi-gas model; the calculation with the moments method can use any shell-model Hamiltonian, excluding the spurious states of the center-of-mass motion. Our goal is to investigate statistical properties of nuclear level density, define its phenomenological parameters, and offer an affordable and reliable way of calculation.
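The simplest instance of a moments-based level density is a Gaussian fixed by the first two spectral moments. The toy spectrum below is hypothetical, and the sketch omits the quantum-number decomposition and spurious-state treatment discussed above:

```python
import math

def gaussian_level_density(eigenvalues):
    # Gaussian fixed by the first two moments (centroid and width),
    # normalized to the total number of states
    n = len(eigenvalues)
    mu = sum(eigenvalues) / n
    var = sum((e - mu) ** 2 for e in eigenvalues) / n
    return lambda energy: (n / math.sqrt(2 * math.pi * var)
                           * math.exp(-(energy - mu) ** 2 / (2 * var)))

# Hypothetical toy spectrum (arbitrary units)
levels = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
rho = gaussian_level_density(levels)
```

The practical appeal of the moments method is that these moments can be computed directly from the Hamiltonian without full diagonalization, which is what makes it affordable for large shell-model spaces.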
Robust statistical reconstruction for charged particle tomography
2013-10-08
Systems and methods for charged particle detection including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data to determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of object volume scattering density using expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
Mathematical and Statistical Opportunities in Cyber Security
Office of Scientific and Technical Information (OSTI)
Meza, Juan; Campbell, Scott; Bailey, David
The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question "What fundamental problems exist within cyber security research that can be helped by advanced
FY 2011 Statistical Table by Appropriation
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring). Discretionary summary by appropriation, Energy and Water Development, and Related Agencies; Energy Programs: Energy efficiency and renewable energy: FY 2009 Current Appropriation 2,156,865; FY 2009 Recovery 16,771,907; FY 2010 Current 2,242,500; FY 2011 Request 2,355,473 (+112,973, +5.0%). Electricity delivery and energy
FY 2017 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Organization (dollars in thousands, OMB scoring), FY 2017 Congressional Budget Justification. Discretionary summary by organization, National Nuclear Security Administration: Weapons Activities: FY 2015 Enacted 8,180,359; FY 2015 Current 8,180,609; FY 2016 Enacted 8,846,948; FY 2017 Request 9,243,147 (+396,199, +4.5%). Defense Nuclear
Statistical design of a uranium corrosion experiment
Wendelberger, Joanne R; Moore, Leslie M
2009-01-01
This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1984-09-01
This publication is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Some of the highlights are: of the 329.5 million acres offered for leasing, 37.1 million acres were actually leased; total revenues for the 1954 to 1983 period were $68,173,112,563 and for 1983 $9,161,435,540; a total of 22,095 wells were drilled in federal waters and 10,145 wells were drilled in state waters; from 1954 through 1983, federal offshore areas produced 6.4 billion barrels of oil and condensate, and 62.1 trillion cubic feet of natural gas; in 1983 alone production was 340.7 million barrels of oil and condensate, and 3.9 trillion cubic feet of gas; and for the second straight year, no oil was lost in 1983 as a result of blowouts in federal waters. 8 figures, 66 tables.
Weatherization Assistance Program - Background Data and Statistics
Eisenberg, Joel Fred
2010-03-01
This technical memorandum is intended to provide readers with information that may be useful in understanding the purposes, performance, and outcomes of the Department of Energy's (DOE's) Weatherization Assistance Program (Weatherization). Weatherization has been in operation for over thirty years and is the nation's largest single residential energy efficiency program. Its primary purpose, established by law, is 'to increase the energy efficiency of dwellings owned or occupied by low-income persons, reduce their total residential energy expenditures, and improve their health and safety, especially low-income persons who are particularly vulnerable such as the elderly, the handicapped, and children.' The American Recovery and Reinvestment Act, P.L. 111-5 (ARRA), passed and signed into law in February 2009, committed $5 billion over two years to an expanded Weatherization Assistance Program. This has created substantial interest in the program, the population it serves, the energy and cost savings it produces, and its cost-effectiveness. This memorandum is intended to address the need for this kind of information. Statistically valid answers to many of the questions surrounding Weatherization and its performance require comprehensive evaluation of the program. DOE is undertaking precisely this kind of independent evaluation in order to ascertain program effectiveness and to improve its performance. Results of this evaluation effort will begin to emerge in late 2010 and 2011, but they require substantial time and effort. In the meantime, the data and statistics in this memorandum can provide reasonable and transparent estimates of key program characteristics. The memorandum is laid out in three sections. The first deals with some key characteristics describing low-income energy consumption and expenditures. The second section provides estimates of energy savings and energy bill reductions that the program can reasonably be presumed to be producing.
The third section deals with estimates of program cost-effectiveness and societal impacts such as carbon reduction and reduced national energy consumption. Each of the sections is brief, containing statistics, explanatory graphics and tables as appropriate, and short explanations of the statistics in order to place them in context for the reader. The companion appendices at the back of the memorandum explain the methods and sources used in developing the statistics.
Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data
Office of Scientific and Technical Information (OSTI)
Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics
Statistical simulation of the magnetorotational dynamo
Squire, J.; Bhattacharjee, A.
2014-08-01
We analyze turbulence and dynamo induced by the magnetorotational instability (MRI) using quasi-linear statistical simulation methods. We find that homogeneous turbulence is unstable to a large-scale dynamo instability, which saturates to an inhomogeneous equilibrium with a very strong dependence on the magnetic Prandtl number (Pm). Despite its enormously reduced nonlinearity, the quasi-linear model exhibits the same qualitative scaling of angular momentum transport with Pm as fully nonlinear turbulence. This demonstrates the relationship of recent convergence problems to the large-scale dynamo and suggests possible methods for studying astrophysically relevant regimes at very low or high Pm.
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which trigger cascading failures of loads, generators, and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
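The flavor of a cascade model can be conveyed with a deliberately oversimplified redistribution rule, where a failed line's flow is shared equally over the surviving lines rather than recomputed with the DC power flow the paper uses. Flows and capacities below are hypothetical:

```python
def cascade(flows, capacities, initial_failure):
    # Toy cascade: after each step, the total flow of all failed lines is
    # redistributed equally over the surviving lines; any line pushed
    # over its capacity fails in the next step. (Equal redistribution is
    # only a stand-in for a proper DC power flow recomputation.)
    failed = {initial_failure}
    while True:
        alive = [i for i in range(len(flows)) if i not in failed]
        if not alive:
            return failed
        shed = sum(flows[i] for i in failed)
        new = {i for i in alive if flows[i] + shed / len(alive) > capacities[i]}
        if not new:
            return failed
        failed |= new

flows = [4.0, 4.0, 4.0, 1.0]
caps  = [5.0, 5.0, 6.0, 5.0]
dead = cascade(flows, caps, initial_failure=0)
```

In this example a single line failure propagates to a total outage in three steps; sweeping over random initial loads is what yields the cascade-size statistics the paper classifies.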
Inferring Cloud Feedbacks from ARM Continuous Forcing, ISCCP, and ARSCL Data
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Inferring Cloud Feedbacks from ARM Continuous Forcing, ISCCP, and ARSCL Data A. D. Del Genio National Aeronautics and Space Administration Goddard Institute for Space Studies New York, New York A. B. Wolf and M.-S. Yao SGT Inc., Institute for Space Studies New York, New York Introduction Single Column Model (SCM) versions of parent general circulation models (GCMs), accompanied by cloud-resolving models (CRMs) that crudely resolve cloud-scale dynamics, have increasingly been used to simulate
Statistical theory of Coulomb blockade oscillations: Quantum chaos in quantum dots
Jalabert, R.A.; Stone, A.D.; Alhassid, Y. (Center for Theoretical Physics, Sloane Physics Laboratory, Yale University, New Haven, Connecticut 06511 (United States))
1992-06-08
We develop a statistical theory of the amplitude of Coulomb blockade oscillations in semiconductor quantum dots based on the hypothesis that chaotic dynamics in the dot potential leads to behavior described by random-matrix theory. Breaking time-reversal symmetry is predicted to cause an experimentally observable change in the distribution of amplitudes. The theory is tested numerically and good agreement is found.
Statistics and geometry of cosmic voids
Gaite, José
2009-11-01
We introduce new statistical methods for the study of cosmic voids, focusing on the statistics of largest size voids. We distinguish three different types of distributions of voids, namely, Poisson-like, lognormal-like and Pareto-like distributions. The last two distributions are connected with two types of fractal geometry of the matter distribution. Scaling voids with Pareto distribution appear in fractal distributions with box-counting dimension smaller than three (its maximum value), whereas the lognormal void distribution corresponds to multifractals with box-counting dimension equal to three. Moreover, voids of the former type persist in the continuum limit, namely, as the number density of observable objects grows, giving rise to lacunar fractals, whereas voids of the latter type disappear in the continuum limit, giving rise to non-lacunar (multi)fractals. We propose both lacunar and non-lacunar multifractal models of the cosmic web structure of the Universe. A non-lacunar multifractal model is supported by current galaxy surveys as well as cosmological N-body simulations. This model suggests, in particular, that small dark matter halos and, arguably, faint galaxies are present in cosmic voids.
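The distinction between Pareto-like and lognormal-like void-size statistics can be probed with a tail-exponent estimator. The synthetic 'catalogues' below are purely illustrative, not survey data, and the Hill estimator is a standard tool rather than the paper's specific method:

```python
import math, random

random.seed(2)

def hill_exponent(sizes, k=50):
    # Hill estimator of the tail exponent from the k largest sizes;
    # a Pareto-like tail gives a stable finite value, a lognormal-like
    # tail an effectively larger, k-dependent one.
    top = sorted(sizes, reverse=True)[:k + 1]
    return k / sum(math.log(top[i] / top[k]) for i in range(k))

# Synthetic 'void catalogues' (hypothetical draws, 5000 voids each)
pareto_voids = [random.paretovariate(1.5) for _ in range(5000)]
lognormal_voids = [random.lognormvariate(0.0, 1.0) for _ in range(5000)]

alpha_pareto = hill_exponent(pareto_voids)
alpha_lognormal = hill_exponent(lognormal_voids)
```

The heavy Pareto tail, which the paper associates with lacunar fractals, is recovered with an exponent near the generating value of 1.5, while the lognormal catalogue yields a markedly larger effective exponent.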
International energy indicators. [Statistical tables and graphs]
Bauer, E.K.
1980-05-01
International statistical tables and graphs are given for the following: (1) Iran - Crude Oil Capacity, Production and Shut-in, June 1974-April 1980; (2) Saudi Arabia - Crude Oil Capacity, Production, and Shut-in, March 1974-Apr 1980; (3) OPEC (Ex-Iran and Saudi Arabia) - Capacity, Production and Shut-in, June 1974-March 1980; (4) Non-OPEC Free World and US Production of Crude Oil, January 1973-February 1980; (5) Oil Stocks - Free World, US, Japan, and Europe (Landed, 1973-1st Quarter, 1980); (6) Petroleum Consumption by Industrial Countries, January 1973-December 1979; (7) USSR Crude Oil Production and Exports, January 1974-April 1980; and (8) Free World and US Nuclear Generation Capacity, January 1973-March 1980. Similar statistical tables and graphs included for the United States include: (1) Imports of Crude Oil and Products, January 1973-April 1980; (2) Landed Cost of Saudi Oil in Current and 1974 Dollars, April 1974-January 1980; (3) US Trade in Coal, January 1973-March 1980; (4) Summary of US Merchandise Trade, 1976-March 1980; and (5) US Energy/GNP Ratio, 1947 to 1979.
International petroleum statistics report, July 1999
1999-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 44 tabs.
International petroleum statistics report, March 1998
1998-03-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
Revisiting Statistical Aspects of Nuclear Material Accounting
Burr, T.; Hamada, M. S.
2013-01-01
Nuclear material accounting (NMA) is the only safeguards system whose benefits are routinely quantified. Process monitoring (PM) is another safeguards system that is increasingly used, and one challenge is how to quantify its benefit. This paper considers PM in the role of enabling frequent NMA, which is referred to as near-real-time accounting (NRTA). We quantify NRTA benefits using period-driven and data-driven testing. Period-driven testing makes a decision to alarm or not at fixed periods. Data-driven testing decides as the data arrives whether to alarm or continue testing. The difference between period-driven and data-driven viewpoints is illustrated by using one-year and two-year periods. For both one-year and two-year periods, period-driven NMA using once-per-year cumulative material unaccounted for (CUMUF) testing is compared to more frequent Shewhart and joint sequential cusum testing using either MUF or standardized, independently transformed MUF (SITMUF) data. We show that the data-driven viewpoint is appropriate for NRTA and that it can be used to compare safeguards effectiveness. In addition to providing period-driven and data-driven viewpoints, new features include assessing the impact of uncertainty in the estimated covariance matrix of the MUF sequence and the impact of both random and systematic measurement errors.
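The data-driven cusum testing described in this abstract can be sketched in a few lines. The drift allowance `k`, threshold `h`, and the simulated MUF sequence below are hypothetical illustration values, not parameters from the paper; a real NRTA scheme would calibrate them from the measurement-error covariance structure.

```python
# Illustrative one-sided CUSUM test on a simulated MUF (material
# unaccounted for) sequence.  k (drift allowance) and h (alarm
# threshold) are hypothetical tuning values, not from the paper.
import random

def cusum_alarm(muf_sequence, k=0.5, h=4.0):
    """Return the 1-based index of the first alarm, or None."""
    s = 0.0
    for i, x in enumerate(muf_sequence, start=1):
        s = max(0.0, s + x - k)   # accumulate evidence of a positive shift
        if s > h:
            return i              # data-driven: decide as the data arrive
    return None

random.seed(0)
in_control = [random.gauss(0, 1) for _ in range(24)]            # no loss
shifted = in_control + [random.gauss(2, 1) for _ in range(12)]  # loss begins

print(cusum_alarm(in_control))
print(cusum_alarm(shifted))
```

Unlike a once-per-year CUMUF decision, the loop above can raise an alarm at whatever balance period the accumulated evidence first crosses the threshold.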
International petroleum statistics report, October 1997
1997-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International Petroleum Statistics Report, January 1994
Not Available
1994-01-31
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, November 1993
Not Available
1993-11-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
Statistical correlations in the Moshinsky atom
Laguna, H. G.; Sagar, R. P.
2011-07-15
We study the influence of the interparticle and confining potentials on statistical correlation via the correlation coefficient and mutual information in ground and some excited states of the Moshinsky atom in position and momentum space. The magnitude of the correlation between positions and between momenta is equal in the ground state. In excited states, the correlation between the momenta of the particles is greater than between their positions when they interact through an attractive potential whereas for repulsive interparticle potentials the opposite is true. Shannon entropies, and their sums (entropic formulations of the uncertainty principle), are also analyzed, showing that the one-particle entropy sum is dependent on the interparticle potential and thus able to detect the correlation between particles.
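The two statistical-correlation measures this abstract relies on, the correlation coefficient and the mutual information, can be estimated from samples as sketched below. The bivariate-normal sampler is a stand-in assumption for illustration, not the Moshinsky-atom wavefunction, and the histogram binning is a deliberately simple plug-in estimator.

```python
# Hedged sketch: Pearson correlation coefficient and a histogram
# (plug-in) estimate of mutual information between two coordinates.
# The correlated-Gaussian sampler below is an illustrative stand-in,
# not the Moshinsky-atom density from the paper.
import math
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def mutual_information(xs, ys, bins=10):
    """Plug-in MI estimate (in nats) from a 2-D histogram."""
    def idx(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    lox, hix, loy, hiy = min(xs), max(xs), min(ys), max(ys)
    joint = [[0] * bins for _ in range(bins)]
    for x, y in zip(xs, ys):
        joint[idx(x, lox, hix)][idx(y, loy, hiy)] += 1
    n = len(xs)
    px = [sum(row) / n for row in joint]
    py = [sum(joint[i][j] for i in range(bins)) / n for j in range(bins)]
    mi = 0.0
    for i in range(bins):
        for j in range(bins):
            pij = joint[i][j] / n
            if pij > 0:
                mi += pij * math.log(pij / (px[i] * py[j]))
    return mi

random.seed(1)
rho = 0.8  # hypothetical correlation strength
xs = [random.gauss(0, 1) for _ in range(2000)]
ys = [rho * x + math.sqrt(1 - rho**2) * random.gauss(0, 1) for x in xs]
print(pearson(xs, ys), mutual_information(xs, ys))
```

The correlation coefficient only captures linear dependence, while the mutual information is sensitive to any statistical dependence, which is why the paper tracks both in position and momentum space.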
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes the software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
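The statistical-baseline idea in this patent abstract can be sketched as follows. The workload function, sample count, and 3-sigma decision rule are illustrative assumptions, not the patented system's instrumentation or parameters.

```python
# Hedged sketch of timing-fingerprint malware detection: build a
# baseline of execution times for an instrumented function on a
# trusted device, then flag a device whose observed time deviates
# beyond a tolerance.  The workload and 3-sigma rule are assumptions.
import statistics
import time

def instrumented_workload():
    # stand-in for a known software function doing deterministic work
    return sum(i * i for i in range(50_000))

def timed_run(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def build_baseline(fn, runs=20):
    samples = [timed_run(fn) for _ in range(runs)]
    return statistics.mean(samples), statistics.stdev(samples)

def looks_anomalous(observed, base_mean, base_std, sigmas=3.0):
    return abs(observed - base_mean) > sigmas * max(base_std, 1e-9)

base_mean, base_std = build_baseline(instrumented_workload)
print(looks_anomalous(timed_run(instrumented_workload), base_mean, base_std))
```

Injected code that intercepts or augments the workload would lengthen (or otherwise perturb) the measured run time relative to the clean-machine baseline, which is the difference the detector keys on.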
International Petroleum Statistics Report, July 1994
Not Available
1994-07-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993. Data for the United States are developed by the Energy Information Administration's (EIA) Office of Oil and Gas. Data for other countries are derived largely from published sources, including International Energy Agency publications, the EIA International Energy Annual, and the trade press. (See sources after each section.) All data are reviewed by the International Statistics Branch of EIA. All data have been converted to units of measurement familiar to the American public. Definitions of oil production and consumption are consistent with other EIA publications.
Helioseismic Tests of Radiative Opacities.
Guzik, J. A.; Neuforge, C. M.; Keady, J. J.; Magee, N. H.; Bradley, P. A.
2002-01-01
During the past fifteen years, thousands of solar acoustic oscillation modes have been measured to remarkable precision, in many cases to within 0.01%. These frequencies have been used to infer the interior structure of the sun and test the physical input to solar models. Here we summarize the procedures, input physics and assumptions for calculating a standard solar evolution model. We compare the observed and calculated sound speed profile and oscillation frequencies of solar models calibrated using the new Los Alamos LEDCOP and Livermore OPAL Rosseland mean opacities for the same element mixture. We show that solar oscillations are extremely sensitive to opacities, with opacity differences of only a few percent producing an easily detectable effect on the sound speed and predicted frequencies. The oscillation data indicate that agreement would be improved by an opacity increase of several percent below the convection zone for both the LEDCOP and OPAL opacities.
EU Pocketbook - European Vehicle Market Statistics | Open Energy...
Agency/Company/Organization: International Council on Clean Transportation. Website: eupocketbook.theicct.org
Random paths and current fluctuations in nonequilibrium statistical mechanics
Gaspard, Pierre
2014-07-15
An overview is given of recent advances in nonequilibrium statistical mechanics about the statistics of random paths and current fluctuations. Although statistics is carried out in space for equilibrium statistical mechanics, statistics is considered in time or spacetime for nonequilibrium systems. In this approach, relationships have been established between nonequilibrium properties such as the transport coefficients, the thermodynamic entropy production, or the affinities, and quantities characterizing the microscopic Hamiltonian dynamics and the chaos or fluctuations it may generate. This overview presents results for classical systems in the escape-rate formalism, stochastic processes, and open quantum systems.
International petroleum statistics report, April 1997
1997-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, May 1999
1999-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 48 tabs.
International petroleum statistics report, February 1999
1999-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997.
International petroleum statistics report, June 1999
1999-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 46 tabs.
International petroleum statistics report, February 1996
1996-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, July 1998
1998-07-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, December 1998
1998-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, March 1999
1999-03-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, March 1994
1994-03-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, February 1998
1998-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International petroleum statistics report, April 1999
1999-05-04
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, February 1997
1997-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, January 1999
1999-01-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, August 1995
1995-08-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, August 1998
1998-08-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, June 1998
1998-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, August 1994
Not Available
1994-08-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, September 1998
1998-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, March 1995
1995-03-30
The International Petroleum Statistics Report presents data for March 1995 on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, November 1994
Not Available
1994-11-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, April 1998
1998-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, December 1997
1997-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. The balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, November 1998
1998-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, October 1998
1998-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, June 1997
1997-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 46 tabs.
International petroleum statistics report, May 1998
1998-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. It presents data on international production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, September 1995
1995-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994. 4 figs., 45 tabs.
International petroleum statistics report, September 1994
Not Available
1994-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, May 1995
1995-05-30
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1983 through 1993.
International petroleum statistics report, October 1993
Not Available
1993-10-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1982 through 1992.
International petroleum statistics report, December 1993
Not Available
1993-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992. 41 tabs.
International petroleum statistics report, April 1994
Not Available
1994-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1982 through 1992. 41 tables.
International petroleum statistics report, February 1994
Not Available
1994-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
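The probability-level limits quoted in the abstract can be illustrated with a small Monte Carlo sketch. Every number below (factor names, uncertainty magnitudes) is invented for illustration and is not taken from the NBSR analysis; the point is only the mechanics of building a CDF of a combined hot channel factor and reading off a percentile.

```python
import random

# Hypothetical sketch (invented factors and uncertainties, not NBSR values):
# propagate independent multiplicative hot-channel-factor uncertainties by
# Monte Carlo, build the CDF of the combined penalty factor, and read off the
# percentile that sets the required margin at 95% probability.
random.seed(1)

def combined_factor():
    # Assumed independent multiplicative uncertainties, e.g. power peaking,
    # flow distribution, and CHF correlation scatter.
    f_power = random.gauss(1.0, 0.05)
    f_flow = random.gauss(1.0, 0.03)
    f_corr = random.gauss(1.0, 0.10)
    return f_power * f_flow * f_corr

samples = sorted(combined_factor() for _ in range(100_000))

def percentile(sorted_samples, p):
    return sorted_samples[int(p * (len(sorted_samples) - 1))]

# A nominal CHFR above this value keeps the true ratio above 1.0 with 95%
# probability under the assumed uncertainties.
required_chfr = percentile(samples, 0.95)
print(round(required_chfr, 2))
```

The same sorted-sample CDF gives the 90% and 99.9% levels by changing the percentile argument.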
International petroleum statistics report, September 1996
1996-09-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Klenzing, J. H.; Earle, G. D.; Heelis, R. A.; Coley, W. R. [William B. Hanson Center for Space Sciences, University of Texas at Dallas, 800 W. Campbell Rd. WT15, Richardson, Texas 75080 (United States)
2009-05-15
The use of biased grids as energy filters for charged particles is common in satellite-borne instruments such as a planar retarding potential analyzer (RPA). Planar RPAs are currently flown on missions such as the Communications/Navigation Outage Forecasting System and the Defense Meteorological Satellite Program to obtain estimates of geophysical parameters including ion velocity and temperature. It has been shown previously that the use of biased grids in such instruments creates a nonuniform potential in the grid plane, which leads to inherent errors in the inferred parameters. A simulation of ion interactions with various configurations of biased grids has been developed using a commercial finite-element analysis software package. Using a statistical approach, the simulation calculates collected flux from Maxwellian ion distributions with three-dimensional drift relative to the instrument. Perturbations in the performance of flight instrumentation relative to expectations from the idealized RPA flux equation are discussed. Both single grid and dual-grid systems are modeled to investigate design considerations. Relative errors in the inferred parameters for each geometry are characterized as functions of ion temperature and drift velocity.
Full counting statistics of energy fluctuations in a driven quantum resonator
Clerk, A. A.
2011-10-15
We consider the statistics of time-integrated energy fluctuations of a driven bosonic single-mode resonator, as measured by a quantum nondemolition (QND) detector, using the standard Keldysh prescription to define higher moments. We find that, due to an effective cascading of fluctuations, these statistics are surprisingly nonclassical: the low-temperature, quantum probability distribution is not equivalent to the high-temperature classical distribution evaluated at some effective temperature. Moreover, for a sufficiently large drive detuning and low temperatures, the Keldysh-ordered quasiprobability distribution characterizing these fluctuations fails to be positive-definite; this is similar to the full counting statistics of charge in superconducting systems. We argue that this indicates a kind of nonclassical behavior akin to that tested by Leggett-Garg inequalities.
STATISTICAL ANALYSIS OF CURRENT SHEETS IN THREE-DIMENSIONAL MAGNETOHYDRODYNAMIC TURBULENCE
Zhdankin, Vladimir; Boldyrev, Stanislav; Uzdensky, Dmitri A.; Perez, Jean C. E-mail: boldyrev@wisc.edu E-mail: jcperez@wisc.edu
2013-07-10
We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.
The effect of turbulent kinetic energy on inferred ion temperature from neutron spectra
Murphy, T. J.
2014-07-15
Measuring the width of the energy spectrum of fusion-produced neutrons from deuterium (DD) or deuterium-tritium (DT) plasmas is a commonly used method for determining the ion temperature in inertial confinement fusion (ICF) implosions. In a plasma with a Maxwellian distribution of ion energies, the spread in neutron energy arises from the thermal spread in the center-of-mass velocities of reacting pairs of ions. Fluid velocities in ICF are of a similar magnitude as the center-of-mass velocities and can lead to further broadening of the neutron spectrum, leading to erroneous inference of ion temperature. Motion of the reacting plasma will affect DD and DT neutrons differently, leading to disagreement between ion temperatures inferred from the two reactions. This effect may be a contributor to observations over the past decades of ion temperatures higher than expected from simulations, ion temperatures in disagreement with observed yields, and different temperatures measured in the same implosion from DD and DT neutrons. This difference in broadening of DD and DT neutrons also provides a measure of turbulent motion in a fusion plasma.
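The broadening argument in the abstract can be sketched numerically: the measured neutron spectral variance is approximately the thermal variance plus a fluid-motion Doppler term added in quadrature, so inverting the thermal-width relation alone inflates the inferred temperature. The FWHM coefficient below is the commonly quoted Brysk value for DT neutrons; the flow velocity is an assumed illustrative number, not data from the paper.

```python
import math

# Thermal width relation for DT neutrons (Brysk): FWHM ~ 177 * sqrt(T_keV) keV.
FWHM_COEFF_DT = 177.0
SIGMA_FACTOR = 2.0 * math.sqrt(2.0 * math.log(2.0))  # FWHM = 2.355 * sigma

M_N = 939_565.0   # neutron rest mass, keV/c^2
E_N = 14_100.0    # DT neutron energy, keV
C = 2.998e8       # speed of light, m/s

def inferred_temperature(t_true_kev, sigma_flow_ms):
    """Temperature inferred from the total width when flow broadening is ignored."""
    sigma_th = FWHM_COEFF_DT * math.sqrt(t_true_kev) / SIGMA_FACTOR
    v_n = C * math.sqrt(2.0 * E_N / M_N)              # neutron speed, m/s
    sigma_fl = M_N * (v_n / C) * (sigma_flow_ms / C)  # Doppler term, keV
    sigma_total = math.hypot(sigma_th, sigma_fl)      # add in quadrature
    return (SIGMA_FACTOR * sigma_total / FWHM_COEFF_DT) ** 2

# With no flow the inversion recovers the true temperature; with an assumed
# 100 km/s of line-of-sight fluid motion the inferred value is inflated.
print(round(inferred_temperature(3.0, 0.0), 2))
print(round(inferred_temperature(3.0, 1.0e5), 2))
```

Because the thermal coefficient and the neutron speed both differ between DD and DT, the same fluid motion produces different apparent temperatures for the two reactions, which is the diagnostic the abstract proposes.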
Statistics of particle time-temperature histories.
Hewson, John C.; Lignell, David O.; Sun, Guangyuan
2014-10-01
Particles in non-isothermal turbulent flow are subject to a stochastic environment that produces a distribution of particle time-temperature histories. This distribution is a function of the dispersion of the non-isothermal (continuous) gas phase and the distribution of particles relative to that gas phase. In this work we extend the one-dimensional turbulence (ODT) model to predict the joint dispersion of a dispersed particle phase and a continuous phase. The ODT model predicts the turbulent evolution of continuous scalar fields with a model for the cascade of fluctuations to smaller scales (the 'triplet map') at a rate that is a function of the fully resolved one-dimensional velocity field. Stochastic triplet maps also drive Lagrangian particle dispersion with finite Stokes numbers, including inertial and eddy trajectory-crossing effects. Two distinct approaches to this coupling between triplet maps and particle dispersion are developed and implemented along with a hybrid approach. An 'instantaneous' particle displacement model matches the tracer particle limit and provides an accurate description of particle dispersion. A 'continuous' particle displacement model translates triplet maps into a continuous velocity field to which particles respond. Particles can alter the turbulence, and modifications to the stochastic rate expression are developed for two-way coupling between particles and the continuous phase. Each aspect of model development is evaluated in canonical flows (homogeneous turbulence, free-shear flows, and wall-bounded flows) for which quality measurements are available. ODT simulations of non-isothermal flows provide statistics for particle heating. These simulations show the significance of accurately predicting the joint statistics of particle and fluid dispersion. Inhomogeneous turbulence coupled with the influence of the mean flow fields on particles of varying properties alters particle dispersion.
The joint particle-temperature dispersion leads to a distribution of temperature histories predicted by the ODT. Predictions are shown for the lower moments and the full distributions of the particle positions, particle-observed gas temperatures, and particle temperatures. An analysis of the time scales affecting particle-temperature interactions covers Lagrangian integral time scales based on temperature autocorrelations, rates of temperature change associated with particle motion relative to the temperature field, and rates of diffusional change of temperatures. These latter two time scales have not been investigated previously; they are shown to be strongly intermittent, having peaked distributions with long tails. The logarithm of the absolute value of these time scales exhibits a distribution closer to normal. Acknowledgements: This work is supported by the Defense Threat Reduction Agency (DTRA) under their Counter-Weapons of Mass Destruction Basic Research Program in the area of Chemical and Biological Agent Defeat under award number HDTRA1-11-4503I to Sandia National Laboratories. The authors would like to express their appreciation for the guidance provided by Dr. Suhithi Peiris to this project and to the Science to Defeat Weapons of Mass Destruction program.
Statistical theory of turbulent incompressible multimaterial flow
Kashiwa, B.
1987-10-01
Interpenetrating motion of incompressible materials is considered. "Turbulence" is defined as any deviation from the mean motion. Accordingly a nominally stationary fluid will exhibit turbulent fluctuations due to a single, slowly moving sphere. Mean conservation equations for interpenetrating materials in arbitrary proportions are derived using an ensemble averaging procedure, beginning with the exact equations of motion. The result is a set of conservation equations for the mean mass, momentum and fluctuational kinetic energy of each material. The equation system is at first unclosed due to integral terms involving unknown one-point and two-point probability distribution functions. In the mean momentum equation, the unclosed terms are clearly identified as representing two physical processes. One is transport of momentum by multimaterial Reynolds stresses, and the other is momentum exchange due to pressure fluctuations and viscous stress at material interfaces. Closure is approached by combining careful examination of multipoint statistical correlations with the traditional physical technique of kappa-epsilon modeling for single-material turbulence. This involves representing the multimaterial Reynolds stress for each material as a turbulent viscosity times the rate of strain based on the mean velocity of that material. The multimaterial turbulent viscosity is related to the fluctuational kinetic energy kappa, and the rate of fluctuational energy dissipation epsilon, for each material. Hence a set of kappa and epsilon equations must be solved, together with mean mass and momentum conservation equations, for each material. Both kappa and the turbulent viscosities enter into the momentum exchange force. The theory is applied to (a) calculation of the drag force on a sphere fixed in a uniform flow, (b) calculation of the settling rate in a suspension and (c) calculation of velocity profiles in the pneumatic transport of solid particles in a pipe.
Accelerated Stress Testing, Qualification Testing, HAST, Field...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
This presentation, which was the ...
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
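As a small illustration of the report's theme (with made-up intervals), descriptive statistics on interval data return bounds rather than points: the mean bounds follow from averaging endpoints, and the variance maximum can be found by enumerating endpoint combinations because the variance is convex in each coordinate. The variance minimum is deliberately omitted here, since it can be attained inside the intervals (it is zero whenever all intervals share a common point), which is part of the computability discussion the report summarizes.

```python
from itertools import product
from statistics import mean, pvariance

# Interval-valued measurements as (low, high) pairs; values are invented.
data = [(1.0, 1.2), (2.3, 2.5), (0.9, 1.6), (2.0, 2.2)]

# Bounds on the sample mean: average the endpoints.
mean_lo = mean(lo for lo, _ in data)
mean_hi = mean(hi for _, hi in data)

# Upper bound on the population variance: the variance is convex in each
# coordinate, so its maximum over the box of endpoint choices is attained at
# a vertex. Vertex enumeration is exponential in sample size, hence only
# feasible for tiny samples; the report reviews when faster algorithms exist.
var_hi = max(pvariance(choice) for choice in product(*data))

print((round(mean_lo, 3), round(mean_hi, 3)))
print(round(var_hi, 4))
```

The same enumeration idea extends to other convex statistics, while order statistics such as the median bound directly from the sorted endpoint lists.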
Statistical Performance Evaluation Of Soft Seat Pressure Relief Valves
Harris, Stephen P.; Gross, Robert E.
2013-03-26
Risk-based inspection methods enable estimation of the probability of failure on demand for spring-operated pressure relief valves at the United States Department of Energy's Savannah River Site in Aiken, South Carolina. This paper presents a statistical performance evaluation of soft seat spring-operated pressure relief valves. These valves are typically smaller and of lower cost than hard seat (metal-to-metal) pressure relief valves and can provide substantial cost savings in fluid service applications (air, gas, liquid, and steam), provided that their probability of failure on demand (the probability that the pressure relief valve fails to perform its intended safety function during a potentially dangerous overpressurization) is at least as good as that of hard seat valves. The research in this paper shows that the proportion of soft seat spring-operated pressure relief valves failing is the same as or less than that of hard seat valves, and that, among failed valves, soft seat valves typically have lower ratios of proof test pressure to set pressure than hard seat valves.
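The proportion comparison described in the abstract is, in its simplest form, a two-proportion test. The sketch below uses invented counts, not Savannah River data, and a normal-approximation pooled test as a stand-in for whatever exact method the paper employs.

```python
import math

def one_sided_z(fail_a, n_a, fail_b, n_b):
    """z-statistic for H1: p_a > p_b using a pooled proportion estimate."""
    p_a, p_b = fail_a / n_a, fail_b / n_b
    pooled = (fail_a + fail_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical proof-test tallies: 4 failures in 120 soft seat valves versus
# 9 failures in 200 hard seat valves.
z = one_sided_z(4, 120, 9, 200)

# A z-statistic below the one-sided 5% critical value of 1.645 gives no
# evidence that the soft seat failure proportion exceeds the hard seat one.
print(round(z, 3))
```

With these invented counts the statistic is negative, i.e. the observed soft seat failure rate is actually lower, mirroring the qualitative conclusion quoted in the abstract.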
A statistical approach to designing mitigation for induced ac voltages
Dabkowski, J. [Electro Sciences, Inc., Crystal Lake, IL (United States)]
1996-08-01
Induced voltage levels on buried pipelines collocated with overhead electric power transmission lines are usually mitigated by means of grounding the pipeline. Maximum effectiveness is obtained when grounds are placed at discrete locations along the pipeline where the peak induced voltages occur. The degree of mitigation achieved is dependent upon the local soil resistivity at these locations. On occasion it may be necessary to employ an extensive distributed grounding system, for example, a parallel buried wire connected to the pipe at periodic intervals. In this situation the a priori calculation of mitigated voltage levels is sometimes made assuming an average value for the soil resistivity. Over long distances, however, the soil resistivity generally varies as a log-normally distributed random variable. The effect of this variability upon the predicted mitigated voltage levels is examined. It is found that the predicted levels exhibit a statistical variability which precludes a precise determination of the mitigated voltage levels. Thus, post commissioning testing of the emplaced mitigation system is advisable.
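The statistical variability the paper describes can be illustrated with a Monte Carlo sketch: soil resistivity is drawn from a log-normal distribution, the ground-bed resistance scales with resistivity, and the mitigated voltage rises with that resistance. The voltage-divider model and every constant below (K, V0, Z, and the log-normal parameters) are hypothetical illustration values, not from the paper:

```python
# Monte Carlo sketch of mitigated-voltage variability under
# log-normally distributed soil resistivity. All constants hypothetical.
import math
import random
import statistics

random.seed(0)
V0, Z, K = 100.0, 5.0, 0.02            # source volts, source ohms, ohm·m -> ohm
MU, SIGMA = math.log(100.0), 0.8        # log-normal parameters for rho (ohm·m)

def mitigated_voltage(rho):
    rg = K * rho                        # ground resistance grows with resistivity
    return V0 * rg / (rg + Z)           # simple voltage-divider approximation

samples = [mitigated_voltage(random.lognormvariate(MU, SIGMA))
           for _ in range(10_000)]
spread = statistics.stdev(samples)      # large spread: no single precise prediction
```

The wide spread of `samples` mirrors the paper's point: an average-resistivity calculation cannot pin down the mitigated voltage, so post-commissioning testing is advisable.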
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates the development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load monitoring. In this two-part paper, inspired by the proliferation of metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The techniques developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
Doppler Lidar Vertical Velocity Statistics Value-Added Product (Technical Report)
Office of Scientific and Technical Information (OSTI)
Accurate height-resolved measurements of higher-order statistical moments of vertical velocity fluctuations are crucial for improved understanding of turbulent mixing and diffusion, convective initiation, and cloud life cycles. The Atmospheric Radiation Measurement (ARM) Climate Research Facility operates coherent Doppler lidar systems at several sites around the globe.
Nonlinearity sensing via photon-statistics excitation spectroscopy
Assmann, Marc; Bayer, Manfred
2011-11-15
We propose photon-statistics excitation spectroscopy as an adequate tool to describe the optical response of a nonlinear system. To this end we suggest to use optical excitation with varying photon statistics as another spectroscopic degree of freedom to gather information about the system in question. The responses of several simple model systems to excitation beams with different photon statistics are discussed. Possible spectroscopic applications in terms of identifying lasing operation are pointed out.
User Statistics | U.S. DOE Office of Science (SC)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The map and source data contain information regarding FY 2014 user projects at the Office of Science user facilities.
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS (Technical Report)
Office of Scientific and Technical Information (OSTI)
The research under this project focused on theoretical and computational modeling of the dislocation dynamics of mesoscale deformation in metal single crystals. Specifically, the work aimed to implement a continuum statistical theory of dislocations to understand...
Shirazi, M.; Kroposki, B.
2012-01-01
With the publication of IEEE Std 1547.4, Guide for Design, Operation, and Integration of Distributed Resource Island Systems with Electric Power Systems, there is an increasing amount of attention not only on the design and operation of microgrids, but also on the proper testing of these systems. This standard provides alternative approaches and good practices for the design, operation, and integration of microgrids, including the ability to separate from and reconnect to part of the utility grid while providing power to the islanded power system. This presentation addresses the industry need to develop standardized testing and evaluation procedures for microgrids in order to assure quality operation in the grid-connected and islanded modes of operation.
Testing and Deployment of an Infrared Thermometer Network at the ARM Southern Great Plains Climate Research Facility
Morris, Victor (Pacific Northwest National Laboratory); Long, Chuck (Pacific Northwest National Laboratory)
To increase our ability to calculate heating rate profiles, to study the variability across the Global Climate Model scale area, and for inferring information about distribution and...
Forest Products (2010 MECS)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Manufacturing Energy and Carbon Footprint for Forest Products Sector (NAICS 321, 322). Energy use data source: 2010 EIA MECS (with adjustments). Footprint last revised: February 2014.
UNECE-Annual Bulletin of Transport Statistics for Europe and...
"Data covers Europe, Canada and the United States. This is a trilingual publication in English, French and Russian. This annual publication presents statistics and brief studies..."
WHO Statistical Information System (WHOSIS) | Open Energy Information
Classification of Diseases (ICD-10), International Classification of Impairments, Disabilities and Handicaps (ICIDH) Links to other sources of health-related statistical...
Experimental and Statistical Comparison of Engine Response as a Function of Fuel Chemistry and Properties in CI and HCCI Engines
Office of Scientific and Technical Information (OSTI)
Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique
Office of Scientific and Technical Information (OSTI)
UN-Glossary for Transportation Statistics | Open Energy Information
Publications Website: www.internationaltransportforum.orgPubpdfGloStat3e.pdf Cost: Free UN-Glossary for Transportation Statistics Screenshot References: UN-Glossary for...
Statistical surrogate models for prediction of high-consequence climate change
Office of Scientific and Technical Information (OSTI)
Key World Energy Statistics-2010 | Open Energy Information
World Energy Statistics-2010 AgencyCompany Organization: International Energy Agency Sector: Energy Topics: Market analysis Resource Type: Dataset, Maps Website: www.iea.org...
Physics-based statistical learning approach to mesoscopic model...
Office of Scientific and Technical Information (OSTI)
BP Statistical Review of World Energy | Open Energy Information
The BP Statistical Review of World Energy is an Excel spreadsheet which contains consumption and production data for Coal, Natural Gas, Nuclear, Oil, and Hydroelectric...
International Monetary Fund-Data and Statistics | Open Energy...
"The IMF publishes a range of time series data on IMF lending, exchange rates and other economic and financial indicators. Manuals, guides, and other material on statistical...
IRF-World Road Statistics | Open Energy Information
AgencyCompany Organization: International Road Statistics Focus Area: Transportation, Economic Development Resource Type: Dataset Website: www.irfnet.orgstatistics.php Cost:...
An overview of component qualification using Bayesian statistics and energy methods
Office of Scientific and Technical Information (OSTI)
Example problems with solutions have been supplied as a learning aid.
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS
El-Azab, Anter
Keywords: dislocation dynamics; mesoscale deformation of metals; crystal mechanics...
TITLE V-CONFIDENTIAL INFORMATION PROTECTION AND STATISTICAL EFFI...
Gasoline and Diesel Fuel Update (EIA)
or maintain the systems for handling or storage of data received under this title; and ... data anomalies, produce statistical samples that are consistently adjusted for the ...
Autocorrelation Function Statistics and Implication to Decay Ratio Estimation
March-Leuba, Jose A.
2016-01-01
This document summarizes the results of a series of computer simulations intended to identify the statistics of the autocorrelation function and their implications for decay ratio estimation.
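A minimal sketch of the kind of estimate involved (not the report's simulations): the decay ratio taken as the ratio of successive maxima of the autocorrelation function of a damped oscillation. The signal parameters are illustrative; for these values the ratio should come out near exp(-ζT) ≈ 0.6 per period.

```python
# Sketch: decay ratio from ACF peaks of a damped cosine. Parameters illustrative.
import math

N, DT = 2000, 0.01
ZETA, OMEGA = 0.5, 2 * math.pi          # damping rate (1/s), frequency (rad/s)
y = [math.exp(-ZETA * i * DT) * math.cos(OMEGA * i * DT) for i in range(N)]

def acf(x, lag):
    """Biased sample autocorrelation at the given lag."""
    n = len(x) - lag
    return sum(x[i] * x[i + lag] for i in range(n)) / n

r = [acf(y, k) for k in range(260)]     # enough lags for two oscillation periods
# local maxima of the ACF (skip lag 0, the global maximum)
peaks = [k for k in range(1, len(r) - 1) if r[k - 1] < r[k] > r[k + 1]]
decay_ratio = r[peaks[1]] / r[peaks[0]]  # ratio of successive ACF peaks
```

In a real application the signal would be a noisy measured process, and the statistics of `decay_ratio` across realizations are exactly what the report characterizes.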
Characterization of U.S. Wave Energy Converter Test Sites: A...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
planning WEC tests, including the planning of deployment and operations and maintenance. ... of weather windows and extreme sea states, and statistics on wind and ocean currents. ...
The Network Completion Problem: Inferring Missing Nodes and Edges in Networks
Kim, M; Leskovec, J
2011-11-14
Network structures, such as social networks, web graphs, and networks from systems biology, play important roles in many areas of science and in our everyday lives. In order to study these networks, one needs first to collect reliable large-scale network data. While social and information networks have become ubiquitous, the challenge of collecting complete network data still persists. Often the collected network data is incomplete, with nodes and edges missing. Commonly, only a part of the network can be observed, and we would like to infer the unobserved part. We address this issue by studying the Network Completion Problem: given a network with missing nodes and edges, can we complete the missing part? We cast the problem in the Expectation Maximization (EM) framework, where we use the observed part of the network to fit a model of network structure, then estimate the missing part of the network using the model, re-estimate the parameters, and so on. We combine EM with the Kronecker graphs model and design a scalable Metropolized Gibbs sampling approach that allows for the estimation of the model parameters as well as inference about the missing nodes and edges of the network. Experiments on synthetic and several real-world networks show that our approach can effectively recover the network even when about half of the nodes are missing. Our algorithm outperforms not only classical link-prediction approaches but also the state-of-the-art stochastic block modeling approach. Furthermore, our algorithm easily scales to networks with tens of thousands of nodes.
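For reference, one of the classical link-prediction baselines the paper compares against can be sketched in a few lines: score each non-edge by its number of common neighbors and propose the top-scoring pair as the missing edge. The toy graph below is hypothetical; this is the baseline, not the paper's EM/Kronecker method.

```python
# Toy common-neighbors link prediction on a small hypothetical graph.
from itertools import combinations

edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4)}
nodes = {u for e in edges for u in e}
adj = {u: set() for u in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def common_neighbors(u, v):
    """Number of neighbors shared by u and v."""
    return len(adj[u] & adj[v])

# score all non-edges; the highest score is the most likely missing edge
non_edges = [p for p in combinations(sorted(nodes), 2)
             if p not in edges and (p[1], p[0]) not in edges]
best = max(non_edges, key=lambda p: common_neighbors(*p))
```

Here nodes 0 and 3 share two neighbors, so (0, 3) is proposed; model-based methods like the paper's can do better when whole nodes, not just edges, are missing.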
Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields
John A. Krommes
2001-02-16
A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. 
Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.
A NEW METHOD TO CORRECT FOR FIBER COLLISIONS IN GALAXY TWO-POINT STATISTICS
Guo Hong; Zehavi, Idit; Zheng Zheng
2012-09-10
In fiber-fed galaxy redshift surveys, the finite size of the fiber plugs prevents two fibers from being placed too close to one another, limiting the ability to study galaxy clustering on all scales. We present a new method for correcting such fiber collision effects in galaxy clustering statistics based on spectroscopic observations. The target galaxy sample is divided into two distinct populations according to the targeting algorithm of fiber placement, one free of fiber collisions and the other consisting of collided galaxies. The clustering statistics are a combination of the contributions from these two populations. Our method makes use of observations in tile overlap regions to measure the contributions from the collided population, and to therefore recover the full clustering statistics. The method is rooted in solid theoretical ground and is tested extensively on mock galaxy catalogs. We demonstrate that our method can well recover the projected and the full three-dimensional (3D) redshift-space two-point correlation functions (2PCFs) on scales both below and above the fiber collision scale, superior to the commonly used nearest neighbor and angular correction methods. We discuss potential systematic effects in our method. The statistical correction accuracy of our method is only limited by sample variance, which scales down with (the square root of) the volume probed. For a sample similar to the final SDSS-III BOSS galaxy sample, the statistical correction error is expected to be at the level of 1% on scales ~0.1-30 h^-1 Mpc for the 2PCFs. The systematic error only occurs on small scales, caused by imperfect correction of collision multiplets, and its magnitude is expected to be smaller than 5%. Our correction method, which can be generalized to other clustering statistics as well, enables more accurate measurements of full 3D galaxy clustering on all scales with galaxy redshift surveys.
Infinite statistics condensate as a model of dark matter
Ebadi, Zahra; Mirza, Behrouz; Mohammadzadeh, Hosein E-mail: b.mirza@cc.iut.ac.ir
2013-11-01
In some models, dark matter is considered as a condensate bosonic system. In this paper, we prove that condensation is also possible for particles that obey infinite statistics and derive the critical condensation temperature. We argue that a condensed state of a gas of very weakly interacting particles obeying infinite statistics could be considered as a consistent model of dark matter.
Sub-Poissonian statistics in order-to-chaos transition
Kryuchkyan, Gagik Yu. [Yerevan State University, Manookyan 1, Yerevan 375049, (Armenia); Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia); Manvelyan, Suren B. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia)
2003-07-01
We study the phenomena at the overlap of quantum chaos and nonclassical statistics for a time-dependent model of a nonlinear oscillator. It is shown in the framework of the Mandel Q parameter and the Wigner function that the statistics of oscillatory excitation numbers is drastically changed in the order-to-chaos transition. An essential improvement of sub-Poissonian statistics, in comparison with the analogous one for the standard model of a driven anharmonic oscillator, is observed for the regular operational regime. It is shown that in the chaotic regime, the system exhibits ranges of sub-Poissonian and super-Poissonian statistics which alternate with one another depending on the time interval. An unusual dependence of the variance of the oscillatory number on the external noise level is observed for the chaotic dynamics. The scaling invariance of the quantum statistics is demonstrated and its relation to dissipation and decoherence is studied.
Dana L. Kelly; Albert Malkhasyan
2010-06-01
There is a nearly ubiquitous assumption in PSA that parameter values are at least piecewise-constant in time. As a result, Bayesian inference tends to incorporate many years of plant operation, over which there have been significant changes in plant operational and maintenance practices, plant management, etc. These changes can cause significant changes in parameter values over time; however, failure to perform Bayesian inference in the proper time-dependent framework can mask these changes. Failure to question the assumption of constant parameter values, and failure to perform Bayesian inference in the proper time-dependent framework, were noted as important issues in NUREG/CR-6813, performed for the U.S. Nuclear Regulatory Commission's Advisory Committee on Reactor Safeguards in 2003. That report noted that industry lacks tools to perform time-trend analysis with Bayesian updating. This paper describes an application of time-dependent Bayesian inference methods developed for the European Commission Ageing PSA Network. These methods utilize open-source software implementing Markov chain Monte Carlo sampling. The paper also illustrates the development of a generic prior distribution, which incorporates multiple sources of generic data via weighting factors that address differences in key influences, such as vendor, component boundaries, conditions of the operating environment, etc.
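The time-trend idea can be sketched with a tiny random-walk Metropolis sampler: yearly failure counts are modeled as Poisson with a log-linear trend lambda_t = exp(a + b*t), and the posterior of the trend slope b is explored by MCMC. This is a hedged illustration, not the European Commission Ageing PSA Network code; the counts, priors, and step sizes are arbitrary choices.

```python
# Sketch: Bayesian time-trend inference for a Poisson failure rate
# via random-walk Metropolis. All data and tuning values hypothetical.
import math
import random

random.seed(1)
counts = [9, 7, 8, 5, 4, 4, 2, 3, 1, 2]        # hypothetical events per year

def log_post(a, b):
    """Log posterior: vague normal priors + Poisson log-likelihood."""
    lp = -(a * a + b * b) / (2 * 10.0 ** 2)
    for t, k in enumerate(counts):
        lp += k * (a + b * t) - math.exp(a + b * t)   # drop constant log(k!)
    return lp

a, b = 2.0, 0.0
lp = log_post(a, b)
bs = []
for i in range(20000):
    a2 = a + random.gauss(0, 0.1)               # random-walk proposals
    b2 = b + random.gauss(0, 0.02)
    lp2 = log_post(a2, b2)
    if math.log(random.random()) < lp2 - lp:    # Metropolis accept/reject
        a, b, lp = a2, b2, lp2
    if i >= 5000:                               # discard burn-in
        bs.append(b)

b_mean = sum(bs) / len(bs)   # negative => decreasing failure rate over time
```

A constant-rate analysis of the same counts would average away exactly the downward trend that the posterior on b exposes, which is the masking effect the paper warns about.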
Statistical anisotropies in gravitational waves in solid inflation
Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi E-mail: emami@ipm.ir E-mail: yw366@cam.ac.uk
2014-09-01
Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can either be positive or negative, which impacts the statistical anisotropies of the TT and TB spectra in CMB map more significantly compared with the tensor self-correlation. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, which is different from the power spectra of scalar, the cross-correlation or the scalar bispectrum, where the quadrupole type statistical anisotropy dominates over octopole.
Statistical mechanics based on fractional classical and quantum mechanics
Korichi, Z.; Meftah, M. T.
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamical properties of the classical ideal gas and a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space coordinates (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation, study Bose-Einstein statistics with the related problem of condensation, and study Fermi-Dirac statistics.
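As context for the condensation result, the standard (non-fractional) ideal Bose gas benchmark that the fractional treatment generalizes can be written as follows; this is the textbook result, not the paper's fractional formula:

```latex
% Standard ideal-Bose-gas benchmark: condensation sets in when the thermal
% de Broglie wavelength becomes comparable to the interparticle spacing.
n \lambda_T^3 = \zeta(3/2), \qquad
\lambda_T = \frac{h}{\sqrt{2\pi m k_B T}}
\;\;\Longrightarrow\;\;
T_c = \frac{2\pi \hbar^2}{m k_B}
      \left( \frac{n}{\zeta(3/2)} \right)^{2/3}
```

In the fractional setting, the exponents in the Hamiltonian modify the density of states, and hence the power of T and the zeta-function argument in this relation.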
Techniques in teaching statistics : linking research production and research use.
Martinez-Moyano, I .; Smith, A.
2012-01-01
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Novichkov, Pavel S.; Rodionov, Dmitry A.; Stavrovskaya, Elena D.; Novichkova, Elena S.; Kazakov, Alexey E.; Gelfand, Mikhail S.; Arkin, Adam P.; Mironov, Andrey A.; Dubchak, Inna
2010-05-26
The RegPredict web server provides comparative genomics tools for the reconstruction and analysis of microbial regulons. The server allows the user to rapidly generate reference sets of regulons and regulatory motif profiles in a group of prokaryotic genomes. The new concept of a cluster of co-regulated orthologous operons allows the user to distribute the analysis of large regulons and to perform the comparative analysis of multiple clusters independently. Two major workflows currently implemented in RegPredict are: (i) regulon reconstruction for a known regulatory motif and (ii) ab initio inference of a novel regulon using several scenarios for the generation of starting gene sets. RegPredict provides a comprehensive collection of manually curated positional weight matrices of regulatory motifs. It is based on genomic sequences and on ortholog and operon predictions from MicrobesOnline. An interactive web interface integrates and presents diverse genomic and functional information about the candidate regulon members from several web resources. RegPredict is freely accessible at http://regpredict.lbl.gov.
Characteristics of surface current flow inferred from a global ocean current data set
Meehl, G.A.
1982-06-01
A seasonal global ocean-current data set (OCDS) digitized on a 5° grid from long-term mean shipdrift-derived currents from pilot charts is presented and described. Annual zonal means of v-component currents show subtropical convergence zones which moved closest to the equator during the respective winters in each hemisphere. Net annual v-component surface flow at the equator is northward. Zonally averaged u-component currents have greatest seasonal variance in the tropics, with strongest westward currents in the winter hemisphere. An ensemble of ocean currents measured by buoys and current meters compares favorably with OCDS data in spite of widely varying time and space scales. The OCDS currents and directly measured currents are about twice as large as computed geostrophic currents. An analysis of equatorial Pacific currents suggests that dynamic topography and sea-level change indicative of the geostrophic flow component cannot be relied on solely to infer the absolute strength of surface currents which include a strong Ekman component. Comparison of OCDS v-component currents and meridional transports predicted by Ekman theory shows agreement in the sign of transports in the midlatitudes and tropics in both hemispheres. Ekman depths required to scale OCDS v-component currents to computed Ekman transports are reasonable at most latitudes, with layer depths deepening closer to the equator.
Roberto, Baccoli; Ubaldo, Carlini; Stefano, Mariotti; Roberto, Innamorati; Elisa, Solinas; Paolo, Mura
2010-06-15
This paper deals with the development of methods for non-steady-state testing of solar thermal collectors. Our goal is to infer performance in steady-state conditions, in terms of the efficiency curve, when measurements in transient conditions are the only ones available. We take into consideration the method of identification of a system in dynamic conditions by applying a Graybox Identification Model and a Dynamic Adaptive Linear Neural Network (ALNN) model. The study targets solar collectors with evacuated pipes, such as Dewar pipes. The mathematical description that governs the functioning of the solar collector in transient conditions is developed using the equation of the energy balance, with the aim of determining the order and architecture of the two models. The input and output vectors of the two models are constructed considering 4 days of measurements of solar radiation, mass flow, and environment and heat-transfer fluid temperature at the inlet and outlet of the thermal solar collector. The efficiency curves derived from the two models are evaluated at the test and validation points. The two synthetic simulated efficiency curves are compared with the actual efficiency curve certified by the Swiss Institut für Solartechnik Prüfung Forschung (SPF), which tested the solar collector performance in steady-state conditions according to the UNI-EN 12975 standard. An acquisition set of measurements of only 4 days in the transient condition was enough to trace, through a Graybox State Space Model, the efficiency curve of the tested solar thermal collector, with a relative error of synthetic values with respect to the efficiency certified by SPF lower than 0.5%, while with the ALNN model the error is lower than 2.2% with respect to the certified one. (author)
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Newsom, RK; Sivaraman, C; Shippert, TR; Riihimaki, LD
July 2015
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
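The report's lattice examples center on chi-square minimization fits of correlation functions. The report uses R; the sketch below is a Python analogue with hypothetical, noiseless synthetic data, fitting the canonical single-exponential correlator form C(t) = A·exp(-m·t) by a coarse grid search standing in for a real minimizer.

```python
# Sketch: chi-square minimization of a single-exponential "correlator" fit.
# Synthetic noiseless data with assumed 5% errors; all numbers illustrative.
import math

ts = list(range(1, 9))
A_true, m_true = 1.5, 0.4
data = [A_true * math.exp(-m_true * t) for t in ts]   # noiseless for clarity
err = [0.05 * c for c in data]                        # assumed 5% errors

def chi2(A, m):
    """Chi-square of the model A*exp(-m*t) against the data."""
    return sum(((d - A * math.exp(-m * t)) / e) ** 2
               for t, d, e in zip(ts, data, err))

# coarse grid search over (A, m); a real analysis would use a minimizer
best = min(((chi2(A / 100, m / 100), A / 100, m / 100)
            for A in range(100, 201) for m in range(20, 61)),
           key=lambda x: x[0])
chi2_min, A_fit, m_fit = best
```

With noiseless data the grid recovers the generating parameters exactly; with real lattice data one would also inspect chi2_min per degree of freedom and propagate the errors.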
The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release
Office of Scientific and Technical Information (OSTI)
UN-Energy Statistics Database | Open Energy Information
Resource Type: Dataset; Website: data.un.org; Cost: Free; Language: English. Covers PV and wind resource data.
A Divergence Statistics Extension to VTK for Performance Analysis.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-02-01
This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Toolkit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
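As an aside, the idea of a divergence statistic can be made concrete with a small sketch. The VTK engine itself is C++; the Python example below computes one common divergence, the Kullback-Leibler divergence, between an invented empirical sample and a theoretical uniform distribution:

```python
import math
import random

random.seed(0)
# Observed sample (invented for illustration) over 4 categories, and a
# theoretical "ideal" uniform distribution over the same categories.
sample = [random.randint(0, 3) for _ in range(10000)]
p = [sample.count(k) / len(sample) for k in range(4)]  # empirical
q = [0.25, 0.25, 0.25, 0.25]                           # theoretical

# Kullback-Leibler divergence D(p || q): zero iff the distributions agree,
# and it grows with the discrepancy between them.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

For a uniform sample of this size the divergence is close to zero; a mismatched sample would drive it up, which is exactly the "distance-like" behavior the abstract describes.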
An overview of component qualification using Bayesian statistics and energy methods (Technical Report)
Office of Scientific and Technical Information (OSTI)
The overview is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation and a basic understanding of some of the mathematical tools used to evaluate the quality of an estimation...
Evaluation of cirrus statistics produced by general circulation models using ARM data
Hartsock, Daniel; Mace, Gerald; Benson, Sally (University of Utah)
Our goal is to evaluate the skill of various general circulation models at producing climatological cloud statistics by comparing them to the cirrus climatology compiled over the Southern Great Plains (SGP) ARM site. This evaluation includes quantifying similar cloud properties...
Final Report on Statistical Debugging for Petascale Environments (Technical Report)
Liblit, B
2013-01-18
Report Number: LLNL-SR-612077; Lawrence Livermore National Laboratory (LLNL), Livermore, CA.
Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report (Technical Report)
Petascale platforms with O(10{sup 5}) and O(10{sup 6}) processing cores are driving advancements in a wide range of scientific...
Mathematical and Statistical Opportunities in Cyber Security (Technical Report)
The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question "What fundamental problems exist..."
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change (Conference)
Field, Richard V.; Boslough, Mark B. E.; Constantine, Paul
2011-10-01
Report Number: SAND2011-8231C.
Statistical characteristics of cloud variability. Part 2: Implication for parameterizations of microphysical and radiative transfer processes in climate models (Journal Article)
Statistical surrogate models for prediction of high-consequence climate change (Technical Report)
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on...
Physics-based statistical learning approach to mesoscopic model selection (Journal Article)
Taverniers, Søren; Haut, Terry S.; Barros, Kipton; Alexander, Francis J.; Lookman, Turab
2015-11-09
Statistical Performance Evaluation of Spring Operated Pressure Relief Valve Reliability Improvements 2004 to 2014 (Conference)
Harris, S.; Gross, R.; Watson, H.
2015-02-04
Report Number: SRNL-STI-2015-00047.
Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics
Greene, David L
2010-01-01
U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size, and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested, and similar results are obtained.
A.G. Crook Company
1993-04-01
This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.
Non-Gaussian mode coupling and the statistical cosmological principle
LoVerde, Marilena; Nelson, Elliot; Shandera, Sarah E-mail: eln121@psu.edu
2013-06-01
Local-type primordial non-Gaussianity couples statistics of the curvature perturbation ζ on vastly different physical scales. Because of this coupling, statistics (i.e., the polyspectra) of ζ in our Hubble volume may not be representative of those in the larger universe; that is, they may be biased. The bias depends on the local background value of ζ, which includes contributions from all modes with wavelength k...
Quality control and statistical process control for nuclear analytical measurements
Seymour, R.; Sergent, F.; Clark, W.H.C.; Gleason, G.
1993-12-31
The same driving forces that are making businesses examine quality control of manufacturing processes are making laboratories reevaluate their quality control programs. Increased regulation (accountability), global competitiveness (profitability), and potential for litigation (defensibility) are the principal driving forces behind the development and implementation of QA/QC programs in the nuclear analytical laboratory. Both manufacturing and scientific quality control can use identical statistical methods, albeit with some differences in the treatment of the measured data. Today, the approaches to QC programs are quite different for most analytical laboratories as compared with manufacturing sciences. This is unfortunate because the statistical process control methods are directly applicable to measurement processes. It is shown that statistical process control methods can provide many benefits for laboratory QC data treatment.
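As an illustrative aside, the core of a statistical-process-control check is a control chart. A minimal sketch, assuming a small set of invented QC measurements of a check standard:

```python
import statistics

# Hypothetical QC measurements of a check standard (values invented).
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.8]

center = statistics.mean(measurements)
s = statistics.stdev(measurements)

# Shewhart-style control limits: points outside mean +/- 3 sigma signal that
# the measurement process may be out of statistical control.
ucl = center + 3 * s
lcl = center - 3 * s
out_of_control = [x for x in measurements if not lcl <= x <= ucl]
```

This is exactly the kind of treatment that transfers unchanged between manufacturing QC and laboratory measurement QC, which is the abstract's point.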
Fishbone, L. G.; Moussalli, G.; Naegele, G.
1995-05-01
An approach using short-notice random inspections (SNRIs) for inventory-change verification can enhance the effectiveness and efficiency of international safeguards at natural or low-enriched uranium (LEU) fuel fabrication plants. Under this approach, the plant operator declares the contents of nuclear material items before knowing whether an inspection will occur to verify them. Additionally, items about which declarations are newly made should remain available for verification for an agreed time. A statistical inference can then be made from verification results for items verified during SNRIs to the entire populations, i.e., the entire strata, even if inspectors were not present when many items were received or produced. A six-month field test of the feasibility of such SNRIs took place at the Westinghouse Electric Corporation Commercial Nuclear Fuel Division during 1993. Westinghouse personnel made daily declarations about both feed and product items, uranium hexafluoride cylinders and finished fuel assemblies, using a custom-designed computer "mailbox". Safeguards inspectors from the IAEA conducted eight SNRIs to verify these declarations. They arrived unannounced at the plant, in most cases immediately after travel from Canada, where the IAEA maintains a regional office. Items from both strata were verified during the SNRIs by means of nondestructive assay equipment.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Natural Gas Monthly, April 2016. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States...
Rhapsody: I. Structural Properties and Formation History from a Statistical Sample of Re-simulated Cluster-size Halos (Journal Article)
Wu, Hao-Yi (KIPAC/SLAC/Michigan U.); Hahn, Oliver; Wechsler, Risa H.; Mao, Yao-Yuan; Behroozi, Peter S.
Harlim, John; Mahdi, Adam; Majda, Andrew J.
2014-01-15
A central issue in contemporary science is the development of nonlinear, data-driven, statistical-dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics-constrained nonlinear regression models was developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, and the model and observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east-west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet, with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three-dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skewed non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model.
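A single analysis step of an ensemble Kalman filter, the building block of the algorithm described above, can be sketched for a scalar state. All values below are invented for illustration; the paper's filter estimates many states and model coefficients jointly:

```python
import random
import statistics

random.seed(1)
# Forecast ensemble for a single scalar state variable (invented setup).
forecast = [random.gauss(0.0, 1.0) for _ in range(500)]
obs, obs_var = 2.0, 0.5  # noisy observation and its error variance

# One perturbed-observation ensemble Kalman analysis step:
# gain K = P / (P + R), with P the forecast ensemble variance.
P = statistics.variance(forecast)
K = P / (P + obs_var)
analysis = [x + K * ((obs + random.gauss(0.0, obs_var ** 0.5)) - x)
            for x in forecast]
```

The analysis ensemble mean moves toward the observation and its spread shrinks, which is the essential filtering behavior the abstract relies on.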
LANSCE | Materials Test Station
Materials Test Station: the Preferred Alternative. When completed, the Materials Test Station...
Energy Science and Technology Software Center (OSTI)
Automated Nuclear Data Test Suite (002854MLTPL00): file:///usr/gapps/CNP_src/us/RR/test_suite_cz/cnp_test_suite
SLAC Accelerator Test Facilities
Financial statistics of major publicly owned electric utilities, 1991
Not Available
1993-03-31
The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles: Preprint
Prohaska, R.; Duran, A.; Ragatz, A.; Kelly, K.
2015-05-01
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy's (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real-world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL's) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. The methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed.
Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to medium duty EVs.
Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro
2015-05-01
This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
Kobilarov, R. G.
2014-11-18
Statistical analysis of the data set consisting of the activity concentrations of {sup 137}Cs in soils in the Bansko-Razlog region is carried out in order to establish the dependence of the deposition and migration of {sup 137}Cs on the soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the activity of {sup 137}Cs in soils were observed. After reduction of the effects of outliers, the data set is divided into two parts, depending on the soil type. Tests of normality of the two new data sets show that they have normal distributions. An ordinary kriging technique is used to characterize the spatial distribution of the activity of {sup 137}Cs over an area covering 40 km{sup 2} (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of {sup 137}Cs) can be used as a reference point for future studies on the assessment of radiological risk to the population and the erosion of soils in the study area.
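As an aside, the positive-skewness diagnostic described above can be sketched in a few lines; the data here are simulated lognormal values standing in for, not reproducing, the measured {sup 137}Cs concentrations:

```python
import math
import random
import statistics

random.seed(2)
# Simulated positively skewed "activity" values (lognormal; invented data).
data = [math.exp(random.gauss(0.0, 0.5)) for _ in range(2000)]

def skewness(xs):
    """Sample skewness (third standardized moment); > 0 means a right tail."""
    m = statistics.mean(xs)
    s = statistics.stdev(xs)
    return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

skew_raw = skewness(data)                          # clearly positive
skew_log = skewness([math.log(x) for x in data])   # near zero after log transform
```

The same pattern motivates splitting or transforming a skewed data set before applying methods, such as kriging, that behave best on near-normal data.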
Statistical scaling of geometric characteristics in stochastically generated pore microstructures
Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee
2015-05-21
In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter-scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order-q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (a linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
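The order-q structure functions mentioned above can be illustrated with a minimal sketch on a synthetic series (a random walk, Hurst coefficient 1/2) rather than the paper's porosity fields:

```python
import random

random.seed(3)
# A 1-D random walk as a stand-in "spatial series" (invented data).
steps = 20000
walk = [0.0]
for _ in range(steps):
    walk.append(walk[-1] + random.gauss(0.0, 1.0))

def structure_function(xs, q, lag):
    """Order-q sample structure function: mean |increment|^q at the given lag."""
    incs = [abs(xs[i + lag] - xs[i]) for i in range(len(xs) - lag)]
    return sum(v ** q for v in incs) / len(incs)

# Power-law scaling S_2(lag) ~ lag^(2H); for a random walk H = 1/2,
# so S_2 at lag 4 should be about 4 times S_2 at lag 1.
s2_1 = structure_function(walk, 2, 1)
s2_4 = structure_function(walk, 2, 4)
```

Fitting log S_q against log(lag) over a range of lags, as the paper does for Φ and SSA, recovers the scaling exponent (here 2H = 1).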
Feature-Based Statistical Analysis of Combustion Simulation Data
Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
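One of the global diagnostics mentioned above, the empirical cumulative distribution function, reduces to a few lines; the per-feature values below are invented for the example:

```python
# Per-feature values (invented) and the empirical cumulative distribution
# function used as a global diagnostic over all features.
values = [3.1, 1.4, 2.7, 0.9, 2.2, 1.8, 3.5, 2.9]

def empirical_cdf(xs):
    """Sorted values paired with P(X <= x) estimated from the sample."""
    srt = sorted(xs)
    n = len(srt)
    return [(x, (i + 1) / n) for i, x in enumerate(srt)]

cdf = empirical_cdf(values)
```

In the framework described above the same computation runs over per-feature statistics pulled from the pre-computed merge-tree meta-data rather than raw simulation output.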
Industry Test Stand Experience
Stephen Hess, EPRI; Heather Feldman, EPRI; Brenden Mervin
Office of Energy Efficiency and Renewable Energy (EERE)
Control of Test Conduct, Revision 1, effective June 2008
Pratt Whitney Rocketdyne Testing
High Statistics Study of Nearby Type 1a Supernovae. QUEST Camera Short Term Maintenance: Final Technical Report
Office of Scientific and Technical Information (OSTI)
Starkov, V. N.; Semenov, A. A.; Gomonay, H. V.
2009-07-15
We demonstrate a practical possibility of loss compensation in measured photocounting statistics in the presence of dark counts and background radiation noise. It is shown that satisfactory results are obtained even in the case of low detection efficiency and large experimental errors.
Summary Statistics for Homemade "Play Dough" Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Martz, A; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough{trademark}-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU{sub D} at 100kVp to a low of about 1200 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 10. LLNL prepared about 50mL of the homemade "Play Dough" in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images.
(A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first order statistics;' those of the gradient image, 'second order statistics.'
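The first- and second-order statistics described above can be sketched on a toy image; the values below are random numbers that stand in for, and do not reproduce, the measured LAC data:

```python
import random
import statistics

random.seed(4)
# Toy 2-D "image" of attenuation-like values (invented data).
rows, cols = 32, 32
image = [[random.gauss(100.0, 10.0) for _ in range(cols)] for _ in range(rows)]

# Digital gradient image as described above: absolute difference between the
# image and the same image offset by one voxel horizontally.
gradient = [[abs(row[c + 1] - row[c]) for c in range(cols - 1)]
            for row in image]

def stats_of(img):
    """Mean and standard deviation over all voxels."""
    flat = [v for r in img for v in r]
    return statistics.mean(flat), statistics.stdev(flat)

mean1, std1 = stats_of(image)     # first-order statistics
mean2, std2 = stats_of(gradient)  # second-order statistics
```

The reports additionally compute an entropy from a kernel density estimate of the voxel-value distribution; the mean/standard-deviation part above follows the same first-order/second-order pattern.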
Summary Statistics for Fun Dough Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a Play Dough™-like product, Fun Dough™, designated as PD. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2100 LMHU_D at 100 kVp to a low of about 1100 LMHU_D at 300 kVp. The standard deviation of each measurement is around 1% of the mean. The entropy covers the range from 3.9 to 4.6. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the effective atomic number, Z_eff, to be near 8.5. LLNL prepared about 50 mL of the Fun Dough™ in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. Even so, layers can plainly be seen in the reconstructed images, indicating that the bulk density of the material in the container is affected by voids and bubbles. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in data capture and image reconstruction are given in this report; additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those containing voids and inclusions, are included in the statistics.
We then calculated the mean value, standard deviation, and entropy for (a) the four image segments and (b) their digital gradient images. (A digital gradient image was obtained by taking the absolute value of the difference between the initial image and the same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics.'
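The first- and second-order statistics described above can be sketched with NumPy. This is an illustrative computation on a synthetic Gaussian LAC slice (the mean of 2100 and ~1% standard deviation echo the abstract; the actual specimen data are not available here), with a histogram-based Shannon entropy standing in for the Gaussian KDE estimate:

```python
import numpy as np

def image_stats(img, bins=64):
    """Mean, standard deviation, and histogram-based Shannon entropy (bits)."""
    counts, _ = np.histogram(img, bins=bins)
    p = counts[counts > 0] / img.size
    return img.mean(), img.std(), -np.sum(p * np.log2(p))

def gradient_image(img):
    """Absolute difference between the image and itself offset by one voxel
    horizontally, i.e. parallel to the rows of the detector array."""
    return np.abs(img[:, 1:] - img[:, :-1])

rng = np.random.default_rng(0)
lac = rng.normal(2100.0, 21.0, size=(256, 256))   # synthetic LAC slice

first_order = image_stats(lac)                    # stats of the image itself
second_order = image_stats(gradient_image(lac))   # stats of its gradient
```

For uncorrelated voxels the gradient mean is about 2σ/√π ≈ 1.13σ; real specimens with layers and voids would show structure in the second-order statistics that this synthetic slice lacks.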
Delande, D.; Gay, J.C.
1986-10-20
The transition to chaos in the hydrogen atom in a magnetic field is numerically studied and shown to lead to a well-defined signature in the energy-level fluctuations. Upon an increase in the energy, the calculated statistics evolve from Poisson to Gaussian orthogonal ensemble according to the regular or chaotic character of the classical motion. Several methods are employed to test the generic nature of these distributions.
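The Poisson-to-GOE evolution of level statistics can be illustrated numerically. The sketch below uses standard random-matrix numerics (not the paper's hydrogen-atom calculation) to compare nearest-neighbour spacing distributions: GOE spectra show level repulsion, i.e. a deficit of small spacings, while uncorrelated Poisson levels do not:

```python
import numpy as np

rng = np.random.default_rng(1)

def goe_spacings(n=400, trials=20):
    """Nearest-neighbour spacings from the bulk of GOE spectra,
    normalised to unit mean (crude local unfolding)."""
    out = []
    for _ in range(trials):
        a = rng.normal(size=(n, n))
        h = (a + a.T) / 2.0                    # real symmetric matrix (GOE)
        ev = np.linalg.eigvalsh(h)
        s = np.diff(ev[n // 4 : 3 * n // 4])   # middle half of the spectrum
        out.append(s / s.mean())
    return np.concatenate(out)

def poisson_spacings(n=4000):
    """Spacings of uncorrelated (Poisson) levels, unit mean."""
    s = np.diff(np.sort(rng.uniform(size=n)))
    return s / s.mean()

goe = goe_spacings()
poi = poisson_spacings()
# Level repulsion: small spacings are rare for GOE, common for Poisson
frac_small_goe = np.mean(goe < 0.2)
frac_small_poi = np.mean(poi < 0.2)
```

The Wigner surmise predicts P(s < 0.2) ≈ 0.03 for GOE versus ≈ 0.18 for Poisson, so the two regimes are easy to tell apart even with modest statistics.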
Complex statistics and diffusion in nonlinear disordered particle chains
Antonopoulos, Ch. G.; Bountis, T.; Skokos, Ch.; Drossos, L.
2014-06-15
We investigate dynamically and statistically the diffusive motion in a Klein-Gordon particle chain in the presence of disorder. In particular, we examine a low-energy (subdiffusive) and a higher-energy (self-trapping) case and verify that subdiffusive spreading is always observed. We then carry out a statistical analysis of the motion in both cases, in the sense of the Central Limit Theorem, and present evidence of different chaotic behaviors for various groups of particles. Integrating the equations of motion for times as long as 10^9, we find that our probability distribution functions always tend to Gaussians, and we show that the dynamics does not relax onto a quasi-periodic Kolmogorov-Arnold-Moser torus and that diffusion continues to spread chaotically for arbitrarily long times.
Lifetime statistics of quantum chaos studied by a multiscale analysis
Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)
2012-04-30
In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.
U.S. Department of Commerce Economics and Statistics Administration
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Commerce Economics and Statistics Administration. [Chart: women hold 48% of all jobs but only 24% of STEM jobs; men hold 52% and 76%, respectively.] By David Beede, Tiffany Julian, David Langdon, George McKittrick, Beethika Khan, and Mark Doms, Office of the Chief Economist. Women in STEM: A Gender Gap to Innovation, August 2011, Executive Summary, ESA Issue Brief #04-11. Our science, technology, engineering and math (STEM) workforce is crucial to America's innovative capacity and global competitiveness. Yet women are vastly
Financial statistics of selected investor-owned electric utilities, 1989
Not Available
1991-01-01
The Financial Statistics of Selected Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.
Spatial statistics for predicting flow through a rock fracture
Coakley, K.J.
1989-03-01
Fluid flow through a single rock fracture depends on the shape of the space between the upper and lower pieces of rock which define the fracture. In this thesis, the normalized flow through a fracture, i.e., the equivalent permeability of a fracture, is predicted in terms of spatial statistics computed from the arrangement of voids, i.e., open spaces, and contact areas within the fracture. Patterns of voids and contact areas, with complexity typical of experimental data, are simulated by clipping a correlated Gaussian process defined on an N by N pixel square region. The voids have constant aperture; the distance between the upper and lower surfaces which define the fracture is either zero or a constant. Local flow is assumed to be proportional to local aperture cubed times local pressure gradient. The flow through a pattern of voids and contact areas is solved using a finite-difference method. After solving for the flow through simulated 10 by 10 by 30 pixel patterns of voids and contact areas, a model to predict equivalent permeability is developed. The first model is for patterns with 80% voids, where all voids have the same aperture. The equivalent permeability of a pattern is predicted in terms of spatial statistics computed from the arrangement of voids and contact areas within the pattern. Four spatial statistics are examined. The change point statistic measures how often adjacent pixels alternate from void to contact area (or vice versa) in the rows of the patterns which are parallel to the overall flow direction. 37 refs., 66 figs., 41 tabs.
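A minimal sketch of the pattern generation and the change point statistic described above, assuming a moving-average smoothing of white noise as the correlated Gaussian process (the thesis's exact covariance model is not specified here):

```python
import numpy as np

rng = np.random.default_rng(2)

def correlated_field(n=128, w=5):
    """Correlated Gaussian field: white noise smoothed by a w x w moving average."""
    z = rng.normal(size=(n + w - 1, n + w - 1))
    k = np.ones(w) / w
    z = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 0, z)
    z = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, z)
    return z

def clip_to_voids(field, void_fraction=0.8):
    """Clip the field so `void_fraction` of the pixels are voids (True)."""
    thresh = np.quantile(field, 1.0 - void_fraction)
    return field > thresh

def change_point_statistic(pattern):
    """Fraction of adjacent pixel pairs, along rows (the overall flow
    direction), that alternate between void and contact area."""
    flips = pattern[:, 1:] != pattern[:, :-1]
    return flips.mean()

field = correlated_field()
pattern = clip_to_voids(field, 0.8)
cps = change_point_statistic(pattern)
```

For independent pixels the expected change point statistic is 2pq = 0.32 at 80% voids; spatial correlation pushes it well below that, which is exactly the information the statistic carries about the void geometry.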
ARSCL Cloud Statistics - A Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ARSCL Cloud Statistics - A Value-Added Product. Y. Shi, Pacific Northwest National Laboratory, Richland, Washington; M. A. Miller, Brookhaven National Laboratory, Upton, New York. Introduction: The Active Remote Sensing of Cloud Layers (ARSCL) value-added product (VAP) combines data from active remote sensors to produce an objective determination of cloud location, radar reflectivity, vertical velocity, and Doppler spectral width. Information about the liquid water path (LWP) in these clouds and the
Spectral statistics in noninteracting many-particle systems
Munoz, L.; Relano, A.; Retamosa, J. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Faleiro, E. [Departamento de Fisica Aplicada, E.U.I.T. Industrial, Universidad Politecnica de Madrid, E-28012 Madrid (Spain); Molina, R.A. [Max-Planck-Institut fuer Physik Komplexer Systeme, Noethnitzer Strasse 38, D-01187 Dresden (Germany)
2006-03-15
It is widely accepted that the statistical properties of energy level spectra provide an essential characterization of quantum chaos. Indeed, the spectral fluctuations of many different systems, like quantum billiards, atoms, or atomic nuclei, have been studied. However, noninteracting many-body systems have received little attention, since it is assumed that they must exhibit Poisson-like fluctuations. Apart from a heuristic argument of Bloch, there are neither systematic numerical calculations nor a rigorous derivation of this fact. Here we present a rigorous study of the spectral fluctuations of noninteracting identical particles moving freely in a mean field, emphasizing the evolution with the number of particles N as well as with the energy. Our results are conclusive. For N ≥ 2 the spectra of these systems exhibit Poisson fluctuations provided that we consider sufficiently high excitation energies. Nevertheless, when the mean field is chaotic there exists a critical energy scale L_c; beyond this scale, the fluctuations deviate from Poisson statistics as a reminiscence of the statistical properties of the mean field.
Electron transfer statistics and thermal fluctuations in molecular junctions
Goswami, Himangshu Prabal; Harbola, Upendra
2015-02-28
We derive analytical expressions for the probability distribution function (PDF) for electron transport in a simple model of a quantum junction in the presence of thermal fluctuations. Our approach is based on large deviation theory combined with the generating function method. For a large number of electrons transferred, the PDF is found to decay exponentially in the tails, with different rates due to the applied bias. This asymmetry in the PDF is related to the fluctuation theorem. Statistics of fluctuations are analyzed in terms of the Fano factor. Thermal fluctuations play a quantitative role in determining the statistics of electron transfer: they tend to suppress the average current while enhancing the fluctuations in particle transfer. This gives rise to both bunching and antibunching phenomena, as determined by the Fano factor. The thermal fluctuations and shot noise compete with each other and determine the net (effective) statistics of particle transfer. An exact analytical expression is obtained for the delay time distribution. The optimal values of the delay time between successive electron transfers can be lowered below the corresponding shot-noise values by tuning the thermal effects.
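The Fano factor analysis can be illustrated with a toy renewal model of electron transfer (not the paper's quantum junction): gamma-distributed delay times between transfers regularise the stream and drive the Fano factor below the Poissonian value of 1, i.e. antibunching:

```python
import numpy as np

rng = np.random.default_rng(3)

def transfer_counts(shape, rate=1.0, t_window=200.0, trials=2000):
    """Number of transfer events in a time window, with i.i.d. gamma-distributed
    delay times between successive transfers (shape=1 recovers a Poisson process)."""
    mean_wait = shape / rate
    n_max = int(3 * t_window / mean_wait) + 20
    waits = rng.gamma(shape, 1.0 / rate, size=(trials, n_max))
    arrival = np.cumsum(waits, axis=1)
    return (arrival < t_window).sum(axis=1)

def fano(counts):
    """Fano factor: variance over mean of the transferred-electron number."""
    return counts.var() / counts.mean()

f_poisson = fano(transfer_counts(shape=1.0))   # uncorrelated transfers, F ~ 1
f_sub = fano(transfer_counts(shape=4.0))       # regularised delays, F ~ 1/4
```

For a gamma renewal process the asymptotic Fano factor is 1/shape, so tuning the delay-time distribution moves the statistics between bunched and antibunched regimes, in the spirit of the thermal tuning discussed above.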
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced radiographic lung injury following stereotactic body radiotherapy (SBRT) and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a Gaussian-mixture assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for this distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator of a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
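A minimal EM fit of a two-component 1-D Gaussian mixture, of the kind used above, can be written directly in NumPy. The synthetic dose samples below assume a hypothetical 50 Gy prescription with modes at roughly 70% and 107% of it, echoing the abstract's numbers; they are not patient data:

```python
import numpy as np

rng = np.random.default_rng(4)

def em_gmm_1d(x, n_iter=200):
    """Minimal EM for a two-component 1-D Gaussian mixture."""
    mu = np.array([x.min(), x.max()], dtype=float)   # spread initial means apart
    sig = np.full(2, x.std())
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        d = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
        r = w * d
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sig

# Synthetic dose samples (Gy): low mode near 70% and high mode near 107%
# of a 50 Gy prescription -- illustrative values only
dose = np.concatenate([rng.normal(35.0, 4.0, 3000), rng.normal(53.5, 3.0, 2000)])
w, mu, sig = em_gmm_1d(dose)
lo, hi = sorted(mu)
```

The recovered lower-component mean plays the role of the candidate critical dose; on real data one would also compare the fitted mixture to the empirical histogram, e.g. via the Kullback-Leibler divergence as in the study.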
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
Gross, R; Stephen Harris, S
2009-02-18
The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, the National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut'; industry defines it as the valve opening at greater than or equal to 1.5 times the cold set pressure. Actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging well enough to estimate risk when basing proof test intervals on proof test results?
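The Monte Carlo approach can be sketched as follows. The drift model and every parameter value here are hypothetical placeholders, not the paper's fitted aging model; only the structure (a population of valves whose lift ratio drifts with service time, with failure at lift ≥ 1.5× set pressure) mirrors the description above:

```python
import numpy as np

rng = np.random.default_rng(5)

def stuck_shut_probability(years, n=100_000):
    """Monte Carlo estimate of the probability that a valve's lift pressure
    exceeds 1.5x its set pressure after `years` in service.

    Aging model (hypothetical parameters, for illustration only):
    lift ratio = 1 + drift * years + noise, with the per-year drift
    lognormally distributed across the valve population."""
    drift = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n)  # per-year drift
    noise = rng.normal(0.0, 0.05, size=n)                        # proof-test scatter
    lift_ratio = 1.0 + drift * years + noise
    return np.mean(lift_ratio >= 1.5)

p4 = stuck_shut_probability(4.0)   # e.g. a 4-year proof-test interval
p8 = stuck_shut_probability(8.0)   # a longer interval raises the risk
```

With censored field data, the drift and scatter distributions would be fitted to the proof-test records rather than assumed, but the simulation loop is the same.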
Visual Sample Plan (VSP) Statistical Software as Related to the CTBTOs On-Site Inspection Procedure
Pulsipher, Trenton C.; Walsh, Stephen J.; Pulsipher, Brent A.; Milbrath, Brian D.
2010-09-01
In the event of a potential nuclear weapons test, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is commissioned to conduct an on-site inspection (OSI) of the suspected test site in an effort to find confirmatory evidence of the nuclear test. The OSI activities include collecting air, surface soil, and underground samples to search for indications of a nuclear weapons test; these indicators include radionuclides, among them radioactive isotopes of Ar and Xe. This report investigates the capability of the Visual Sample Plan (VSP) software to contribute to the sampling activities of the CTBTO during an OSI. VSP is a statistical sampling design software package, constructed under data quality objectives, which has been adapted for environmental remediation and contamination detection problems for the EPA, US Army, DoD, and DHS, among others. This report provides discussion of a number of VSP sample designs which may be pertinent to the work undertaken during an OSI. Examples and descriptions of such designs include hot spot sampling, combined random and judgment sampling, multiple increment sampling, radiological transect surveying, and a brief description of other potentially applicable sampling methods. Further, this work highlights a potential need for the use of statistically based sample designs in OSI activities. The use of such designs may enable canvassing a sample area without full sampling, provide a measure of confidence that radionuclides are not present, and allow investigators to refocus resources in other areas of concern.
Vendor System Vulnerability Testing Test Plan
James R. Davidson
2005-01-01
The Idaho National Laboratory (INL) prepared this generic test plan to provide clients (vendors, end users, program sponsors, etc.) with a sense of the scope and depth of vulnerability testing performed at the INL's Supervisory Control and Data Acquisition (SCADA) Test Bed and to serve as an example of such a plan. Although this test plan specifically addresses vulnerability testing of systems applied to the energy sector (electric power transmission and distribution and oil and gas systems), it is generic enough to be applied to control systems used in other critical infrastructures, such as the transportation sector, water/wastewater sector, or hazardous chemical production facilities. The SCADA Test Bed is established at the INL as a testing environment to evaluate the security vulnerabilities of SCADA systems, energy management systems (EMS), and distributed control systems. It now supports multiple programs sponsored by the U.S. Department of Energy, the U.S. Department of Homeland Security, other government agencies, and private sector clients. This particular test plan applies to testing conducted on a SCADA/EMS provided by a vendor. Before performing detailed vulnerability testing of a SCADA/EMS, an as-delivered baseline examination of the system is conducted to establish a starting point for all subsequent testing. The series of baseline tests documents factory-delivered defaults, system configuration, and potential configuration changes to aid in the development of a security plan for in-depth vulnerability testing. The baseline test document is provided to the System Provider,(a) who evaluates the baseline report and provides recommendations on the system configuration to enhance the security profile of the baseline system. Vulnerability testing is then conducted at the SCADA Test Bed, which provides an in-depth security analysis of the Vendor's system.(b)
(a) The term System Provider replaces the name of the company/organization providing the system being evaluated. This can be the system manufacturer, a system user, or a third-party organization such as a government agency. (b) The term Vendor's System replaces the name of the specific SCADA/EMS being tested.
ZEST flight test experiments, Kauai Test Facility, Hawaii. Test report
Cenkci, M.J.
1991-07-01
The Strategic Defense Initiative Organization (SDIO) is proposing to execute two ZEST flight experiments to obtain information related to the following objectives: validation of payload modeling; characterization of a high energy release cloud; and documentation of scientific phenomena that may occur as a result of releasing a high energy cloud. The proposed action is to design, develop, launch, and detonate two payloads carrying high energy explosives. Activities required to support this proposal include: (1) execution of component assembly tests at Space Data Division (SDD) in Chandler, Arizona and Los Alamos National Laboratory (LANL) in Los Alamos, New Mexico, and (2) execution of pre-flight flight test activities at Kauai Test Facility.
EIA - Advice from Meetings of the ASA Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Advice from Meetings of the ASA Committee on Energy Statistics Transcripts and Summaries from the American Statistical Association Committee on Energy Statistics The U.S. Energy Information Administration seeks technical advice semi-annually from the American Statistical Association Committee on Energy Statistics. The meetings are held in the spring and fall in Washington, D.C., and are announced in the Federal Register. These meetings are open to the public and are typically held on Thursday
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D
2014-06-01
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without an immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC), and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC, and LLE measure respiratory signal complexity; the nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window; a longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
Dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear-based methodology, which reflects the characteristics of respiration, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
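Two of the nonlinear measures used above, delay embedding and sample entropy, can be sketched compactly. The signals here are synthetic (a sine wave versus white noise, standing in for regular and irregular breathing); the tolerance r = 0.2σ and template length m = 2 are common defaults, not values taken from the study:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate embedding: rows are vectors (x[i], x[i+tau], ...)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r): -log of the ratio of (m+1)-point to
    m-point template matches within tolerance r (Chebyshev distance)."""
    r = r_frac * np.std(x)
    def matches(dim):
        emb = delay_embed(x, dim, 1)
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (d <= r).sum() - len(emb)       # exclude self-matches
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(6)
t = np.linspace(0, 20 * np.pi, 500)
regular = np.sin(t)                            # highly predictable signal
noisy = rng.normal(size=500)                   # unpredictable signal
se_regular = sample_entropy(regular)
se_noisy = sample_entropy(noisy)
```

A regular signal yields many (m+1)-point matches for every m-point match, hence low entropy; white noise loses most matches when the template is extended, hence high entropy. The same embedding is the starting point for Lyapunov-exponent estimates.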
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit; the distribution providing the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by Pearson VI, Pearson V, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, Pearson VI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
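The chi-square ranking procedure can be sketched with two candidate distributions (normal versus lognormal) fitted by moment estimates to synthetic lineament lengths; the data, bin count, and candidate set are illustrative, not the study's twenty-five hypothesized distributions:

```python
import numpy as np
from math import erf, log, sqrt

rng = np.random.default_rng(7)

def norm_cdf(v, mu, sig):
    return 0.5 * (1.0 + erf((v - mu) / (sig * sqrt(2.0))))

def lognorm_cdf(v, mu, sig):
    return 0.0 if v <= 0.0 else norm_cdf(log(v), mu, sig)

def chi_square(data, cdf, bins=12):
    """Chi-square goodness-of-fit statistic over equal-count bins."""
    edges = np.quantile(data, np.linspace(0.0, 1.0, bins + 1))
    edges[0] -= 10.0 * data.std()      # widen the outer bins so no
    edges[-1] += 10.0 * data.std()     # observation falls outside
    observed = np.histogram(data, bins=edges)[0]
    cum = np.array([cdf(e) for e in edges])
    expected = len(data) * np.diff(cum)
    return float(np.sum((observed - expected) ** 2 / expected))

# Synthetic lineament lengths (km), drawn lognormal as the study found
lengths = rng.lognormal(mean=3.0, sigma=0.6, size=2000)

# Fit each candidate by moment / log-moment estimates, then rank by chi-square
chi_normal = chi_square(lengths, lambda v: norm_cdf(v, lengths.mean(), lengths.std()))
chi_lognormal = chi_square(
    lengths, lambda v: lognorm_cdf(v, np.log(lengths).mean(), np.log(lengths).std()))
best_fit = "lognormal" if chi_lognormal < chi_normal else "normal"
```

In the full procedure, each of the candidate families would be fitted (ideally by maximum likelihood) and the family with the smallest chi-square statistic declared the best fit, exactly as described above.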
Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
Duran, A.; Walkowicz, K.
2013-10-01
In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.
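A few of the duty-cycle metrics such an analysis computes can be sketched from a 1 Hz speed trace. The trace and the specific metric set below are illustrative, not NREL's actual cycle data or metric definitions:

```python
import numpy as np

def duty_cycle_metrics(speed_mps, dt=1.0):
    """Basic duty-cycle metrics from a 1 Hz vehicle speed trace (m/s)."""
    dist_km = np.sum(speed_mps) * dt / 1000.0
    moving = speed_mps > 0.5                   # crude moving/idle split
    accel = np.diff(speed_mps) / dt
    # A "stop" is a transition from moving to not moving
    stops = np.sum(moving[:-1] & ~moving[1:])
    return {
        "distance_km": dist_km,
        "avg_moving_speed_mps": speed_mps[moving].mean(),
        "idle_fraction": 1.0 - moving.mean(),
        "stops_per_km": stops / dist_km,
        "max_accel_mps2": accel.max(),
    }

# Synthetic school-bus-like trace: accelerate / cruise / brake / dwell
cycle = np.concatenate([np.linspace(0, 10, 20), np.full(60, 10.0),
                        np.linspace(10, 0, 20), np.zeros(30)])
trace = np.tile(cycle, 12)                     # twelve stop-to-stop segments
m = duty_cycle_metrics(trace)
```

Comparing such metrics (stops per km, idle fraction, acceleration statistics) across fleets is what drives the selection of a representative chassis dynamometer test cycle for each operating condition.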
The Fall Meeting of the Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
FRIDAY, NOVEMBER 5, 1999. The Fall Meeting of the Committee on Energy Statistics commenced at 8:30 a.m. at the Department of Energy, 1000 Independence Avenue, S.W., Room 8E089, Washington, D.C., Daniel Relles presiding. PRESENT: Daniel Relles (Chairman), Jay Breidt, Lynda Carlson, Thomas Cowing, Carol Gotway Crawford, Jay Hakes, James Hammitt, Philip Hanser, Calvin Kent, W. David Montgomery, Larry Pettis, Seymour Sudman, Bill Weinig, Roy Whitmore. Opening: Addressing Accuracy in
Statistical anisotropy of the curvature perturbation from vector field perturbations
Dimopoulos, Konstantinos; Karciauskas, Mindaugas; Lyth, David H.; Rodriguez, Yeinzon E-mail: m.karciauskas@lancaster.ac.uk E-mail: yeinzon.rodriguez@uan.edu.co
2009-05-15
The δN formula for the primordial curvature perturbation ζ is extended to include vector as well as scalar fields. Formulas for the tree-level contributions to the spectrum and bispectrum of ζ are given, exhibiting statistical anisotropy. The one-loop contribution to the spectrum of ζ is also worked out. We then consider the generation of vector field perturbations from the vacuum, including the longitudinal component that will be present if there is no gauge invariance. Finally, the δN formula is applied to the vector curvaton and vector inflation models, with the tensor perturbation also evaluated in the latter case.
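The statistical anisotropy that such vector-field contributions induce in the spectrum of ζ is commonly written in the standard quadrupolar parametrization (a widely used form in this literature, not an equation quoted from this abstract):

```latex
\mathcal{P}_{\zeta}(\mathbf{k}) \;=\;
\mathcal{P}_{\zeta}^{\mathrm{iso}}(k)\,
\left[\, 1 + g(k)\,\bigl(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}}\bigr)^{2} \,\right]
```

where n̂ is the preferred direction selected by the vector field and g(k) quantifies the degree of statistical anisotropy; an isotropic spectrum corresponds to g = 0.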
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
U.S. Department of Energy, Washington, DC 20585, 2013 (Natural Gas Monthly). This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
U.S. Department of Energy, Washington, DC 20585, 2014 (Natural Gas Monthly). This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the
Advances on statistical/thermodynamical models for unpolarized structure functions
Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro
2013-03-25
During the eighties and nineties, many statistical/thermodynamical models were proposed to describe the nucleon structure functions and the distribution of quarks in hadrons. Most of these models describe the constituent quarks and gluons inside the nucleon as a Fermi gas and a Bose gas, respectively, confined in an MIT bag with continuous energy levels; other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, such as the sea asymmetries d̄/ū and d̄ − ū.
Statistics at work in heavy-ion reactions
Moretto, L.G.
1982-07-01
In the first part special aspects of the compound nucleus decay are considered. The evaporation of particles intermediate between nucleons and fission fragments is explored both theoretically and experimentally. The limitations of the fission decay width expression obtained with the transition state method are discussed, and a more general approach is proposed. In the second part the process of angular momentum transfer in deep inelastic reactions is considered. The limit of statistical equilibrium is studied and specifically applied to the estimation of the degree of alignment of the fragment spins. The magnitude and alignment of the transferred angular momentum is experimentally determined from sequentially emitted alpha, gamma, and fission fragments.
Statistical Software for spatial analysis of stratigraphic data sets
Energy Science and Technology Software Center (OSTI)
2003-04-08
Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than are currently available in other packages. These include techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.
Poincaré recurrence statistics as an indicator of chaos synchronization
Boev, Yaroslav I.; Vadivasova, Tatiana E.; Anishchenko, Vadim S.
2014-06-15
The dynamics of the autonomous and non-autonomous Rössler system is studied using the Poincaré recurrence time statistics. It is shown that the probability distribution density of Poincaré recurrences represents a set of equidistant peaks with a spacing equal to the oscillation period, while the envelope obeys an exponential distribution. The dimension of the spatially uniform Rössler attractor is estimated using Poincaré recurrence times. The mean Poincaré recurrence time in the non-autonomous Rössler system is locked by the external frequency, and this enables us to detect the effect of phase-frequency synchronization.
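As a toy illustration of recurrence-time statistics (a strictly periodic signal, not the Rössler system itself), the sketch below records returns of the state of x(t) = sin(2πt/T) to an ε-neighborhood of its initial point; for a periodic orbit the recurrence times lock to the period T. All parameters here are invented for the demonstration.

```python
import math

T = 5.0      # oscillation period
DT = 0.001   # sampling step
EPS = 0.05   # neighborhood half-width in each state component

def recurrence_times(t_max=50.0):
    """Times between successive returns to a neighborhood of the t=0 state."""
    x0, v0 = 0.0, 1.0          # state at t = 0: (sin, cos)
    times, last = [], 0.0
    for n in range(1, int(t_max / DT)):
        t = n * DT
        x = math.sin(2 * math.pi * t / T)
        v = math.cos(2 * math.pi * t / T)
        # record a return only once per pass through the neighborhood
        if abs(x - x0) < EPS and abs(v - v0) < EPS and t - last > T / 2:
            times.append(t - last)
            last = t
    return times

rts = recurrence_times()
mean_rt = sum(rts) / len(rts)   # close to the period T
```

For a chaotic attractor the same bookkeeping yields a broad distribution of recurrence times rather than a single spike, which is the structure the abstract analyzes.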
Statistical thermodynamics of strain hardening in polycrystalline solids
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Langer, James S.
2015-01-01
This paper starts with a systematic rederivation of the statistical thermodynamic equations of motion for dislocation-mediated plasticity proposed in 2010 by Langer, Bouchbinder, and Lookman. The paper then uses that theory to explain the anomalous rate-hardening behavior reported in 1988 by Follansbee and Kocks and to explore the relation between hardening rate and grain size reported in 1995 by Meyers et al. A central theme is the need for physics-based, nonequilibrium analyses in developing predictive theories of the strength of polycrystalline materials.
Financial statistics of major US publicly owned electric utilities 1993
Not Available
1995-02-01
The 1993 edition of the Financial Statistics of Major U.S. Publicly Owned Electric Utilities publication presents five years (1989 to 1993) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. The primary source of publicly owned financial data is the Form EIA-412, the Annual Report of Public Electric Utilities, filed on a fiscal basis.
Younger Dryas Boundary (YDB) impact: physical and statistical impossibility (Conference)
Office of Scientific and Technical Information (OSTI)
The YDB impact hypothesis of Firestone et al. (2007) is so extremely improbable it can be considered statistically impossible, in addition to being physically impossible. Comets make up only about 1% of the population of Earth-crossing objects. Broken comets are a vanishingly small fraction, and only exist as Earth-sized clusters for a very short period of time. Only a small fraction of impacts occur at angles as shallow as proposed by the YDB
Test report for core drilling ignitability testing
Witwer, K.S.
1996-08-08
Testing was carried out with the cooperation of Westinghouse Hanford Company and the United States Bureau of Mines at the Pittsburgh Research Center in Pennsylvania under Memorandum of Agreement 14-09-0050-3666. Several core drilling equipment items, specifically those which can come in contact with flammable gases while drilling into some waste tanks, were tested under conditions similar to actual field sampling conditions. Rotary drilling against steel and rock, as well as drop testing of several different pieces of equipment in a flammable gas environment, were the specific items addressed. The test items either caused no ignition of the gas mixture or, after hardware changes or modified drilling parameters, produced no ignition in repeat testing.
Hacke, P.; Spataru, S.
2014-08-01
We propose a method for increasing the frequency of data collection and reducing the time and cost of accelerated lifetime testing of photovoltaic modules undergoing potential-induced degradation (PID). This consists of in-situ measurements of dark current-voltage curves of the modules at elevated stress temperature, their use to determine the maximum power at 25 degrees C standard test conditions (STC), and distribution statistics for determining degradation rates as a function of stress level. The semi-continuous data obtained by this method clearly show degradation curves of the maximum power, including an incubation phase, rates and extent of degradation, precise time to failure, and partial recovery. Stress tests were performed on crystalline silicon modules at 85% relative humidity and 60 degrees C, 72 degrees C, and 85 degrees C. Activation energy for the mean time to failure (1% relative) of 0.85 eV was determined and a mean time to failure of 8,000 h at 25 degrees C and 85% relative humidity is predicted. No clear trend in maximum degradation as a function of stress temperature was observed.
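The temperature extrapolation behind such a prediction can be sketched with a standard Arrhenius acceleration factor, using the 0.85 eV activation energy and the 8,000 h use-condition MTTF quoted in the abstract; the function name and the mapping from stress to use conditions are illustrative, not the authors' exact procedure.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K
E_A = 0.85      # activation energy quoted in the abstract, eV

def acceleration_factor(t_stress_c, t_use_c=25.0):
    """Arrhenius acceleration factor between a stress and a use temperature."""
    t_stress = t_stress_c + 273.15
    t_use = t_use_c + 273.15
    return math.exp((E_A / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Predicted mean time to failure at each stress temperature, assuming the
# 8,000 h MTTF at 25 C / 85% RH quoted in the abstract.
mttf_use = 8000.0
for t_c in (60.0, 72.0, 85.0):
    af = acceleration_factor(t_c)
    print(f"{t_c:.0f} C: AF = {af:.1f}, predicted MTTF = {mttf_use / af:.0f} h")
```

Raising the stress temperature shortens the expected test duration by the acceleration factor, which is why the in-situ measurements at 85 C capture a full degradation curve in far less time than a use-condition exposure would.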
Dynamometer Testing (Fact Sheet)
Not Available
2010-11-01
This fact sheet describes the dynamometer and its testing capabilities at the National Wind Technology Center.
Image segmentation by hierarchical agglomeration of polygons using ecological statistics
Prasad, Lakshman; Swaminarayan, Sriram
2013-04-23
A method for rapid hierarchical image segmentation based on perceptually driven contour completion and scene statistics is disclosed. The method begins with an initial fine-scale segmentation of an image, such as obtained by perceptual completion of partial contours into polygonal regions using region-contour correspondences established by Delaunay triangulation of edge pixels as implemented in VISTA. The resulting polygons are analyzed with respect to their size and color/intensity distributions and the structural properties of their boundaries. Statistical estimates of granularity of size, similarity of color, texture, and saliency of intervening boundaries are computed and formulated into logical (Boolean) predicates. The combined satisfiability of these Boolean predicates by a pair of adjacent polygons at a given segmentation level qualifies them for merging into a larger polygon representing a coarser, larger-scale feature of the pixel image and collectively obtains the next level of polygonal segments in a hierarchy of fine-to-coarse segmentations. The iterative application of this process precipitates textured regions as polygons with highly convolved boundaries and helps distinguish them from objects which typically have more regular boundaries. The method yields a multiscale decomposition of an image into constituent features that enjoy a hierarchical relationship with features at finer and coarser scales. This provides a traversable graph structure from which feature content and context in terms of other features can be derived, aiding in automated image understanding tasks. The method disclosed is highly efficient and can be used to decompose and analyze large images.
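A heavily simplified sketch of one agglomeration pass is shown below, assuming toy "polygons" that carry only an area and a mean gray level. Two adjacent regions merge when both Boolean predicates hold: similar color AND at least one fine-grained region. The data structures, predicate forms, and thresholds are invented for illustration and are not those of the patented method.

```python
COLOR_TOL = 10.0   # max mean-intensity difference for "similar color"
GRAIN = 50.0       # area below which a region counts as fine-grained

def should_merge(a, b):
    """Combined satisfiability of the two Boolean predicates."""
    similar = abs(a["color"] - b["color"]) <= COLOR_TOL
    fine = a["area"] < GRAIN or b["area"] < GRAIN
    return similar and fine

def merge(a, b):
    """Area-weighted merge of two regions into a coarser polygon."""
    area = a["area"] + b["area"]
    color = (a["color"] * a["area"] + b["color"] * b["area"]) / area
    return {"area": area, "color": color}

def agglomerate_once(regions, adjacency):
    """One coarsening level: greedily merge qualifying adjacent pairs."""
    merged, used = [], set()
    for i, j in adjacency:
        if i in used or j in used:
            continue
        if should_merge(regions[i], regions[j]):
            merged.append(merge(regions[i], regions[j]))
            used.update((i, j))
    merged.extend(r for k, r in enumerate(regions) if k not in used)
    return merged

regions = [
    {"area": 20.0, "color": 100.0},   # small, similar to its neighbor
    {"area": 30.0, "color": 105.0},
    {"area": 400.0, "color": 200.0},  # large and dissimilar: survives
]
coarser = agglomerate_once(regions, adjacency=[(0, 1), (1, 2)])
```

Iterating such passes until no pair qualifies produces the fine-to-coarse hierarchy the abstract describes, with each level's polygons becoming the nodes of the next.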
Statistics of anisotropies in inflation with spectator vector fields
Thorsrud, Mikjel; Mota, David F.; Urban, Federico R. E-mail: furban@ulb.ac.be
2014-04-01
We study the statistics of the primordial power spectrum in models where massless gauge vectors are coupled to the inflaton, paying special attention to observational implications of having fundamental or effective horizons embedded in a bath of infrared fluctuations. As quantum infrared modes cross the horizon, they classicalize and build a background vector field. We find that the vector experiences a statistical precession phenomenon. Implications for primordial correlators and the interpretation thereof are considered. First, we show how in general two, not only one, additional observables, a quadrupole amplitude and an intrinsic shape parameter, are necessary to fully describe the correction to the curvature power spectrum, and develop a unique parametrization for them. Second, we show that the observed anisotropic amplitude and the associated preferred direction depend on the volume of the patch being probed. We calculate non-zero priors for the expected deviations between detections based on microwave background data (which probes the entire Hubble patch) and large-scale structure (which only probes a fraction of it).
View discovery in OLAP databases through statistical combinatorial optimization
Hengartner, Nick W; Burke, John; Critchlow, Terence; Joslyn, Cliff; Hogan, Emilie
2009-01-01
OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them with a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information-theoretical measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
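One simple instance of scoring views with an information-theoretic measure is to rank each two-dimensional projection of a small cube by the Shannon entropy of its cell counts; a higher-entropy view spreads the records over more structure worth exploring. The records, dimension names, and the use of plain entropy below are all illustrative assumptions, not the paper's hop-chaining measure.

```python
import math
from collections import Counter
from itertools import combinations

# Invented toy records standing in for a portal-monitor data cube.
records = [
    {"port": "LA", "vehicle": "truck", "alarm": "no"},
    {"port": "LA", "vehicle": "truck", "alarm": "no"},
    {"port": "LA", "vehicle": "rail",  "alarm": "yes"},
    {"port": "NY", "vehicle": "truck", "alarm": "no"},
    {"port": "NY", "vehicle": "rail",  "alarm": "yes"},
    {"port": "NY", "vehicle": "rail",  "alarm": "no"},
]

def view_entropy(records, dims):
    """Entropy (bits) of the cell-count distribution of a projected view."""
    counts = Counter(tuple(r[d] for d in dims) for r in records)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

dims = ("port", "vehicle", "alarm")
views = {pair: view_entropy(records, pair) for pair in combinations(dims, 2)}
best_view = max(views, key=views.get)   # candidate 2-D view to show next
```

A guided search like hop-chaining would then fix the chosen view's dimensions into the background filter and repeat the scoring over the remaining dimensions.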
Correlating sampling and intensity statistics in nanoparticle diffraction experiments
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Öztürk, Hande; Yan, Hanfei; Hill, John P.; Noyan, I. Cevdet
2015-07-28
It is shown in a previous article [Öztürk, Yan, Hill & Noyan (2014). J. Appl. Cryst. 47, 1016–1025] that the sampling statistics of diffracting particle populations within a polycrystalline ensemble depended on the size of the constituent crystallites: broad X-ray peak breadths enabled some nano-sized particles to contribute more than one diffraction spot to Debye–Scherrer rings. Here it is shown that the equations proposed by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] (AKK) to link diffracting-particle and diffracted-intensity statistics are not applicable if the constituent crystallites of the powder are below 10 nm. In this size range, (i) the one-to-one correspondence between diffracting particles and Laue spots assumed in the AKK analysis is not satisfied, and (ii) the crystallographic correlation between Laue spots originating from the same grain invalidates the assumption that all diffracting plane normals are randomly oriented and uncorrelated. Such correlation produces unexpected results in the selection of diffracting grains. For example, three or more Laue spots from a given grain for a particular reflection can only be observed at certain wavelengths. In addition, correcting the diffracted intensity values by the traditional Lorentz term, 1/cos θ, to compensate for the variation of particles sampled within a reflection band does not maintain fidelity to the number of poles contributing to the diffracted signal. A new term, cos θ_B/cos θ, corrects this problem.
Deep Resistivity Structure of Yucca Flat, Nevada Test Site, Nevada.
Asch, Theodore H.; Rodriguez, Brian D.; Sampson, Jay A.; Wallin, Erin L.; Williams, Jackie M.
2006-09-18
The Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office are addressing groundwater contamination resulting from historical underground nuclear testing through the Environmental Management program and, in particular, the Underground Test Area project. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on groundwater flow in the area adjacent to a nuclear test. Groundwater modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey, supported by the DOE and NNSA-NSO, collected and processed data from 51 magnetotelluric (MT) and audio-magnetotelluric (AMT) stations at the Nevada Test Site in and near Yucca Flat to assist in characterizing the pre-Tertiary geology in that area. The primary purpose was to refine the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (Late Devonian to Mississippian-age siliciclastic rocks assigned to the Eleana Formation and Chainman Shale) in the Yucca Flat area. The MT and AMT data have been released in separate USGS Open-File Reports. The Nevada Test Site magnetotelluric data interpretation presented in this report includes the results of detailed two-dimensional (2-D) resistivity modeling for each profile (including alternative interpretations) and gross inferences on the three-dimensional (3-D) character of the geology beneath each station. The character, thickness, and lateral extent of the Chainman Shale and Eleana Formation that comprise the Upper Clastic Confining Unit are generally well determined in the upper 5 km. Inferences can be made regarding the presence of the Lower Clastic Confining Unit at depths below 5 km.
Large fault structures such as the CP Thrust fault, the Carpetbag fault, and the Yucca fault that cross Yucca Flat are also discernable as are other smaller faults. The subsurface electrical resistivity distribution and inferred geologic structures determined by this investigation should help constrain the hydrostratigraphic framework model that is under development.
Yost, F.; Hosking, F.M.; Jellison, J.L.; Short, B.; Giversen, T.; Reed, J.R.
1998-10-27
A new test method is presented to quantify capillary flow solderability on a printed wiring board surface finish. The test is based on solder flow from a pad onto narrow strips or lines. A test procedure and video image analysis technique were developed for conducting the test and evaluating the data. Feasibility tests revealed that the wetted distance was sensitive to the ratio of pad radius to line width (l/r), solder volume, and flux predry time. 11 figs.
Entry/Exit Port testing, test report
Winkelman, R.H.
1993-05-01
The Waste Receiving and Processing Module I (WRAP-1) facility must have the ability to allow 55-gallon drums to enter and exit glovebox enclosures. An Entry/Exit Port (Appendix 1, Figure 1), designed by United Engineers and Constructors (UE&C), is one method chosen for drum transfer. The Entry/Exit Port is to be used for entry of 55-gallon drums into both process entry gloveboxes, exit of 55-gallon drum waste pucks from the low-level waste (LLW) glovebox, and loadout of waste from the restricted waste management glovebox. The Entry/Exit Port relies on capture velocity air flow and a neoprene seal to provide alpha confinement when the Port is in the open and closed positions, respectively. Since the glovebox is in a slight vacuum, air flow is directed into the glovebox through the space between the overpack drum and glovebox floor. The air flow is to direct any airborne contamination into the glovebox. A neoprene seal is used to seal the Port door to the glovebox floor, thus maintaining confinement in the closed position. Entry/Exit Port testing took place February 17, 1993, through April 14, 1993, in the 305 building of Westinghouse Hanford Company. Testing was performed in accordance with the Entry/Exit Port Testing Test Plan, document number WHC-SD-WO26-TP-005. A prototype Entry/Exit Port built at the Hanford Site was tested using fluorescent paint pigment and smoke candles as simulant contaminants. This test report is an interim test report. Further developmental testing is required to test modifications made to the Port as the original design of the Port did not provide complete confinement during all stages of operation.
Almendro, Vanessa; Cheng, Yu -Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G.; Helland, Åslaug; Rye, Inga H.; Borresen-Dale, Anne -Lise; Maruyama, Reo; van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L.; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia
2014-02-01
Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.
Benjamin Langhorst; Thomas M Lillo; Henry S Chu
2014-05-01
A statistics based ballistic test method is presented for use when comparing multiple groups of test articles of unknown relative ballistic perforation resistance. The method is intended to be more efficient than many traditional methods for research and development testing. To establish the validity of the method, it is employed in this study to compare test groups of known relative ballistic performance. Multiple groups of test articles were perforated using consistent projectiles and impact conditions. Test groups were made of rolled homogeneous armor (RHA) plates and differed in thickness. After perforation, each residual projectile was captured behind the target and its mass was measured. The residual masses measured for each test group were analyzed to provide ballistic performance rankings with associated confidence levels. When compared to traditional V50 methods, the residual mass (RM) method was found to require fewer test events and be more tolerant of variations in impact conditions.
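A bare-bones version of the group comparison behind the residual-mass (RM) method can be sketched with Welch's t statistic on two groups of residual masses; thicker armor should strip more mass from the projectile, so its residual masses should be systematically lower. The numbers below are invented, and the paper's actual analysis and confidence calculations are richer than this two-group sketch.

```python
import statistics as st

# Invented residual projectile masses (grams) after perforating two RHA
# test groups of different thickness.
rm_thin  = [141.2, 139.8, 142.5, 140.1, 141.9, 138.7]
rm_thick = [121.4, 124.0, 119.8, 122.6, 120.9, 123.1]

def welch_t(a, b):
    """Welch's t statistic for unequal-variance two-sample comparison."""
    se = (st.variance(a) / len(a) + st.variance(b) / len(b)) ** 0.5
    return (st.mean(a) - st.mean(b)) / se

t_stat = welch_t(rm_thin, rm_thick)   # large positive: thin << thick resistance
```

A large |t| supports ranking the thicker group as the better perforation barrier with high confidence, without the many shots a V50 estimate would require.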
Lightweight and Statistical Techniques for Petascale Debugging
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, leaving application scientists more time to pursue the scientific objectives for which the systems are purchased.
We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.
On the reliability of microvariability tests in quasars
De Diego, José A.
2014-11-01
Microvariations probe the physics and internal structure of quasars. Unpredictability and small flux variations make this phenomenon elusive and difficult to detect. Variance-based probes such as the C and F tests, or a combination of both, are popular methods to compare the light curves of the quasar and a comparison star. Recently, detection claims in some studies have depended on the agreement of the results of the C and F tests, or of two instances of the F-test, for rejecting the non-variation null hypothesis. However, the C-test is a non-reliable statistical procedure, the F-test is not robust, and the combination of tests with concurrent results is anything but a straightforward methodology. A priori power analysis calculations and post hoc analysis of Monte Carlo simulations show excellent agreement for the analysis of variance test to detect microvariations as well as the limitations of the F-test. Additionally, the combined tests yield correlated probabilities that make the assessment of statistical significance unworkable. However, it is possible to include data from several field stars to enhance the power in a single F-test, increasing the reliability of the statistical analysis. This would be the preferred methodology when several comparison stars are available. An example using two stars and the enhanced F-test is presented. These results show the importance of using adequate methodologies and avoiding inappropriate procedures that can jeopardize microvariability detections. Power analysis and Monte Carlo simulations are useful tools for research planning, as they can demonstrate the robustness and reliability of different research approaches.
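The variance-ratio idea behind the F-test, and the pooling step of the enhanced variant, can be sketched on synthetic differential light curves: the test statistic is the ratio of the quasar's variance to the comparison variance, and pooling several field stars estimates the denominator with more degrees of freedom. All light curves and noise levels below are invented, and real analyses would convert the statistic to a p-value against the appropriate F distribution.

```python
import random
import statistics as st

random.seed(42)
N = 60  # epochs in each differential light curve

# Synthetic magnitudes: the "quasar" carries extra intrinsic scatter,
# the three "field stars" carry measurement noise only.
quasar = [random.gauss(0.0, 0.030) for _ in range(N)]
stars = [[random.gauss(0.0, 0.010) for _ in range(N)] for _ in range(3)]

f_single = st.variance(quasar) / st.variance(stars[0])

# Enhanced variant: pool all comparison-star data into one variance estimate.
pooled = [x for s in stars for x in s]
f_enhanced = st.variance(quasar) / st.variance(pooled)
```

With three times the comparison data, the pooled denominator fluctuates less, so the enhanced statistic supports a more reliable significance assessment than a single-star F-test.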
Statistical tools for prognostics and health management of complex systems
Collins, David H.; Huzurbazar, Aparna V.; Anderson-Cook, Christine M.
2010-01-01
Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
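A minimal sketch of building a system-level prediction from component reliability models is given below, assuming (purely for illustration) independent components with constant failure rates arranged in series, so the system survives only if every component does. The component names and rates are invented; real PHM models would fold in sensor condition data and covariates as the abstract describes.

```python
import math

# Hypothetical constant failure rates (per hour) for a toy series system.
failure_rates = {"pump": 1e-4, "sensor": 5e-5, "controller": 2e-5}

def component_reliability(rate, t):
    """Survival probability of an exponential-lifetime component at time t."""
    return math.exp(-rate * t)

def system_reliability(rates, t):
    """Series system: product of component survival probabilities."""
    r = 1.0
    for rate in rates.values():
        r *= component_reliability(rate, t)
    return r

t = 1000.0  # hours
r_sys = system_reliability(failure_rates, t)
# For series exponential components this equals exp(-sum(rates) * t).
```

The same product structure is what lets component-level posterior updates (from sensors or expert judgment, via Bayesian methods) propagate directly into the system failure-time distribution.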
Statistical measures of Planck scale signal correlations in interferometers
Hogan, Craig J.; Kwon, Ohkyung
2015-06-22
A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross-correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck-scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and with the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. Simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck-scale departures from classical geometry.
Statistical Methods and Tools for Hanford Staged Feed Tank Sampling
Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.
2013-10-01
This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of the current work is to evaluate and provide recommendations to support a defensible technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).
Statistical properties of Charney-Hasegawa-Mima zonal flows
Anderson, Johan; Botha, G. J. J.
2015-05-15
A theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent plasma transport events in unforced zonal flows is provided within the Charney-Hasegawa-Mima (CHM) model. The governing equation is solved numerically with various prescribed density gradients that are designed to produce different configurations of parallel and anti-parallel streams. Long-lasting vortices form whose flow is governed by the zonal streams. It is found that the numerically generated PDFs can be matched with analytical predictions of PDFs based on the instanton method by removing the autocorrelations from the time series. In many instances, the statistics generated by the CHM dynamics relax to Gaussian distributions for both the electrostatic and vorticity perturbations, whereas in areas with strong nonlinear interactions it is found that the PDFs are exponentially distributed.
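The decorrelation step described in this abstract can be sketched in a few lines: subsample a time series at roughly its decorrelation time, then compare the resulting PDF against the Gaussian reference via excess kurtosis. The sketch uses synthetic AR(1) data standing in for CHM output, and the lag choice and thresholds are assumptions, not the paper's procedure.

```python
import math
import random

# Sketch of removing autocorrelations from a time series before comparing
# its PDF with an analytical prediction: subsample at an estimated
# decorrelation interval, then check excess kurtosis against the Gaussian
# value of zero. (Synthetic AR(1) data, not CHM simulation output.)

random.seed(1)
phi, n = 0.95, 200_000
x, series = 0.0, []
for _ in range(n):                      # strongly autocorrelated AR(1) signal
    x = phi * x + random.gauss(0.0, 1.0)
    series.append(x)

lag = int(round(-1.0 / math.log(phi)))  # e-folding time of the AR(1) process
decorrelated = series[::lag]

def excess_kurtosis(data):
    m = sum(data) / len(data)
    var = sum((v - m) ** 2 for v in data) / len(data)
    m4 = sum((v - m) ** 4 for v in data) / len(data)
    return m4 / var ** 2 - 3.0

print(len(decorrelated), round(excess_kurtosis(decorrelated), 2))
```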
Statistical Stability and Time-Reversal Imaging in Random Media
Berryman, J; Borcea, L; Papanicolaou, G; Tsogka, C
2002-02-05
Localization of targets embedded in a heterogeneous background medium is a common problem in seismic, ultrasonic, and electromagnetic imaging. The best imaging techniques make direct use of the eigenfunctions and eigenvalues of the array response matrix, as recent work on time-reversal acoustics has shown. Of the various imaging functionals studied, one that is representative of a preferred class is a time-domain generalization of MUSIC (MUltiple Signal Classification), which is a well-known linear subspace method normally applied only in the frequency domain. Since statistical stability is not characteristic of the frequency domain, a transform back to the time domain after first diagonalizing the array data in the frequency domain takes optimum advantage of both the time-domain stability and the frequency-domain orthogonality of the relevant eigenfunctions.
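For orientation, the subspace idea that the abstract generalizes can be sketched as ordinary frequency-domain MUSIC: diagonalize the array data by SVD, project candidate steering vectors onto the noise subspace, and peak where the projection vanishes. This is an illustrative single-target toy in normalized units, not the authors' time-domain formulation.

```python
import numpy as np

# Minimal frequency-domain MUSIC sketch (illustrative; parameters invented).
rng = np.random.default_rng(0)
n_sensors, true_angle = 8, 0.3  # one point target, normalized angle in [0, 1)

def steering(theta, n):
    k = np.arange(n)
    return np.exp(2j * np.pi * theta * k) / np.sqrt(n)

# Synthetic narrow-band array data: rank-1 signal plus weak noise.
snapshots = steering(true_angle, n_sensors)[:, None] * rng.standard_normal(64)
data = snapshots + 0.05 * (rng.standard_normal((n_sensors, 64))
                           + 1j * rng.standard_normal((n_sensors, 64)))

U, s, _ = np.linalg.svd(data)           # diagonalize the array data
noise_basis = U[:, 1:]                  # signal subspace has rank 1 here

def music(theta):
    a = steering(theta, n_sensors)
    return 1.0 / np.linalg.norm(noise_basis.conj().T @ a) ** 2

grid = np.linspace(0.0, 1.0, 501)
estimate = grid[np.argmax([music(t) for t in grid])]
print(estimate)   # peaks near the true angle of 0.3
```

The paper's contribution is what happens after this step: transforming the diagonalized data back to the time domain to gain statistical stability in a random medium.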
Spatial Statistical Procedures to Validate Input Data in Energy Models
Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.
2006-01-01
Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.
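One of the listed tasks, predicting missing data at unobserved locations, can be illustrated with the simplest spatial interpolator, inverse-distance weighting; this is a stand-in for the model-based geostatistical methods (e.g., kriging) the paper discusses, and the site coordinates and wind-speed values are hypothetical.

```python
import math

# Minimal sketch of spatial prediction at an unobserved location by
# inverse-distance weighting. Kriging would replace the fixed weights
# with weights derived from a fitted spatial covariance model.

def idw(target, sites, values, power=2.0):
    """Estimate the field at `target` from observed (site, value) pairs."""
    weights, weighted = 0.0, 0.0
    for (x, y), v in zip(sites, values):
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return v                  # exact hit: return the observation
        w = d ** -power
        weights += w
        weighted += w * v
    return weighted / weights

sites = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
wind_speed = [5.0, 6.0, 7.0, 8.0]     # hypothetical resource observations
print(idw((0.5, 0.5), sites, wind_speed))   # symmetric point: 6.5
```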
Gas Test Loop Booster Fuel Hydraulic Testing
Gas Test Loop Hydraulic Testing Staff
2006-09-01
The Gas Test Loop (GTL) project is for the design of an adaptation to the Advanced Test Reactor (ATR) to create a fast-flux test space where fuels and materials for advanced reactor concepts can undergo irradiation testing. Incident to that design, it was found necessary to make use of special booster fuel to enhance the neutron flux in the reactor lobe in which the Gas Test Loop will be installed. Because the booster fuel is of a different composition and configuration from standard ATR fuel, it is necessary to qualify the booster fuel for use in the ATR. Part of that qualification is the determination that required thermal hydraulic criteria will be met under routine operation and under selected accident scenarios. The Hydraulic Testing task in the GTL project facilitates that determination by measuring flow coefficients (pressure drops) over various regions of the booster fuel over a range of primary coolant flow rates. A high-fidelity model of the NW lobe of the ATR with associated flow baffle, in-pile-tube, and below-core flow channels was designed, constructed and located in the Idaho State University Thermal Fluids Laboratory. A circulation loop was designed and constructed by the university to provide reactor-relevant water flow rates to the test system. Models of the four booster fuel elements required for GTL operation were fabricated from aluminum (no uranium or means of heating) and placed in the flow channel. One of these was instrumented with Pitot tubes to measure flow velocities in the channels between the three booster fuel plates and between the innermost and outermost plates and the side walls of the flow annulus. Flow coefficients in the range of 4 to 6.5 were determined from the measurements made for the upper and middle parts of the booster fuel elements. The flow coefficient for the lower end of the booster fuel and the sub-core flow channel was lower at 2.3.
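For context, a flow coefficient of the kind reported can be backed out of a measured pressure drop if one assumes the common form dP = K * (1/2) * rho * v**2; the report's exact definition may differ, and the numbers below are invented for illustration only.

```python
# Back-of-envelope sketch: extracting a dimensionless flow coefficient K
# from a measured pressure drop, assuming dP = K * (1/2) * rho * v**2.

def flow_coefficient(dp_pa, rho_kg_m3, v_m_s):
    return dp_pa / (0.5 * rho_kg_m3 * v_m_s ** 2)

# Hypothetical numbers, not measurements from the report:
rho = 998.0          # water density, kg/m^3
v = 10.0             # channel velocity, m/s
dp = 2.5e5           # measured pressure drop, Pa
print(flow_coefficient(dp, rho, v))   # falls in the 4 to 6.5 range reported
```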
Super-Poissonian Statistics of Photon Emission from Single CdSe-CdS Core-Shell Nanocrystals Coupled to Metal Nanostructures
Office of Scientific and Technical Information (OSTI)
Massive Hanford Test Reactor Removed - Plutonium Recycle Test Reactor Removed from Hanford's 300 Area
Office of Environmental Management (EM)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
August 1, 2013: The design and testing for "Little Boy" took place at Gun Site.
Dwarf galaxy dark matter density profiles inferred from stellar and gas kinematics
Adams, Joshua J.; Simon, Joshua D. [Observatories of the Carnegie Institution of Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Fabricius, Maximilian H.; Bender, Ralf; Thomas, Jens [Max-Planck-Institut für extraterrestrische Physik, Gießenbachstraße, D-85741 Garching bei München (Germany); Van den Bosch, Remco C. E.; Van de Ven, Glenn [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Barentine, John C.; Gebhardt, Karl; Hill, Gary J. [Department of Astronomy, University of Texas at Austin, 2515 Speedway, Stop C1400, Austin, TX 78712-1205 (United States); Murphy, Jeremy D. [Department of Astrophysical Sciences, Princeton University, 4 Ivy Lane, Peyton Hall, Princeton, NJ 08544 (United States); Swaters, R. A., E-mail: jjadams@obs.carnegiescience.edu, E-mail: jja439@gmail.com [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)
2014-07-01
We present new constraints on the density profiles of dark matter (DM) halos in seven nearby dwarf galaxies from measurements of their integrated stellar light and gas kinematics. The gas kinematics of low-mass galaxies frequently suggest that they contain constant density DM cores, while N-body simulations instead predict a cuspy profile. We present a data set of high-resolution integral-field spectroscopy on seven galaxies and measure the stellar and gas kinematics simultaneously. Using Jeans modeling on our full sample, we examine whether gas kinematics in general produce shallower density profiles than are derived from the stars. Although two of the seven galaxies show some localized differences in their rotation curves between the two tracers, estimates of the central logarithmic slope of the DM density profile, γ, are generally robust. The mean and standard deviation of the logarithmic slope for the population are γ = 0.67 ± 0.10 when measured in the stars and γ = 0.58 ± 0.24 when measured in the gas. We also find that the halos are not under-concentrated at the radii of half their maximum velocities. Finally, we search for correlations of the DM density profile with stellar velocity anisotropy and other baryonic properties. Two popular mechanisms to explain cored DM halos are an exotic DM component or feedback models that strongly couple the energy of supernovae into repeatedly driving out gas and dynamically heating the DM halos. While such models do not yet have falsifiable predictions that we can measure, we investigate correlations that may eventually be used to test models. We do not find a secondary parameter that strongly correlates with the central DM density slope, but we do find some weak correlations. The central DM density slope weakly correlates with the abundance of α elements in the stellar population, anti-correlates with H I fraction, and anti-correlates with vertical orbital anisotropy.
We expect, if anything, the opposite of these three trends for feedback models. Determining the importance of these correlations will require further model developments and larger observational samples.
Blade Testing Trends (Presentation)
Desmond, M.
2014-08-01
As an invited guest speaker, Michael Desmond presented on NREL's NWTC structural testing methods and capabilities at the 2014 Sandia Blade Workshop held on August 26-28, 2014 in Albuquerque, NM. Although dynamometer and field testing capabilities were mentioned, the presentation focused primarily on wind turbine blade testing, including descriptions and capabilities for accredited certification testing, historical methodology and technology deployment, and current research and development activities.
Praeg, W.F.
1984-03-30
This invention pertains to arrangements for performing electrical tests on contact material samples, and in particular for testing contact material test samples in an evacuated environment under high current loads. Frequently, it is desirable in developing high-current separable contact material to have at least a preliminary analysis of selected candidate conductor materials. Testing of material samples is intended to identify materials unsuitable for high current electrical contact without requiring incorporation of the materials into a completed and oftentimes complex structure.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
OMB MPI Tests. Description: The Ohio MicroBenchmark suite is a collection of independent MPI message passing performance microbenchmarks developed and written at The Ohio State University. It includes traditional benchmarks and performance measures such as latency, bandwidth, and host overhead, and can be used for both traditional and GPU-enhanced nodes. For the purposes of the Trinity / NERSC-8 acquisition this includes only the following tests: (name of OSU test: performance
Energy Science and Technology Software Center (OSTI)
2015-09-09
The NCCS Regression Test Harness is a software package that provides a framework for performing regression and acceptance testing on NCCS High Performance Computers. The package is written in Python, and its only dependency is a Subversion repository in which the regression tests are stored.
Statistically designed study of the variables and parameters of carbon dioxide equations of state
Donohue, M.D.; Naiman, D.Q.; Jin, Gang; Loehe, J.R.
1991-05-01
Carbon dioxide is used widely in enhanced oil recovery (EOR) processes to maximize the production of crude oil from aging and nearly depleted oil wells. Carbon dioxide also is encountered in many processes related to oil recovery. Accurate representations of the properties of carbon dioxide, and its mixtures with hydrocarbons, play a critical role in a number of enhanced oil recovery operations. One of the first tasks of this project was to select an equation of state to calculate the properties of carbon dioxide and its mixtures. Each equation's simplicity, accuracy, and reliability in representing phase behavior and thermodynamic properties of mixtures containing carbon dioxide with hydrocarbons at conditions relevant to enhanced oil recovery were taken into account. We also have determined the thermodynamic properties that are important to enhanced oil recovery and the ranges of temperature, pressure, and composition that are important. We chose twelve equations of state for preliminary studies to be evaluated against these criteria. All of these equations were tested for pure carbon dioxide, and eleven were tested for pure alkanes and their mixtures with carbon dioxide. Two equations, the ALS equation and the ESD equation, were selected for detailed statistical analysis. 54 refs., 41 figs., 36 tabs.
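The ALS and ESD equations the authors selected are not reproduced here; as a stand-in, the sketch below evaluates the widely used Peng-Robinson cubic equation of state for pure CO2, to show the kind of pressure-volume-temperature calculation being screened. The state point is chosen arbitrarily for illustration.

```python
import math

# Illustrative only: Peng-Robinson EOS for pure CO2 (not the ALS or ESD
# equations studied in the report).

R = 8.314                                 # J/(mol K)
TC, PC, OMEGA = 304.13, 7.377e6, 0.225    # CO2 critical constants

def peng_robinson_pressure(T, V):
    """Pressure (Pa) of pure CO2 at temperature T (K) and molar volume V (m^3/mol)."""
    a = 0.45724 * R**2 * TC**2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / TC)))**2
    return R * T / (V - b) - a * alpha / (V**2 + 2.0*b*V - b**2)

# Supercritical CO2 at an arbitrary state point, reported in MPa:
print(peng_robinson_pressure(350.0, 5.0e-4) / 1e6)
```

Screening an equation of state against data amounts to evaluating expressions like this over the relevant temperature, pressure, and composition ranges and comparing residuals statistically.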
McBeath, R.S.
1995-02-28
Testing was performed to determine actual damage to drums when dropped from higher than currently stacked elevations. The drum configurations were the same as those used in storage: single drums and four drums banded to a pallet. Maximum drop weights were selected based on successful preliminary tests. Material was lost from each of the single drum tests, while only a small amount of material was lost from one of the palletized drums. The test results are presented in this report. This report also provides recommendations for further testing to determine the appropriate drum weight which can be stored on a fourth tier.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ZiaTest. Description: This test executes a new proposed standard benchmark method for MPI startup that is intended to provide a realistic assessment of both launch and wireup requirements. Accordingly, it exercises both the launch system of the environment and the interconnect subsystem in a specified pattern. Specifically, the test consists of the following steps: Record a time stamp for when the test started - this is passed to rank=0 upon launch. Launch a 100MB executable on a specified
Pendulum detector testing device
Gonsalves, John M.
1997-01-01
A detector testing device which provides consistent, cost-effective, repeatable results. The testing device is primarily constructed of PVC plastic and other non-metallic materials. Sensitivity of a walk-through detector system can be checked by: 1) providing a standard test object simulating the mass, size and material content of a weapon or other contraband, 2) suspending the test object in successive positions, such as head, waist and ankle levels, simulating where the contraband might be concealed on a person walking through the detector system; and 3) swinging the suspended object through each of the positions, while operating the detector system and observing its response. The test object is retained in a holder in which the orientation of the test device or target can be readily changed, to properly complete the testing requirements.
Pendulum detector testing device
Gonsalves, J.M.
1997-09-30
A detector testing device is described which provides consistent, cost-effective, repeatable results. The testing device is primarily constructed of PVC plastic and other non-metallic materials. Sensitivity of a walk-through detector system can be checked by: (1) providing a standard test object simulating the mass, size and material content of a weapon or other contraband, (2) suspending the test object in successive positions, such as head, waist and ankle levels, simulating where the contraband might be concealed on a person walking through the detector system; and (3) swinging the suspended object through each of the positions, while operating the detector system and observing its response. The test object is retained in a holder in which the orientation of the test device or target can be readily changed, to properly complete the testing requirements. 5 figs.
Sample Proficiency Test exercise
Alcaraz, A; Gregg, H; Koester, C
2006-02-05
The current format of the OPCW proficiency tests has multiple sets of 2 samples sent to an analysis laboratory. In each sample set, one is identified as a sample, the other as a blank. This method of conducting proficiency tests differs from how an OPCW designated laboratory would receive authentic samples (a set of three containers, each not identified, consisting of the authentic sample, a control sample, and a blank sample). This exercise was designed to test how results would be reported if the proficiency tests were conducted in the authentic-sample format. As such, this is not an official OPCW proficiency test, and the attached report is one method by which LLNL might report their analyses under a more realistic testing scheme. Therefore, the title on the report ''Report of the Umpteenth Official OPCW Proficiency Test'' is meaningless, and provides a bit of whimsy for the analysts and readers of the report.
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs.
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed, and the results of this analysis are reported. The data were generally found to follow a normal distribution, and to be homogeneous across composite samples.
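For analytes whose measurements were all above their MDCs and approximately normal, a UCL95 of the form mean + t * s / sqrt(n) is one of the standard procedures in the EPA guidance cited. A minimal sketch, with hypothetical concentration values, is:

```python
import math
import statistics

# Normal-theory one-sided UCL95: mean + t_{0.95, n-1} * s / sqrt(n).
# One-sided 95% Student-t quantiles hard-coded for small n (key = n, df = n-1).
T95 = {2: 6.314, 3: 2.920, 4: 2.353, 5: 2.132, 6: 2.015,
       7: 1.943, 8: 1.895, 9: 1.860}

def ucl95(measurements):
    n = len(measurements)
    m = statistics.mean(measurements)
    s = statistics.stdev(measurements)          # sample standard deviation
    return m + T95[n] * s / math.sqrt(n)

# Hypothetical analyte concentrations (mg/kg): three composite samples,
# each measured three times.
conc = [12.1, 11.8, 12.4, 12.0, 12.3, 11.9, 12.2, 12.1, 12.0]
print(round(ucl95(conc), 3))
```

For non-normal data the guidance prescribes different UCL95 estimators (e.g., for gamma or lognormal distributions), which is why distribution identification precedes the choice of procedure.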
DYNAMIC STABILITY OF THE SOLAR SYSTEM: STATISTICALLY INCONCLUSIVE RESULTS FROM ENSEMBLE INTEGRATIONS
Zeebe, Richard E.
2015-01-01
Due to the chaotic nature of the solar system, the question of its long-term stability can only be answered in a statistical sense, for instance, based on numerical ensemble integrations of nearby orbits. Destabilization of the inner planets, leading to close encounters and/or collisions, can be initiated through a large increase in Mercury's eccentricity, with a currently assumed likelihood of ∼1%. However, little is known at present about the robustness of this number. Here I report ensemble integrations of the full equations of motion of the eight planets and Pluto over 5 Gyr, including contributions from general relativity. The results show that different numerical algorithms lead to statistically different results for the evolution of Mercury's eccentricity (e{sub M}). For instance, starting at present initial conditions (e{sub M} ≈ 0.21), Mercury's maximum eccentricity achieved over 5 Gyr is, on average, significantly higher in symplectic ensemble integrations using heliocentric rather than Jacobi coordinates and stricter error control. In contrast, starting at a possible future configuration (e{sub M} ≈ 0.53), Mercury's maximum eccentricity achieved over the subsequent 500 Myr is, on average, significantly lower using heliocentric rather than Jacobi coordinates. For example, the probability for e{sub M} to increase beyond 0.53 over 500 Myr is >90% (Jacobi) versus only 40%-55% (heliocentric). This poses a dilemma because the physical evolution of the real system, and its probabilistic behavior, cannot depend on the coordinate system or the numerical algorithm chosen to describe it. Some tests of the numerical algorithms suggest that symplectic integrators using heliocentric coordinates underestimate the odds for destabilization of Mercury's orbit at high initial e{sub M}.
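A side note on the quoted likelihood of roughly 1%: such a number comes from counting destabilized members of a finite ensemble, so its precision is limited by binomial counting statistics. The sketch below (with ensemble counts invented for illustration, not taken from the paper) shows the standard error attached to such an estimate.

```python
import math

# Binomial uncertainty on an ensemble-derived destabilization probability:
# p_hat = k/N with normal-approximation standard error sqrt(p(1-p)/N).

def ensemble_probability(n_unstable, n_total):
    p = n_unstable / n_total
    se = math.sqrt(p * (1.0 - p) / n_total)
    return p, se

# Hypothetical ensemble: 16 of 1600 orbits destabilized.
p, se = ensemble_probability(n_unstable=16, n_total=1600)
print(f"p = {p:.3f} +/- {se:.3f}")
```

Even before any algorithmic bias of the kind the paper reports, an ensemble of a few thousand orbits constrains a ~1% probability only to within a few tenths of a percent.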
Statistical and Domain Analytics Applied to PV Module Lifetime and Degradation Science
Broader source: Energy.gov (indexed) [DOE]
Baseline Test Specimen Machining Report
Carroll, Mark
2009-08-01
The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as nuclear grade, an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et al. 2007).
One of the key components in the evaluation of these graphite types will be mechanical testing on specimens drawn from carefully controlled sections of each billet. To this end, this report will discuss the machining of the first set of test specimens that will be evaluated in this program through tensile, compressive, and flexural testing. Validation that the test specimens have been produced to the tolerances required by the applicable ASTM standards, and to the quality control levels required by this program, will demonstrate the viability of sending graphite to selected suppliers that will provide valuable and certifiable data to future data sets that are integral to the NGNP program and beyond.
Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2014-05-01
As part of the High Temperature Reactors (HTR) R&D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.
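A drift check of the kind used to flag abnormal downward trends in thermocouple data can be illustrated with a one-sided CUSUM test. This is a generic sketch, not the NDMAS implementation; the readings, target, slack, and threshold are all invented for the demonstration.

```python
# One-sided (lower) CUSUM: accumulate shortfalls below a target temperature
# and flag when the cumulative shortfall exceeds a threshold.

def cusum_low(readings, target, slack=1.0, threshold=5.0):
    """Return the index where a sustained downward shift is flagged, or None."""
    s = 0.0
    for i, r in enumerate(readings):
        s = min(0.0, s + (r - target) + slack)
        if s < -threshold:
            return i
    return None

stable = [1000.0, 1001.0, 999.5, 1000.5, 999.0] * 4
drifting = stable + [997.0, 995.0, 993.0, 991.0, 989.0]  # hypothetical TC drift
print(cusum_low(stable, target=1000.0), cusum_low(drifting, target=1000.0))
```

The harder problem the paper addresses is attribution: distinguishing instrument drift of this kind from genuine shifts in the capsule's thermal response, such as the gas-gap change found in PIE.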
Glueballs and statistical mechanics of the gluon plasma
Brau, Fabian; Buisseret, Fabien
2009-06-01
We study a pure gluon plasma in the context of quasiparticle models, where the plasma is considered as an ideal gas of massive bosons. In order to reproduce SU(3) gauge field lattice data within such a framework, we review briefly the necessity to use a temperature-dependent gluon mass which accounts for color interactions between the gluons near T{sub c} and agrees with perturbative QCD at large temperatures. Consequently, we discuss the thermodynamics of systems with temperature-dependent Hamiltonians and clarify the situation about the possible solutions proposed in the literature to treat those systems consistently. We then focus our attention on two possible formulations which are thermodynamically consistent, and we extract the gluon mass from the equation of state obtained in SU(3) lattice QCD. We find that the thermal gluon mass is similar in both statistical formalisms. Finally, an interpretation of the gluon plasma as an ideal gas made of glueballs and gluons is also presented. The glueball mass is consistently computed within a relativistic formalism using a potential obtained from lattice QCD. We find that the gluon plasma might be a glueball-rich medium for T ≲ 1.13T{sub c} and suggest that glueballs could be detected in future experiments dedicated to quark-gluon plasma.
Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes
Uhl, Jonathan T.; Pathak, Shivesh; Schorlemmer, Danijel; Liu, Xin; Swindeman, Ryan; Brinkman, Braden A. W.; LeBlanc, Michael; Tsekenis, Georgios; Friedman, Nir; Behringer, Robert; Denisov, Dmitry; Schall, Peter; Gu, Xiaojun; Wright, Wendelin J.; Hufnagel, Todd; Jennings, Andrew; Greer, Julia R.; Liaw, P. K.; Becker, Thorsten; Dresen, Georg; Dahmen, Karin A.
2015-11-17
Slowly-compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or “quakes”. We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied with the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tuneability of the cutoff with stress reflects “tuned critical” behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. In conclusion, the results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
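The distribution form quoted above, a power law multiplied by an exponential cutoff that grows with stress, can be written down directly. In the sketch below the exponent tau = 1.5 is the standard mean-field value for slip avalanches, while the quadratic stress dependence of the cutoff is an illustrative choice, not a claim from the paper.

```python
import math

# Mean-field-style slip-size distribution shape:
#   P(S) ~ S**(-tau) * exp(-S / S_max), with S_max growing as stress
#   approaches the failure point.

TAU = 1.5   # mean-field avalanche exponent

def slip_pdf(s, s_max):
    """Unnormalized slip-size distribution."""
    return s ** -TAU * math.exp(-s / s_max)

def cutoff(stress, s0=1.0):
    """Illustrative cutoff growing with stress (diverges toward failure)."""
    return s0 / (1.0 - stress) ** 2

# Larger stress -> larger cutoff -> large slips become relatively more likely.
for stress in (0.5, 0.9):
    s_max = cutoff(stress)
    print(stress, s_max, slip_pdf(100.0, s_max))
```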
Financial statistics of major US publicly owned electric utilities 1996
1998-03-01
The 1996 edition of The Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 5 years (1992 through 1996) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Five years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, operating revenue, and electric energy account data. 2 figs., 32 tabs.
Statistical language analysis for automatic exfiltration event detection.
Robinson, David Gerald
2010-04-01
This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
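The core idea above, flagging events whose language statistics are improbable under a model trained on normal traffic, can be illustrated without a full LDA implementation. A minimal stand-in using a unigram "surprise" score; the event strings are hypothetical, and a real system would fit LDA topics with a topic-modeling library.

```python
import math
from collections import Counter

def train_unigram(events):
    """Token frequencies from normal-traffic event logs (add-one smoothing)."""
    counts = Counter(tok for ev in events for tok in ev.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 slot for unseen tokens
    return lambda tok: (counts[tok] + 1) / (total + vocab)

def surprise(event, prob):
    """Average negative log-likelihood; high values flag anomalous events."""
    toks = event.split()
    return -sum(math.log(prob(t)) for t in toks) / len(toks)

# Hypothetical "normal" log lines used to fit the model:
normal = ["GET /index.html 200", "GET /style.css 200", "POST /login 200"]
prob = train_unigram(normal)
print(surprise("GET /index.html 200", prob) < surprise("SCP /etc/passwd external-host", prob))
```

An unfamiliar event scores higher surprise than routine traffic, which is the risk-ranking behavior the paper's LDA model delivers with far richer topic structure.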
Plutonium metal exchange program : current status and statistical analysis
Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.
2004-01-01
The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program is discussed, with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous and/or censored values), as well as interesting features of the data and the results.
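As a rough illustration of the consensus-value problem mentioned above, a common robust approach screens anomalous values with a median/MAD rule before averaging. This is a generic sketch with invented assay numbers, not the exchange program's actual statistical model.

```python
import statistics

def consensus(values, k=3.5):
    """Robust consensus: drop points more than k robust-sigmas
    (MAD-based) from the median, then average the rest."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return med
    sigma = 1.4826 * mad  # MAD -> sigma conversion for normal data
    kept = [v for v in values if abs(v - med) / sigma <= k]
    return statistics.fmean(kept)

labs = [99.96, 99.97, 99.95, 99.98, 97.10]  # one anomalous lab result (wt% Pu)
print(round(consensus(labs), 3))  # outlier-screened mean
```

The anomalous 97.10 result is excluded, so the consensus reflects the four concordant laboratories; handling censored (below-detection-limit) values would need additional machinery.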
Statistical study of reconnection exhausts in the solar wind
Enžl, J.; Přech, L.; Šafránková, J.; Němeček, Z.
2014-11-20
Magnetic reconnection is a fundamental process that changes the magnetic field configuration and converts magnetic energy into flow energy and plasma heating. This paper presents a survey of the plasma and magnetic field parameters inside 418 reconnection exhausts identified in WIND data from 1995-2012. The statistical analysis focuses on the redistribution of the magnetic energy released by reconnection between plasma acceleration and plasma heating. The results show that both the portion of the energy deposited into heat and the energy spent on acceleration of the exhaust plasma rise with the magnetic shear angle, in accord with the increase of the magnetic flux available for reconnection. The decrease of the normalized exhaust speed with increasing magnetic shear suggests a decreasing efficiency of acceleration and/or an increasing efficiency of heating in high-shear events. However, we have found that the previously suggested relation between the exhaust speed and the temperature enhancement should rather be considered an upper limit on the plasma heating during reconnection, regardless of the shear angle.
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
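For reference, the NRMSE metric named above is straightforward to compute. Normalization conventions vary (observed mean, range, or plant capacity), so the choice below, normalizing by the observed mean, is an assumption rather than the paper's stated convention, and the sample values are invented.

```python
import math

def nrmse(forecast, observed):
    """Normalized RMSE: RMSE divided by the mean of the observations."""
    n = len(observed)
    rmse = math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed)) / n)
    return rmse / (sum(observed) / n)

obs = [100.0, 200.0, 300.0]  # e.g. observed plant output in watts (illustrative)
fc = [110.0, 190.0, 310.0]   # a forecast off by 10 W at each step
print(nrmse(fc, obs))  # → 0.05
```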
Signatures of initial state modifications on bispectrum statistics
Meerburg, P. Daniel; van der Schaar, Jan Pieter; Corasaniti, Pier Stefano
2009-05-15
Modifications of the initial state of the inflaton field can induce a departure from Gaussianity and leave a testable imprint on the higher order correlations of the CMB and large scale structures in the Universe. We focus on the bispectrum statistics of the primordial curvature perturbation and its projection on the CMB. For a canonical single-field action the three-point correlator enhancement is localized, maximizing in the collinear limit, corresponding to enfolded or squashed triangles in comoving momentum space. We show that the available local and equilateral templates are very insensitive to this localized enhancement and do not generate noteworthy constraints on initial-state modifications. On the other hand, when considering the addition of a dimension 8 higher order derivative term, we find a dominant rapidly oscillating contribution, which had previously been overlooked and whose significantly enhanced amplitude is independent of the triangle under consideration. Nevertheless, the oscillatory nature of (the sign of) the correlation function implies the signal is nearly orthogonal to currently available observational templates, strongly reducing the sensitivity to the enhancement. Constraints on departures from the standard Bunch-Davies vacuum state can be derived, but also depend on the next-to-leading terms. We emphasize that the construction and application of especially adapted templates could lead to CMB bispectrum constraints on modified initial states already competing with those derived from the power spectrum.
A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE
Hillier, A.; Morton, R. J.; Erdélyi, R.
2013-12-20
The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to study the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s{sup -1}. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.
Development of a statistically based access delay timeline methodology.
Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt
2013-02-01
The purpose of adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methods, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
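The timeline-distribution idea can be sketched with a simple Monte Carlo: draw each barrier task time from an uncertainty distribution and accumulate along the adversary path. The lognormal task parameters below are illustrative placeholders, not figures from the Sandia methodology, which additionally folds in expert judgment and small-sample corrections.

```python
import random

def delay_distribution(task_params, n=20000, seed=1):
    """Monte Carlo distribution of total path delay. Each barrier task
    time is drawn lognormal with (mu, sigma) of the underlying normal;
    returns the sorted totals so percentiles can be read off directly."""
    rng = random.Random(seed)
    return sorted(
        sum(rng.lognormvariate(mu, sg) for mu, sg in task_params)
        for _ in range(n)
    )

tasks = [(3.0, 0.4), (2.5, 0.6), (3.4, 0.3)]  # hypothetical (ln-seconds) per barrier
totals = delay_distribution(tasks)
p10 = totals[len(totals) // 10]     # conservative (10th-percentile) delay
median = totals[len(totals) // 2]   # typical delay
print(p10 < median)
```

Reporting a low percentile rather than a single worst-case number is what lets the analyst trade conservatism against cost, as the abstract describes.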
International petroleum statistics report, January 1992. [Contains Glossary]
1992-01-01
The International Petroleum Statistics Report presents data on international oil production, consumption, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil consumption and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/consumption balance for the market economies (i.e., non-communist countries). This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, consumption, and trade in OECD countries. World oil production and OECD consumption data are for the years 1970 through 1990; OECD stocks from 1973 through 1990; and OECD trade from 1982 through 1990.
Statistical mechanics of self-driven Carnot cycles
Smith, E.
1999-10-01
The spontaneous generation and finite-amplitude saturation of sound, in a traveling-wave thermoacoustic engine, are derived as properties of a second-order phase transition. It has previously been argued that this dynamical phase transition, called “onset,” has an equivalent equilibrium representation, but the saturation mechanism and scaling were not computed. In this work, the sound modes implementing the engine cycle are coarse-grained and statistically averaged, in a partition function derived from microscopic dynamics on criteria of scale invariance. Self-amplification performed by the engine cycle is introduced through higher-order modal interactions. Stationary points and fluctuations of the resulting phenomenological Lagrangian are analyzed and related to background dynamical currents. The scaling of the stable sound amplitude near the critical point is derived and shown to arise universally from the interaction of finite-temperature disorder with the order induced by self-amplification. © 1999 The American Physical Society
NEPA litigation 1988-1995: A detailed statistical analysis
Reinke, D.C.; Robitaille, P.
1997-08-01
The intent of this study was to identify trends and lessons learned from litigated NEPA documents and to compare and contrast these trends among Federal agencies. More than 350 NEPA cases were collected, reviewed, and analyzed. Of the NEPA cases reviewed, more than 170 were appeals or Supreme Court cases, mostly from the late 1980s through 1995. For this time period, the sampled documents represent the majority of the appeals court cases and all the Supreme Court cases. Additionally, over 170 district court cases were also examined as a representative sample of district court decisions on NEPA. Cases on agency actions found to need NEPA documentation (but that had no documentation) and cases on NEPA documents that were found to be inadequate were pooled and examined to determine the factors that were responsible for these findings. The inadequate documents were specifically examined to determine if there were any general trends. The results are shown in detailed statistical terms. Generally, when a Federal agency has some type of NEPA documentation (e.g., CX, EA, or EIS) and at least covers the basic NEPA procedural requirements, the agency typically wins the litigation. NEPA documents that lose generally have serious errors of omission. An awareness and understanding of these errors of omission can help Federal agencies ensure that their documents prevail a greater percentage of the time.
Statistical properties of super-hot solar flares
Caspi, Amir; Krucker, Säm; Lin, R. P.
2014-01-20
We use Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) high-resolution imaging and spectroscopy observations from ~6 to 100 keV to determine the statistical relationships between measured parameters (temperature, emission measure, etc.) of hot, thermal plasma in 37 intense (GOES M- and X-class) solar flares. The RHESSI data, most sensitive to the hottest flare plasmas, reveal a strong correlation between the maximum achieved temperature and the flare GOES class, such that 'super-hot' temperatures >30 MK are achieved almost exclusively by X-class events; the observed correlation differs significantly from that of GOES-derived temperatures, and from previous studies. A nearly ubiquitous association with high emission measures, electron densities, and instantaneous thermal energies suggests that super-hot plasmas are physically distinct from cooler, ~10-20 MK GOES plasmas, and that they require substantially greater energy input during the flare. High thermal energy densities suggest that super-hot flares require strong coronal magnetic fields, exceeding ~100 G, and that both the plasma β and volume filling factor f cannot be much less than unity in the super-hot region.
Liu, Yixin; Zhou, Kai; Lei, Yu
2015-01-01
High temperature gas sensors have been highly demanded for combustion process optimization and toxic emissions control, but they usually suffer from poor selectivity. In order to solve this selectivity issue and identify unknown reducing gas species (CO, CH{sub 4}, and C{sub 3}H{sub 8}) and concentrations, a high temperature resistive sensor array data set was built in this study based on 5 reported sensors. Since each sensor showed specific responses towards different types of reducing gas at given concentrations, calibration curves were fitted to provide a benchmark sensor array response database. A Bayesian inference framework was then utilized to process the sensor array data and build a sample selection program to simultaneously identify gas species and concentration, by formulating a proper likelihood between the measured sensor array response pattern of an unknown gas and each sampled sensor array response pattern in the benchmark database. This algorithm shows good robustness and can accurately identify gas species and predict gas concentration with a small error of less than 10% based on a limited amount of experimental data. These features indicate that the Bayesian probabilistic approach is a simple and efficient way to process sensor array data, which can significantly reduce the required computational overhead and training data.
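The Bayesian identification step can be sketched as follows: given benchmark (calibration) response patterns, score a measured pattern with a Gaussian likelihood and normalize over the candidate (gas, concentration) pairs. The calibration numbers and noise level here are invented for illustration, not the paper's data.

```python
import math

# Hypothetical calibration: mean sensor-array response for each (gas, ppm) pair.
CALIBRATION = {
    ("CO", 100): [0.8, 0.2, 0.5], ("CO", 500): [1.6, 0.4, 1.0],
    ("CH4", 100): [0.3, 0.9, 0.4], ("CH4", 500): [0.6, 1.8, 0.8],
}

def posterior(measured, noise_sigma=0.1):
    """Posterior over (gas, concentration): flat prior, independent
    Gaussian measurement noise on each sensor channel."""
    loglike = {
        key: -sum((m - r) ** 2 for m, r in zip(measured, ref)) / (2 * noise_sigma ** 2)
        for key, ref in CALIBRATION.items()
    }
    norm = max(loglike.values())  # subtract max for numerical stability
    weights = {k: math.exp(v - norm) for k, v in loglike.items()}
    z = sum(weights.values())
    return {k: w / z for k, w in weights.items()}

post = posterior([0.82, 0.21, 0.49])  # a noisy measurement of an unknown gas
print(max(post, key=post.get))  # the closest calibration entry wins
```

Real concentrations fall between calibration points, so the paper's program also samples along the fitted calibration curves rather than over a coarse grid.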
Barradas, N. P.; Alves, E.; Siketic, Z.; Radovic, I. Bogdanovic
2009-03-10
The accuracy of ion beam analysis experiments depends critically on the stopping power values available. While for H and He ions accuracies normally better than 5% are achieved by the usual interpolative schemes such as SRIM, for heavier ions the accuracy is worse. One of the main reasons is that the experimental databases are very sparse, even for important materials such as Si. New measurements are therefore needed. Measurement of stopping power is often made by transmission through thin films, with the usual problems of film thickness homogeneity. We have previously developed an alternative method based on measuring bulk spectra and fitting the yield by treating the stopping power as a fit parameter in a Bayesian inference Markov chain Monte Carlo procedure included in the standard IBA code NDF. We report on improvements of the method and on its application to the determination of the stopping power of {sup 7}Li in Si. To validate the method, we also apply it to the stopping of {sup 4}He in Si, which is known with 2% accuracy.
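The fitting strategy, treating the stopping power as a parameter sampled by Markov chain Monte Carlo, can be illustrated with a toy random-walk Metropolis sampler. The one-parameter "yield model" below is a made-up stand-in, not the NDF physics.

```python
import math
import random

def log_post(scale, depths, yields, sigma=0.05):
    """Log-posterior for a single stopping-power scale factor under a
    flat prior; toy yield model y = 1/(scale * (1 + depth))."""
    return -sum((y - 1.0 / (scale * (1.0 + d))) ** 2
                for d, y in zip(depths, yields)) / (2 * sigma ** 2)

def metropolis(depths, yields, n=5000, step=0.05, seed=2):
    """Random-walk Metropolis sampler over the positive scale factor."""
    rng = random.Random(seed)
    x, lp = 1.0, log_post(1.0, depths, yields)
    chain = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        if prop > 0:
            lp_prop = log_post(prop, depths, yields)
            if math.log(rng.random()) < lp_prop - lp:  # accept/reject
                x, lp = prop, lp_prop
        chain.append(x)
    return chain

depths = [0.0, 0.5, 1.0]
true_scale = 1.2
data = [1.0 / (true_scale * (1.0 + d)) for d in depths]  # noiseless synthetic data
chain = metropolis(depths, data)
est = sum(chain[1000:]) / len(chain[1000:])  # posterior mean after burn-in
print(round(est, 2))  # should land near the true scale factor 1.2
```

The posterior spread of the chain, not just its mean, is what gives the stopping-power determination its uncertainty estimate.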
Advanced Technology Vehicle Testing
James Francfort
2004-06-01
The goal of the U.S. Department of Energy's Advanced Vehicle Testing Activity (AVTA) is to increase the body of knowledge as well as the awareness and acceptance of electric drive and other advanced technology vehicles (ATV). The AVTA accomplishes this goal by testing ATVs on test tracks and dynamometers (Baseline Performance testing), as well as in real-world applications (Fleet and Accelerated Reliability testing and public demonstrations). This enables the AVTA to provide Federal and private fleet managers, as well as other potential ATV users, with accurate and unbiased information on vehicle performance and infrastructure needs so they can make informed decisions about acquiring and operating ATVs. The ATVs currently in testing include vehicles that burn gaseous hydrogen (H2) fuel and hydrogen/CNG (H/CNG) blended fuels in internal combustion engines (ICE), and hybrid electric (HEV), urban electric, and neighborhood electric vehicles. The AVTA is part of DOE's FreedomCAR and Vehicle Technologies Program.
Testing of the structural evaluation test unit
Ammerman, D.J.; Bobbe, J.G.
1995-12-31
In the evaluation of the safety of radioactive material transportation it is important to consider the response of Type B packages to environments more severe than those prescribed by the hypothetical accident sequence in Title 10 Part 71 of the Code of Federal Regulations (NRC 1995). The impact event in this sequence is a 9-meter drop onto an essentially unyielding target, resulting in an impact velocity of 13.4 m/s. The behavior of packages subjected to impacts more severe than this is not well known. It is the purpose of this program to evaluate the structural response of a test package to these environments. Several types of structural response are considered. Of primary importance is the behavior of the package containment boundary, including the bolted closure and O-rings. Other areas of concern are the loss of shielding capability due to lead slump and the deceleration loading of the package contents, which may cause damage to them. This type of information is essential for conducting accurate risk assessments on the transportation of radioactive materials. Currently, very conservative estimates of the loss of package protection are used in these assessments. This paper summarizes the results of a regulatory impact test and three extra-regulatory impact tests on a sample package.
Pickett, Patrick T.
1981-01-01
A hollow fitting for use in gas spectrometry leak testing of conduit joints is divided into two generally symmetrical halves along the axis of the conduit. A clip may quickly and easily fasten and unfasten the halves around the conduit joint under test. Each end of the fitting is sealable with a yieldable material, such as a piece of foam rubber. An orifice is provided in a wall of the fitting for the insertion or detection of helium during testing. One half of the fitting also may be employed to test joints mounted against a surface.
Central Receiver Test Facility
Sensitivity testing and analysis
Neyer, B.T.
1991-01-01
New methods of sensitivity testing and analysis are proposed. The new test method utilizes Maximum Likelihood Estimates to pick the next test level in order to maximize knowledge of both the mean, {mu}, and the standard deviation, {sigma}, of the population. Simulation results demonstrate that this new test provides better estimators (less bias and smaller variance) of both {mu} and {sigma} than the other commonly used tests (Probit, Bruceton, Robbins-Monro, Langlie). A new method of analyzing sensitivity tests is also proposed. It uses the Likelihood Ratio Test to compute regions of arbitrary confidence. It can calculate confidence regions for {mu}, {sigma}, and arbitrary percentiles. Unlike presently used methods, such as the program ASENT, which is based on the Cramer-Rao theorem, it can analyze the results of all sensitivity tests, and it does not significantly underestimate the size of the confidence regions. The new test and analysis methods will be explained and compared to the presently used methods. 19 refs., 12 figs.
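For contrast with the proposed Maximum Likelihood design, the classical Bruceton (up-and-down) test named above is easy to simulate. This sketch assumes a normally distributed threshold population with hypothetical parameters; it illustrates the baseline method, not Neyer's ML procedure.

```python
import random

def bruceton(n_trials, start, step, threshold_sampler, seed=3):
    """Classical up-and-down (Bruceton) sensitivity test: after a 'go'
    (the item fires) step the stimulus down, after a 'no-go' step up.
    Returns the tested levels; their mean crudely estimates {mu}."""
    rng = random.Random(seed)
    level, levels = start, []
    for _ in range(n_trials):
        levels.append(level)
        go = threshold_sampler(rng) <= level  # fires if threshold below stimulus
        level += -step if go else step
    return levels

# Hypothetical threshold population: Normal(mu=10, sigma=1)
levels = bruceton(200, start=8.0, step=0.5,
                  threshold_sampler=lambda rng: rng.gauss(10.0, 1.0))
estimate = sum(levels[20:]) / len(levels[20:])  # discard the run-in
print(round(estimate, 1))  # should land near the population mean of 10
```

The up-down scheme concentrates trials near the 50% point, which is why it estimates {mu} well but, as the abstract notes, gives poorer estimates of {sigma} than a likelihood-based design.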
Performance testing accountability measurements
Oldham, R.D.; Mitchell, W.G.; Spaletto, M.I.
1993-12-31
The New Brunswick Laboratory (NBL) provides assessment support to the DOE Operations Offices in the area of Material Control and Accountability (MC and A). During surveys of facilities, the Operations Offices have begun to request from NBL either assistance in providing materials for performance testing of accountability measurements or both materials and personnel to do performance testing. To meet these needs, NBL has developed measurement and measurement control performance test procedures and materials. The present NBL repertoire of performance tests includes the following: (1) mass measurement performance testing procedures using calibrated and traceable test weights, (2) uranium elemental concentration (assay) measurement performance tests which use ampulated solutions of normal uranyl nitrate containing approximately 7 milligrams of uranium per gram of solution, and (3) uranium isotopic measurement performance tests which use ampulated uranyl nitrate solutions with enrichments ranging from 4% to 90% U-235. The preparation, characterization, and packaging of the uranium isotopic and assay performance test materials were done in cooperation with the NBL Safeguards Measurements Evaluation Program, since these materials can be used for both purposes.
Financial statistics of major US publicly owned electric utilities 1992
Not Available
1994-01-01
The 1992 edition of the Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 4 years (1989 through 1992) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Four years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, and operating revenue, and electric energy account data. The primary source of publicly owned financial data is the Form EIA-412, “Annual Report of Public Electric Utilities.” Public electric utilities file this survey on a fiscal year, rather than a calendar year basis, in conformance with their recordkeeping practices. In previous editions of this publication, data were aggregated by the two most commonly reported fiscal years, June 30 and December 31. This omitted approximately 20 percent of the respondents who operate on fiscal years ending in other months. Accordingly, the EIA undertook a review of the Form EIA-412 submissions to determine if alternative classifications of publicly owned electric utilities would permit the inclusion of all respondents.
Testing cosmic geometry without dynamic distortions using voids
Hamaus, Nico; Sutter, P. M.; Lavaux, Guilhem; Wandelt, Benjamin D.
2014-12-01
We propose a novel technique to probe the expansion history of the Universe based on the clustering statistics of cosmic voids. In particular, we compute their two-point statistics in redshift space on the basis of realistic mock galaxy catalogs and apply the Alcock-Paczynski test. In contrast to galaxies, we find void auto-correlations to be marginally affected by peculiar motions, providing a model-independent measure of cosmological parameters without systematics from redshift-space distortions. Because only galaxy-galaxy and void-galaxy correlations have been considered in these types of studies before, the presented method improves both statistical and systematic uncertainties on the product of angular diameter distance and Hubble rate, furnishing the potentially cleanest probe of cosmic geometry available to date.
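The Alcock-Paczynski test described above constrains the product of angular diameter distance and Hubble rate. A minimal sketch of that observable, F_AP(z) = (1+z) D_A(z) H(z)/c, for an assumed flat ΛCDM cosmology; Ω_m = 0.3 and the step count are illustrative choices, not the paper's fitted values.

```python
import math

def ap_parameter(z, omega_m=0.3, n_steps=10000):
    """Alcock-Paczynski parameter F_AP(z) = (1+z) D_A(z) H(z) / c for a
    flat LambdaCDM cosmology, in units where c/H0 = 1. Objects with no
    preferred orientation, such as stacked voids, constrain this product."""
    def E(zz):  # dimensionless expansion rate H(z)/H0
        return math.sqrt(omega_m * (1 + zz) ** 3 + (1 - omega_m))
    # comoving distance chi by trapezoidal integration of dz/E(z)
    dz = z / n_steps
    chi = sum(dz / E(i * dz) for i in range(1, n_steps)) + dz * (1 / E(0) + 1 / E(z)) / 2
    d_a = chi / (1 + z)          # angular diameter distance (flat universe)
    return (1 + z) * d_a * E(z)  # F_AP = (1+z) * D_A * H / c = chi * E

print(ap_parameter(0.5) < ap_parameter(1.0))  # F_AP grows with redshift
```

Measuring the apparent ellipticity of an intrinsically isotropic void stack at several redshifts pins down F_AP(z), and hence the cosmology, without standard candles.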
Praeg, Walter F.
1986-01-01
An assembly is provided for testing one or more contact material samples in a vacuum environment. The samples are positioned as an inner conductive cylinder assembly which is mounted for reciprocal vertical motion as well as deflection from a vertical axis. An outer conductive cylinder is coaxially positioned around the inner cylinder and test specimen to provide a vacuum enclosure therefor. A power source needed to drive test currents through the test specimens is connected to the bottom of each conductive cylinder, through two specially formed conductive plates. The plates are similar in form, having a plurality of equal resistance current paths connecting the power source to a central connecting ring. The connecting rings are secured to the bottom of the inner conductive assembly and the outer cylinder, respectively. A hydraulic actuator is also connected to the bottom of the inner conductor assembly to adjust the pressure applied to the test specimens during testing. The test assembly controls magnetic forces such that the current distribution through the test samples is symmetrical and that contact pressure is not reduced or otherwise disturbed.
Prematurely terminated slug tests
Karasaki, K.
1990-07-01
A solution for the well response to a prematurely terminated slug test (PTST) is presented. The advantages of a PTST over conventional slug tests are discussed. A systematized PTST procedure is proposed, in which a slug test is terminated partway through the flow period and the subsequent shut-in data are recorded and analyzed. This method requires a downhole shut-in device and a pressure transducer, which is no more than what conventional deep-well slug testing requires. As opposed to slug tests, which are ineffective when a skin is present, a more accurate estimate of formation permeability can be made using a PTST. Premature termination also shortens the test duration considerably. Because in most cases no more information is gained by completing a slug test to the end, the author recommends that conventional slug tests be replaced by the premature termination technique. This study is part of an investigation of the feasibility of geologic isolation of nuclear wastes being carried out by the US Department of Energy and the National Cooperative for the Storage of Radioactive Waste of Switzerland.
Miley, Don
2013-05-28
The Advanced Test Reactor at Idaho National Laboratory is the foremost nuclear materials test reactor in the world. This virtual tour describes the reactor, how experiments are conducted, and how spent nuclear fuel is handled and stored. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.
Performance demonstration tests for eddy current inspection of steam generator tubing
Kurtz, R.J.; Heasler, P.G.; Anderson, C.M.
1996-05-01
This report describes the methodology and results for development of performance demonstration tests for eddy current testing (ET) of steam generator tubes. Statistical test design principles were used to develop the performance demonstration tests. Thresholds on ET system inspection performance were selected to ensure that field inspection systems would have a high probability of detecting and correctly sizing tube degradation. The technical basis for the ET system performance thresholds is presented in detail. Statistical test design calculations for probability of detection and flaw sizing tests are described. A recommended performance demonstration test based on the design calculations is presented. A computer program for grading the probability of detection portion of the performance demonstration test is given.
Hay, R.G.
1982-01-01
The Kauai Test Facility (KTF) is a Department of Energy rocket launch facility operated by Sandia National Laboratories. It was originally constructed in support of the high-altitude atmospheric nuclear test phase of Operation Dominic in the early 1960s. Later, the facility went through extensive improvement and modernization to become an integral part of the Safeguard C readiness to resume nuclear testing program. Since its inception and build-up in the decade of the sixties and the subsequent upgrades of the seventies, range test activities have shifted from full-scale tests to an emphasis on research and development of materials and components, and to making high-altitude scientific measurements. The facility is intended primarily to support development programs at the DOE weapons laboratories; however, other organizations may make use of the facility on a non-interference basis. The physical components at KTF and their operation are described.
Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David
2015-01-27
An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.
Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David
2014-07-08
An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.
Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David
2015-02-24
An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.
Spectral Identification Inference Engine
Energy Science and Technology Software Center (OSTI)
2004-07-27
The software interprets spectra (mass spectra, ion mobility spectra, etc.) using a method that mimics how an expert human analyst would perform the interpretation. Because spectra can be described linguistically (e.g., peak X must be large and peak Y must be small), their description can be reduced to rules using fuzzy logic. Therefore, a fuzzy logic rule base can be applied to interpreting the spectra. The fuzzy logic rule base is also easy for the user to understand and, therefore, easy to check and verify for accuracy.
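A minimal sketch of how such a linguistic rule might be encoded with fuzzy logic. The membership functions, thresholds, and the minimum-style fuzzy AND below are illustrative assumptions, not the software's actual rule base:

```python
def membership_large(intensity, low=0.2, high=0.8):
    """Piecewise-linear 'large' membership: 0 below low, 1 above high."""
    if intensity <= low:
        return 0.0
    if intensity >= high:
        return 1.0
    return (intensity - low) / (high - low)

def membership_small(intensity, low=0.2, high=0.8):
    """'Small' is taken as the complement of 'large'."""
    return 1.0 - membership_large(intensity, low, high)

def rule_score(peak_x, peak_y):
    """Fuzzy AND (minimum) of 'peak X is large' and 'peak Y is small'."""
    return min(membership_large(peak_x), membership_small(peak_y))

# A spectrum with a strong peak X and a weak peak Y fires the rule strongly.
print(rule_score(0.9, 0.1))  # -> 1.0
print(rule_score(0.5, 0.5))  # -> 0.5
```

Because each rule reads almost like the analyst's sentence, the rule base stays easy to inspect and verify.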
Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2012-10-01
As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR) tests, have been defined to support development and qualification of fuel design, fabrication processes, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TCs) embedded in graphite blocks, enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in a model calibration effort to reduce the inherent uncertainty of simulation results. This paper focuses on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends observed in the measured data during statistical analysis may be caused either by measuring-instrument deterioration or by physical mechanisms in the capsules that shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5, revealed by the capsule metrology measurements in PIE, helps justify attributing the reduction in TC readings to something other than TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, thus helping to increase confidence in, or in other words reduce the uncertainties of, the thermal simulation results for the AGR-1 test.
TEST PROCEDURE VALIDATION TEST OF A DISCRIMINATING TRITIUM MONITOR...
Office of Environmental Management (EM)
Violations of the ceiling principle: Exact conditions and statistical evidence
Slimowitz, J.R.; Cohen, J.E.
1993-08-01
The National Research Council recommended the use of the ceiling principle in forensic applications of DNA testing on the grounds that the ceiling principle was believed to be "conservative," giving estimates greater than or equal to the actual genotype frequencies in the appropriate reference population. The authors show here that the ceiling principle can fail to be conservative in a population with two subpopulations and two loci, each with two alleles at Hardy-Weinberg equilibrium, if there is some linkage disequilibrium between loci. They also show that the ceiling principle can fail in a population with two subpopulations and a single locus with two alleles if Hardy-Weinberg equilibrium does not hold. They give explicit analytical formulas to describe when the ceiling principle fails. By showing that the ceiling principle is not always mathematically reliable, this analysis gives users of the ceiling principle the responsibility of demonstrating that it is conservative for the particular data with which it is used. Reanalysis of VNTR databases of the FBI provides compelling evidence of two-locus associations within three major ethnic groups (Caucasian, black, and Hispanic) in the United States, even though the loci tested are located on different chromosomes. Before the ceiling principle is implemented, more research should be done to determine whether it may be violated in practice. 19 refs., 5 tabs.
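A toy numerical example of the failure mode described above. The two subpopulations, haplotype frequencies, and equal-mixture assumption are invented for illustration and are not taken from the paper's analysis:

```python
# Two loci with alleles A/a and B/b; haplotype frequencies per subpopulation.
# Subpopulation 1 has complete linkage disequilibrium; subpopulation 2 is in
# linkage equilibrium. Both have allele frequencies of 0.5 at each locus.
hap_sub1 = {"AB": 0.5, "Ab": 0.0, "aB": 0.0, "ab": 0.5}
hap_sub2 = {"AB": 0.25, "Ab": 0.25, "aB": 0.25, "ab": 0.25}

def aabb_genotype_freq(hap):
    """Frequency of the AABB genotype under random union of haplotypes."""
    return hap["AB"] ** 2

# True genotype frequency in an equal mixture of the two subpopulations.
true_freq = 0.5 * aabb_genotype_freq(hap_sub1) + 0.5 * aabb_genotype_freq(hap_sub2)

# Ceiling-style estimate: take each allele's maximum frequency across the
# subpopulations and multiply, assuming independence within and between loci.
p_A = max(0.5, 0.5)
p_B = max(0.5, 0.5)
ceiling_estimate = (p_A ** 2) * (p_B ** 2)

print(true_freq)         # -> 0.15625
print(ceiling_estimate)  # -> 0.0625, smaller than the true frequency
```

With linkage disequilibrium in one subpopulation, the product-rule ceiling estimate understates the mixed-population genotype frequency, i.e., it is anti-conservative, which is the paper's point.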
Mathematical and statistical analysis of the effect of boron on yield parameters of wheat
Rawashdeh, Hamzeh; Sala, Florin; Boldea, Marius
2015-03-10
The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on the yield and yield parameters of wheat. The contribution of boron to the yield parameters is described by second-degree polynomial equations with high statistical confidence (p < 0.01; theoretical F < calculated F, according to the ANOVA test, for alpha = 0.05). Regression analysis, based on the R² values obtained, made it possible to evaluate the particular contribution of boron to the realization of each yield parameter. This contribution was lower for spike length (R² = 0.812) and thousand-seed weight (R² = 0.850) and higher for the number of spikelets (R² = 0.936) and the number of seeds per spike (R² = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to the process of flower fertilization is well known. With regard to productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R² = 0.868.
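The second-degree polynomial fit and R² computation described above can be sketched as follows. The boron doses and seed counts here are invented illustrative values, not the paper's data:

```python
import numpy as np

# Hypothetical data: boron dose (arbitrary units) vs. seeds per spike.
boron = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
seeds = np.array([30.0, 34.0, 37.0, 38.5, 38.0, 36.5])

# Second-degree polynomial fit, as used for the yield parameters.
coeffs = np.polyfit(boron, seeds, deg=2)
predicted = np.polyval(coeffs, boron)

# Coefficient of determination R^2.
ss_res = np.sum((seeds - predicted) ** 2)
ss_tot = np.sum((seeds - np.mean(seeds)) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(round(r_squared, 3))
```

A high R² for a given parameter, as reported for the number of seeds per spike, indicates that the quadratic in boron dose explains most of that parameter's variation.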
Westinghouse Test Stand Report
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Westinghouse Non-Proprietary Class 3. © 2014 Westinghouse Electric Company LLC. All Rights Reserved. MT-14-12: Westinghouse VERA Test Stand Zero Power Physics Test Simulations for the AP1000® PWR. Fausto Franceschini, Westinghouse Electric Company LLC; Andrew Godfrey, Oak Ridge National Laboratory; Joel Kulesza, Westinghouse Electric Company LLC; Robert Oelrich, Westinghouse Electric Company LLC. L3.AMA.VDT.P8.01 Milestone Report CASL-U-2014-0012-000, March 6, 2014.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
STAR Test Environment. These instructions describe how to set up the STAR environment independent of the production environment in order to test different installations in $OPTSTAR and $GROUP_DIR. If you want to modify those installations you will need access to the starofl account. To bypass the STAR environment login, edit your ~/.pdsf_setup file, changing STAR_LINUX_SETUP to "use_none", and start a new session. You should not see all the STAR environmental variables.
Energy Science and Technology Software Center (OSTI)
2003-01-01
The Robust Systems Test Framework (RSTF) provides a means of specifying and running test programs on various computation platforms. RSTF provides a level of specification above standard scripting languages. During a set of runs, standard timing information is collected. The RSTF specification can also gather job-specific information, and can include ways to classify test outcomes. All results and scripts can be stored into and retrieved from an SQL database for later data analysis. RSTF also provides operations for managing the script and result files, and for compiling applications and gathering compilation information such as optimization flags.
Statistical Mechanics of Prion Diseases (Journal Article)
Office of Scientific and Technical Information (OSTI)
We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the broad incubation time distribution observed in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale
Statistical behavior in deterministic quantum systems with few degrees of freedom
Jensen, R.V.; Shankar, R.
1985-04-29
Numerical studies of the dynamics of finite quantum spin chains are presented which show that quantum systems with few degrees of freedom (N = 7) can be described by equilibrium statistical mechanics. The success of the statistical description is seen to depend on the interplay between the initial state, the observable, and the Hamiltonian. This work clarifies the impact of integrability and conservation laws on statistical behavior. The relation to quantum chaos is also discussed.
Photon-number statistics of twin beams: Self-consistent measurement, reconstruction, and properties
Peřina, Jan Jr.; Haderka, Ondřej; Michálek, Václav
2014-12-04
A method for the determination of photon-number statistics of twin beams using the joint signal-idler photocount statistics obtained by an iCCD camera is described. It also provides the absolute quantum detection efficiency of the camera. Using the measured photocount statistics, quasi-distributions of integrated intensities are obtained. They attain negative values occurring in characteristic strips as a consequence of the pairing of photons in twin beams.
HEV America Baseline Test Sequence
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
HEV America Baseline Test Sequence, Revision 1, September 1, 2006. Prepared by Electric Transportation ... All Rights Reserved. HEV PERFORMANCE TEST ...
Galveston Test | Open Energy Information
Name: Galveston Test Facility. Sector: Wind energy. Facility Type: Offshore Wind. Facility Status: Proposed. Owner: Coastal Point...
Battery Technology Life Verification Testing and Analysis
Jon P. Christophersen; Gary L. Hunt; Ira Bloom; Ed Thomas; Vince Battaglia
2007-12-01
A critical component to the successful commercialization of batteries for automotive applications is accurate life prediction. The Technology Life Verification Test (TLVT) Manual was developed to project battery life with a high level of statistical confidence within only one or two years of accelerated aging. The validation effort that is presently underway has led to several improvements to the original methodology. For example, a newly developed reference performance test revealed a voltage path dependence effect on resistance for lithium-ion cells. The resistance growth seems to depend on how a target condition is reached (i.e., by a charge or a discharge). Second, the methodology for assessing the level of measurement uncertainty was improved using a propagation of errors in the fundamental measurements to the derived response (e.g., resistance). This new approach provides a more realistic assessment of measurement uncertainty. Third, the methodology for allocating batteries to the test matrix has been improved. The new methodology was developed to assign batteries to the matrix such that the average of each test group would be representative of the overall population. These changes to the TLVT methodology will help to more accurately predict a battery technology's life capability with a high degree of confidence.
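The propagation-of-errors idea for a derived response such as resistance (R = V/I) can be sketched with the standard first-order formula. The measurement values and uncertainties below are hypothetical, not TLVT data:

```python
import math

def resistance_uncertainty(v, sigma_v, i, sigma_i):
    """Propagate voltage and current uncertainty to the derived R = V/I.

    First-order (partial-derivative) propagation:
    sigma_R^2 = (dR/dV)^2 * sigma_V^2 + (dR/dI)^2 * sigma_I^2
    """
    r = v / i
    dR_dV = 1.0 / i
    dR_dI = -v / i ** 2
    sigma_r = math.sqrt((dR_dV * sigma_v) ** 2 + (dR_dI * sigma_i) ** 2)
    return r, sigma_r

# Hypothetical pulse measurement: 3.6 V +/- 1 mV, 10 A +/- 5 mA.
r, sigma_r = resistance_uncertainty(3.6, 0.001, 10.0, 0.005)
print(r)        # -> 0.36 ohm
print(sigma_r)  # combined standard uncertainty in ohms
```

Working from the uncertainties of the fundamental measurements, rather than assigning an uncertainty to the derived resistance directly, is what gives the more realistic assessment described above.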
Wind Technology Testing Center Acquires New Blade Fatigue Test...
Wind Technology Testing Center Acquires New Blade Fatigue Test System. August 1, 2013 - 4:33pm. This is an ...
Request for Information: Operation of Regional Test Center Test...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Request for Information: Operation of Regional Test Center Test Bed Located at SolarTAC. Solicitation...
User Statistics Collection Practices | U.S. DOE Office of Science...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The Office of Science upholds a set of core principles regarding user statistics ...
High Statistics Study of Nearby Type 1a Supernovae. QUEST Camera...
Office of Scientific and Technical Information (OSTI)
Baltay, Charles. 79 ASTRONOMY AND ASTROPHYSICS. Study of Type 1a Supernovae...
On the ability of Order Statistics to distinguish different models for continuum gamma decay
Sandoval, J. J.; Cristancho, F.
2007-10-26
A simulation procedure to calculate some parameters important to the application of Order Statistics in the analysis of continuum gamma decay is presented.
Statistical Design of Experiment for Li-ion Cell Formation Parameters...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Design of Experiment for Li-ion Cell Formation Parameters using Gen3 Electrode Materials: Final Summary ...
Zhao, Yang; Mace, Gerald G.; Comstock, Jennifer M.
2011-06-01
To better understand the role of small particles in the microphysical processes and radiative properties of cirrus, the reliability of historical in-situ data must be understood. Recent studies call into question the validity of those data because of shattering of large crystals on probe and aircraft surfaces, thereby artificially amplifying the concentration of crystals smaller than approximately 50 µm. We contend that the general character of the in-situ measurements must be consistent, in a broad sense, with statistics derived from long-term remote sensing data. To examine this consistency, an algorithm using Doppler radar moments and Raman lidar extinction is developed to retrieve a bimodal particle size distribution and its uncertainty. Using case studies and statistics compiled over one year, we show that the existence of high concentrations (> 1 cm⁻³) of small (sub-50 µm) particles in cirrus is not consistent with any reasonable interpretation of the remote sensing data. We conclude that the high concentrations of small particles found in many aircraft data sets are therefore likely an artifact of the in situ measurement process.
Energy Science and Technology Software Center (OSTI)
2007-08-22
The pamtest utility calls the normal PAM hooks using a service and username supplied on the command line. This allows an administrator to test any one of many configured PAM stacks as any existing user on the machine.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
of the pump's operating performance including lift, discharge pressure, power input, and water flow. The results of the pump test provide a value for the overall efficiency of the...
Project W-320, combined pump winch assembly test - Test report
Bellomy, J.R., Westinghouse Hanford
1996-05-15
Test report documenting results of the Project W-320 combined pump/winch test performed at Lawrence Pumps.
Steimke, J.L.
2001-07-10
A full-scale, transparent replica of a GeoSiphon was constructed in the TFL to test a new concept, using a solar powered vacuum pump to remove accumulated gases from the air chamber. It did not have a treatment cell containing iron filings as do the actual TNX GeoSiphons in the field, but it was accurate in all other respects. The gas generation that is observed in an actual GeoSiphon was simulated by air injection at the inlet of the TFL GeoSiphon. After facility shakedown, three stages of testing were conducted: verification testing, parametric testing and long term testing. In verification testing, the TFL GeoSiphon was used to reproduce a particular test at TNX in which the water flowrate decreased gradually as the result of air accumulation at the crest of a siphon without an air chamber. For this test the vacuum pump was not used and the air chamber was initially filled with air rather than water. Agreement between data from the TNX GeoSiphon and the TFL GeoSiphon was good, which gave confidence that the TFL GeoSiphon was a good hydraulic representation of the TNX GeoSiphon. For the remaining tests, the solar powered vacuum pump and air chamber were used. In parametric testing, steady state runs were made for water flowrates ranging from 1 gpm to 19 gpm, air injection rates ranging from 0 to 77 standard cc/min and outfall line angles ranging from vertical to 60 degrees from vertical. In all cases, the air chamber and vacuum pump removed nearly all of the air and the GeoSiphon operated without problems. In long term testing, the GeoSiphon was allowed to run continuously for 21 days at one set of conditions. During this time the solar cell kept the storage battery fully charged at all times and the control circuit for the vacuum pump operated reliably. The solar panel was observed to have a large excess capacity when used with the vacuum pump. 
With two changes, the concept of using a solar powered vacuum pump attached to an air chamber should be ready for long term use in the field. Those changes are to insulate the air chamber of the GeoSiphon so it will not freeze in the winter and to make the tank from steel rather than transparent plastic.
Central Receiver Test Facility
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
McAdams, Wm.A.; Foss, M.H.
1958-08-12
A method of testing containers for leaks is described, particularly the testing of containers or cans in which the uranium slugs for nuclear reactors are jacketed. This method involves the immersion of the can in water under 150 pounds of pressure, then removing, drying, and coating the can with anhydrous copper sulfate. Any water absorbed by the can under pressure will exude and discolor the copper sulfate in the area about the leak.
Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.
2009-12-17
In September 2008 a large-scale testing operation (referred to as the INL-2 test) was performed within a two-story building (PBF-632) at the Idaho National Laboratory (INL). The report “Operational Observations on the INL-2 Experiment” defines the seven objectives for this test and discusses the results and conclusions; this is further discussed in the introduction of this report. The INL-2 test consisted of five tests (events) in which a floor (level) of the building was contaminated with the harmless biological warfare agent simulant Bg and samples were taken in most, if not all, of the rooms on the contaminated floor. After the sampling, the building was decontaminated, and the next test was performed. Judgmental samples and probabilistic samples were determined and taken during each test. Vacuum, wipe, and swab samples were taken within each room. The purpose of this report is to study an additional four topics that were not within the scope of the original report. These topics are: 1) assess the quantitative assumptions about the data being normally or log-normally distributed; 2) evaluate differences and quantify the sample-to-sample variability within a room and across the rooms; 3) perform geostatistical types of analyses to study spatial correlations; and 4) quantify the differences observed between surface types and sampling methods for each scenario and study the consistency across the scenarios. The following four paragraphs summarize the results of each of the four additional analyses. All samples after decontamination came back negative. Because of this, it was not appropriate to determine if these clearance samples were normally distributed. As Table 1 shows, the characterization data consist of values between and inclusive of 0 and 100 CFU/cm² (100 was the value assigned when the count was too numerous to count). The 100 values are generally much larger than the rest of the data, causing the data to be right-skewed.
There are also a significant number of zeros. QQ plots of these data show a lack of normality after contamination. Normality is improved when looking at log(CFU/cm²). Variance component analysis (VCA) and analysis of variance (ANOVA) were used to estimate the amount of variance due to each source and to determine which sources of variability were statistically significant. In general, the sampling methods interacted with the across-event variability and with the across-room variability. For this reason, it was decided to do the analyses for each sampling method individually. The between-event variability and between-room variability were significant for each method, except for the between-event variability for the swabs. For both the wipes and vacuums, the within-room standard deviation was much larger (26.9 for wipes and 7.086 for vacuums) than the between-event standard deviation (6.552 for wipes and 1.348 for vacuums) and the between-room standard deviation (6.783 for wipes and 1.040 for vacuums). The swabs' between-room standard deviation was 0.151, while both the within-room and between-event standard deviations were less than 0.10 (all measurements in CFU/cm²).
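A balanced one-way variance component estimate of the kind described (between-room vs. within-room) can be sketched as follows; the readings below are invented for illustration, not INL-2 data:

```python
import statistics

# Hypothetical CFU/cm^2 readings: 3 rooms, 4 samples each (balanced design).
rooms = [
    [5.0, 6.0, 4.5, 5.5],
    [9.0, 8.5, 9.5, 9.0],
    [2.0, 2.5, 1.5, 2.0],
]
n = len(rooms[0])  # samples per room
k = len(rooms)     # number of rooms
grand_mean = statistics.mean(x for room in rooms for x in room)
room_means = [statistics.mean(room) for room in rooms]

# One-way ANOVA mean squares.
ms_between = n * sum((m - grand_mean) ** 2 for m in room_means) / (k - 1)
ms_within = sum(
    (x - m) ** 2 for room, m in zip(rooms, room_means) for x in room
) / (k * (n - 1))

# Method-of-moments variance component estimates.
var_within = ms_within
var_between = max(0.0, (ms_between - ms_within) / n)
print(var_within, var_between)
```

Comparing the two components, as the report does per sampling method, shows whether room-to-room differences or sample-to-sample scatter within a room dominates the total variability.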
Test report - caustic addition system operability test procedure
Parazin, R.E.
1995-10-13
This Operability Test Report documents the results of test procedure WHC-SD-WM-OTP-167, ``Caustic Addition System Operability Test Procedure``. The objective of the test was to verify the operability of the 241-AN-107 Caustic Addition System. The objective was met.
Investigation of statistical iterative reconstruction for dedicated breast CT
Makeev, Andrey; Glick, Stephen J.
2013-08-15
Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: the hyperbolic potential and the anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 µm microcalcification spheres embedded in it was used to model the attenuation properties of an uncompressed breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge-preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low-noise images with high-contrast microcalcifications preserved.
In terms of the numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor of greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance across various radiation doses. In fact, using PML with certain parameter values results in better images, acquired using a 2 mGy dose, than FBP-reconstructed images acquired using a 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task in dedicated breast CT. The reported values can be used as starting values of the free parameters when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with an optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with a lower radiation dose to the patient than using FBP with a higher dose.
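One common form of the edge-preserving hyperbolic potential of the kind tuned in this study can be sketched as follows; the exact parameterization used by the authors may differ:

```python
import math

def hyperbolic_potential(t, delta):
    """Edge-preserving hyperbolic roughness potential.

    Behaves quadratically for |t| << delta (smoothing small noise
    differences between neighboring voxels) and linearly for |t| >> delta
    (so large true edges are penalized less severely than a quadratic
    would). delta plays the role of the edge-preservation threshold.
    """
    return delta ** 2 * (math.sqrt(1.0 + (t / delta) ** 2) - 1.0)

delta = 1.0
# Near zero the penalty is approximately t^2 / 2; far from zero it grows
# roughly like |t|, which is what preserves high-contrast features.
print(hyperbolic_potential(0.01, delta))  # ~ 5e-5 (close to 0.01**2 / 2)
print(hyperbolic_potential(100.0, delta))  # ~ 99 (close to |t| - delta)
```

The penalty weight and delta are exactly the two free parameters whose tuning range the study reports for the microcalcification detection task.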