Multivariate Statistical Analysis Strategies for Hyperspectral...
Office of Scientific and Technical Information (OSTI)
Related Information: Proposed for the ... Union of Microbeam Analysis Societies meeting held May 26, 2011, in Seoul, South Korea.
Classification of Malaysia aromatic rice using multivariate statistical analysis
Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.
2015-05-15
Aromatic rice (Oryza sativa L.) is considered the best-quality premium rice. Consumers prefer these varieties for criteria such as shape, colour, distinctive aroma, and flavour. Aromatic rice commands a higher price than ordinary rice because of the special growth conditions it requires, such as a specific climate and soil. Presently, aromatic rice quality is identified from key elemental and isotopic variables. The rice can also be classified by Gas Chromatography-Mass Spectrometry (GC-MS) or by human sensory panels. However, human sensory panels have significant drawbacks: they require lengthy training, are prone to fatigue as the number of samples increases, and can be inconsistent. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house-developed Electronic Nose (e-nose) to classify new aromatic rice varieties based on the odour of samples taken from each variety. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify unspecified samples. Visual inspection of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters, and the low misclassification error of the LDA and KNN results supports this finding. We may therefore conclude that the e-nose was successfully applied to the classification of aromatic rice varieties.
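The KNN-with-LOO scheme described above can be sketched in a few lines. The sensor data below is synthetic: three hypothetical varieties with six invented "sensor response" features stand in for the paper's e-nose measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical e-nose data: 3 rice varieties, 10 samples each,
# 6 sensor responses per sample (all values are illustrative only).
centers = np.array([[1.0, 0.2, 0.5, 0.8, 0.1, 0.4],
                    [0.3, 0.9, 0.4, 0.2, 0.7, 0.6],
                    [0.6, 0.5, 1.0, 0.4, 0.3, 0.9]])
X = np.vstack([c + 0.05 * rng.standard_normal((10, 6)) for c in centers])
y = np.repeat([0, 1, 2], 10)

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Leave-One-Out validation: hold out each sample in turn.
errors = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    if knn_predict(X[mask], y[mask], X[i]) != y[i]:
        errors += 1
loo_error = errors / len(y)
print(f"LOO misclassification rate: {loo_error:.2f}")
```

With well-separated clusters, the LOO misclassification rate is near zero, mirroring the "low misclassification error" reported in the abstract.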
Broader source: Energy.gov [DOE]
Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico
Characterization of Nuclear Fuel using Multivariate Statistical Analysis
Robel, M; Robel, M; Robel, M; Kristo, M J; Kristo, M J
2007-11-27
Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) to a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of {sup 234}U and {sup 236}U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
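The 9-dimensional-to-3-dimensional reduction via PCA can be sketched with synthetic data; the 50 "fuel samples" and their latent structure below are invented stand-ins, not the paper's isotopic data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 50 fuel samples with 9 isotopic concentrations
# driven by 3 latent factors (e.g. burnup, initial enrichment).
latent = rng.standard_normal((50, 3))
loadings = rng.standard_normal((3, 9))
X = latent @ loadings + 0.01 * rng.standard_normal((50, 9))

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores3 = U[:, :3] * s[:3]          # 3-D approximation for visualization
explained = (s[:3] ** 2).sum() / (s ** 2).sum()
print(f"Variance captured by 3 PCs: {explained:.3f}")
```

When the data truly has three underlying degrees of freedom, the first three components capture nearly all of the variance, which is what makes the 3-D visual representation faithful.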
Beyth, M.; McInteer, C.; Broxton, D.E.; Bolivar, S.L.; Luke, M.E.
1980-06-01
Multivariate statistical analyses were carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Craig quadrangle, Colorado, to support the National Uranium Resource Evaluation and to evaluate strategic or other important commercial mineral resources. A few areas for favorable uranium mineralization are suggested for parts of the Wyoming Basin, Park Range, and Gore Range. Six potential source rocks for uranium are postulated based on factor score mapping. Vanadium in stream sediments is suggested as a pathfinder for carnotite-type mineralization. A probable northwest trend of lead-zinc-copper mineralization associated with Tertiary intrusions is suggested. A few locations are mapped where copper is associated with cobalt. Concentrations of placer sands containing rare earth elements, probably of commercial value, are indicated for parts of the Sand Wash Basin.
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.
2010-03-01
Abstract: Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
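Non-parametric distribution estimation of the kind used here can be illustrated with a kernel density estimate; the bimodal sample below is invented and merely stands in for the records returned by a query.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Stand-in for a query's solution records on one constrained variable:
# a bimodal population, as might arise from two physical regimes.
values = np.concatenate([rng.normal(0.0, 0.5, 500),
                         rng.normal(4.0, 0.5, 500)])

# Non-parametric estimate of the distribution over the query's solution.
kde = gaussian_kde(values)
grid = np.linspace(-2, 6, 200)
density = kde(grid)

# High-density regions flag where the constrained variable's behavior
# is statistically most important to the query.
peak = grid[np.argmax(density)]
print(f"Dominant mode near {peak:.1f}")
```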
2012-12-31
This report evaluates the chemistry of seep water occurring in three desert drainages near Shiprock, New Mexico: Many Devils Wash, Salt Creek Wash, and Eagle Nest Arroyo. Through the use of geochemical plotting tools and multivariate statistical analysis techniques, analytical results of samples collected from the three drainages are compared with the groundwater chemistry at a former uranium mill in the Shiprock area (the Shiprock site), managed by the U.S. Department of Energy Office of Legacy Management. The objective of this study was to determine, based on the water chemistry of the samples, if statistically significant patterns or groupings are apparent between the sample populations and, if so, whether there are any reasonable explanations for those groupings.
Method of multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2004-01-06
A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
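The constrained alternating least-squares factorization D = CS^T can be sketched as follows. This is a simplified projected-ALS loop on simulated data (the matrix sizes, two components, and the clip-to-nonnegative step are illustrative choices, not the patented procedure).

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated spectral image: 100 pixels x 40 channels, 2 components.
C_true = rng.uniform(0, 1, (100, 2))          # concentration intensities
S_true = rng.uniform(0, 1, (40, 2))           # spectral shapes
D = C_true @ S_true.T + 0.001 * rng.standard_normal((100, 40))

# Alternating least squares for D = C S^T with a simple
# non-negativity constraint (clip after each unconstrained solve).
C = rng.uniform(0, 1, (100, 2))
for _ in range(200):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"Relative residual: {residual:.4f}")
```

Clipping after each solve is a crude surrogate for a true non-negative least-squares step, but it conveys the alternating structure of the method.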
Lloyd, K. G.
2007-07-15
Buried irregular interfaces and particulate present special challenges in terms of chemical analysis and identification, and are critical issues in the manufacture of electronic materials and devices. Cross sectioning at the right location is often difficult, and, while dual-beam scanning electron microscopy/focused ion beam instruments can often provide excellent visualization of buried defects, matching chemical analysis may be absent or problematic. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) depth profiling, with its ability to acquire spatially resolved depth profiles while collecting an entire mass spectrum at every 'voxel,' offers a way to revisit the problem of buried defects. Multivariate analysis of the overwhelming amount of data can reduce the output from essentially a depth profile at every mass to a small set of chemically meaningful factors. Data scaling is an important consideration in the application of these methods, and a comparison of scaling procedures is shown. Examples of ToF-SIMS depth profiles of relatively homogeneous layers, severely inhomogeneous layers, and buried particulate are discussed.
Advanced Multivariate Analysis Tools Applied to Surface Analysis...
Office of Scientific and Technical Information (OSTI)
Resource Relation: Conference: Proposed for presentation at the Microscopy and ... conference. Subject terms: superconductivity and superfluidity; microanalysis; microscopy; multivariate analysis.
Glick, D.C.; Davis, A.
1984-07-01
The multivariate statistical techniques of correlation coefficients, factor analysis, and cluster analysis, implemented by computer programs, can be used to process a large data set and produce a summary of relationships between variables and between samples. These techniques were used to find relationships in data on the inorganic constituents of US coals. Three hundred thirty-five whole-seam channel samples from six US coal provinces were analyzed for inorganic variables. After consideration of the attributes of data expressed on an ash basis and a whole-coal basis, it was decided to perform complete statistical analyses on both data sets. Thirty variables expressed on a whole-coal basis and twenty-six variables expressed on an ash basis were used. For each inorganic variable, a frequency distribution histogram and a set of summary statistics were produced. These were subdivided to reveal the manner in which concentrations of inorganic constituents vary between coal provinces and between coal regions. Data collected on 124 samples from three stratigraphic groups (Pottsville, Monongahela, Allegheny) in the Appalachian region were studied using analysis of variance to determine the degree of variability between stratigraphic levels. Most variables showed differences in mean values between the three groups. 193 references, 71 figures, 54 tables.
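Two of the techniques named above, correlation coefficients and cluster analysis, can be sketched together; the 30 "samples" and 5 "variables" below are invented stand-ins for the coal data, with two pairs of variables deliberately constructed to be correlated.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)

# Hypothetical data: 30 samples x 5 variables; variables 0-1 and 2-3
# are built to be strongly correlated, variable 4 is independent.
base = rng.standard_normal((30, 2))
X = np.column_stack([base[:, 0],
                     base[:, 0] + 0.1 * rng.standard_normal(30),
                     base[:, 1],
                     base[:, 1] + 0.1 * rng.standard_normal(30),
                     rng.standard_normal(30)])

# Correlation coefficients between variables.
R = np.corrcoef(X, rowvar=False)

# Cluster the variables using 1 - |r| as a distance (condensed form).
dist = 1 - np.abs(R[np.triu_indices(5, k=1)])
clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(R.round(2))
print(clusters)
```

The cluster labels recover the built-in structure: the two correlated pairs land in common groups, which is the kind of variable-relationship summary the abstract describes.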
Classical least squares multivariate spectral analysis
Haaland, David M.
2002-01-01
An improved classical least squares (CLS) multivariate spectral analysis method that adds, to the prediction phase, spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of the resulting prediction-augmented CLS (PACLS) method is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report April - June 2014 This report was...
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
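The core idea shared by these augmented-CLS abstracts, adding a spectral shape for an unmodeled source of variation to a least-squares model, can be shown with a toy example. The Gaussian band positions and concentrations below are invented, and for simplicity the interferent's shape is assumed known, whereas ACLS derives such shapes from calibration residuals.

```python
import numpy as np

x = np.linspace(0, 1, 100)
gauss = lambda mu, w: np.exp(-0.5 * ((x - mu) / w) ** 2)

# Two calibrated pure-component spectra plus one unmodeled interferent.
S = np.column_stack([gauss(0.3, 0.05), gauss(0.6, 0.05)])  # calibrated shapes
interferent = gauss(0.45, 0.05)

# A prediction sample: known components at 1.0 and 0.5, interferent at 0.8.
d = 1.0 * S[:, 0] + 0.5 * S[:, 1] + 0.8 * interferent

# Plain CLS prediction: biased, because the interferent is unmodeled.
c_plain = np.linalg.lstsq(S, d, rcond=None)[0]

# Augmented CLS: add the interferent's spectral shape to the model.
S_aug = np.column_stack([S, interferent])
c_aug = np.linalg.lstsq(S_aug, d, rcond=None)[0]

print("plain CLS:", c_plain.round(3), " augmented:", c_aug[:2].round(3))
```

The augmented model recovers the true concentrations exactly, while the plain CLS fit absorbs part of the interferent into the calibrated components, illustrating why the augmentation improves prediction when unmodeled spectral variation is present.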
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-07-26
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-01-11
Apparatus and system for multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2003-06-24
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
August 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report October - December 2015 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Report (Abbreviated) January - March 2016 June 2016
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
LM Stakeholder Interaction and External Communications June 2014 OVERVIEW The U.S. Department of Energy (DOE) Office of Legacy Management (LM) makes every effort to communicate with its stakeholders through public and small group meetings, conferences, briefings, news releases, telephone, e-mail, informational materials, and through the LM website. To assess the effectiveness of LM's communication with stakeholders across the nation, an analysis of stakeholder interaction was performed.
Ladd-Lively, Jennifer L
2014-01-01
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a natural uranium conversion plant (NUCP) producing 100 metric tons of uranium (MTU) per year and using the wet solvent extraction method for the purification of uranium ore concentrate. A key component of the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through singular value decomposition of a data matrix representing normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP; however, the approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. The approach can identify the key monitoring locations, as well as locations where monitoring is unimportant, and detection limits at the key monitoring locations can also be established. Several faulty scenarios were developed to test the monitoring framework after the base case (normal operating conditions) PCA model was established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall, this study met its stated objective.
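A PCA-based MSPC monitor of the kind described can be sketched with Hotelling's T² statistic. The six "process signals" and the injected fault below are synthetic stand-ins for the plant's mole-balance data, and the empirical 99% control limit is one common (though not the only) choice.

```python
import numpy as np

rng = np.random.default_rng(6)

# Normal-operation data: 200 time steps of 6 correlated process signals
# (a toy stand-in for unit mole balances in a conversion plant).
latent = rng.standard_normal((200, 2))
W = rng.standard_normal((2, 6))
X = latent @ W + 0.1 * rng.standard_normal((200, 6))

# Base-case PCA model via SVD of the centered, scaled data.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2                                  # retained components
P = Vt[:k].T                           # loadings
lam = (s[:k] ** 2) / (len(Z) - 1)      # score variances

def t2(x):
    """Hotelling's T^2 of one observation against the base-case model."""
    t = ((x - mu) / sd) @ P
    return float(np.sum(t ** 2 / lam))

limit = np.percentile([t2(x) for x in X], 99)   # empirical 99% limit

normal_obs = latent[0] @ W                       # consistent with the model
fault_obs = normal_obs + 8.0 * W[0]              # large process shift
print(f"T2 normal: {t2(normal_obs):.1f}, fault: {t2(fault_obs):.1f}, "
      f"limit: {limit:.1f}")
```

An observation shifted well away from the normal-operation correlation structure produces a T² far above the control limit, which is the detection mechanism the faulty-scenario tests exercise.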
Nonparametric Multivariate Anomaly Analysis in Support of HPC Resilience
Ostrouchov, George; Naughton, III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R; Scott, Stephen L
2009-01-01
Large-scale computing systems provide great potential for scientific exploration. However, the complexity that accompanies these enormous machines raises challenges for both users and operators. The effective use of such systems is often hampered by failures encountered when running applications on systems containing tens of thousands of nodes and hundreds of thousands of compute cores capable of yielding petaflops of performance. In systems of this size, failure detection is complicated and root-cause diagnosis difficult. This paper describes our recent work in the identification of anomalies in monitoring data and system logs to provide further insights into machine status, runtime behavior, failure modes, and failure root causes. It discusses the details of an initial prototype that gathers the data and uses statistical techniques for analysis.
Systematic wavelength selection for improved multivariate spectral analysis
Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.
1995-01-01
Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g., the concentration of an analyte such as glucose in blood, or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm, where the selection of wavelength subsets improves the fitness of the model's determination of the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
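The genetic-algorithm wavelength selection and the resulting "count spectrum" can be illustrated with a deliberately miniature GA. Everything here is invented for illustration (synthetic spectra, a validation-error fitness rather than F = f(cost, performance), no smoothing or thresholding steps); it only conveys the idea that selection frequency across a population identifies informative wavelengths.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "spectra": 60 samples x 20 wavelengths; only wavelengths
# 3, 8, and 15 actually carry the analyte signal (illustrative setup).
X = rng.standard_normal((60, 20))
y = 2 * X[:, 3] - X[:, 8] + 0.5 * X[:, 15] + 0.05 * rng.standard_normal(60)
train, val = slice(0, 40), slice(40, 60)

def fitness(mask):
    """Validation error of a least-squares model on the chosen subset."""
    if not mask.any():
        return np.inf
    coef = np.linalg.lstsq(X[train][:, mask], y[train], rcond=None)[0]
    return np.mean((X[val][:, mask] @ coef - y[val]) ** 2)

# A miniature genetic algorithm over binary wavelength-subset masks.
pop = rng.random((30, 20)) < 0.3
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[:10]]           # keep the fittest
    children = parents[rng.integers(0, 10, 30)].copy()
    children ^= rng.random((30, 20)) < 0.05          # mutate bits
    pop = children

# "Count spectrum": how often each wavelength survives selection.
counts = pop.sum(axis=0)
selected = np.where(counts > 15)[0]
print("frequently selected wavelengths:", selected)
```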
Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D
2008-01-01
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
Long, C.L.
1991-02-01
Multivariate calibration techniques can reduce the time required for routine testing and can provide new methods of analysis. Multivariate calibration is commonly used with near infrared reflectance analysis (NIRA) and Fourier transform infrared (FTIR) spectroscopy. Two feasibility studies were performed to determine the capability of NIRA, using multivariate calibration techniques, to perform analyses on the types of samples that are routinely analyzed at this laboratory. The first study performed included a variety of samples and indicated that NIRA would be well-suited to perform analyses on selected materials properties such as water content and hydroxyl number on polyol samples, epoxy content on epoxy resins, water content of desiccants, and the amine values of various amine cure agents. A second study was performed to assess the capability of NIRA to perform quantitative analysis of hydroxyl numbers and water contents of hydroxyl-containing materials. Hydroxyl number and water content were selected for determination because these tests are frequently run on polyol materials and the hydroxyl number determination is time consuming. This study pointed out the necessity of obtaining calibration standards identical to the samples being analyzed for each type of polyol or other material being analyzed. Multivariate calibration techniques are frequently used with FTIR data to determine the composition of a large variety of complex mixtures. A literature search indicated many applications of multivariate calibration to FTIR data. Areas identified where quantitation by FTIR would provide a new capability are quantitation of components in epoxy and silicone resins, polychlorinated biphenyls (PCBs) in oils, and additives to polymers. 19 refs., 15 figs., 6 tabs.
Reichardt, Thomas A.; Timlin, Jerilyn Ann; Jones, Howland D. T.; Sickafoose, Shane M.; Schmitt, Randal L.
2010-09-01
Laser-induced fluorescence measurements of cuvette-contained laser dye mixtures are made to evaluate multivariate analysis techniques in optically thick environments. Nine mixtures of Coumarin 500 and Rhodamine 610 are analyzed, as well as the pure dyes. For each sample, the cuvette is positioned on a two-axis translation stage to allow interrogation at different spatial locations, allowing the examination of both primary (absorption of the laser light) and secondary (absorption of the fluorescence) inner filter effects. In addition to these expected inner filter effects, we find evidence that a portion of the absorbed fluorescence is re-emitted. A total of 688 spectra are acquired for the evaluation of multivariate analysis approaches to account for nonlinear effects.
Independent Statistics & Analysis Drilling Productivity Report
U.S. Energy Information Administration (EIA) Indexed Site
Independent Statistics & Analysis Drilling Productivity Report. The seven regions analyzed in this report accounted for 92% of domestic oil production growth and all domestic natural gas production growth during 2011-14. August 2016. For key tight oil and shale gas regions. U.S. Energy Information Administration. Contents: Year-over-year summary; Bakken Region; Eagle Ford Region; Haynesville Region; Marcellus Region; Niobrara Region; Permian Region; Utica Region; Explanatory notes.
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R.
2007-10-16
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
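The factored, truncated representation at the heart of this spectral compression can be sketched with a rank-limited SVD; the unfolded 1000 x 100 "image" and its four underlying factors are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(9)

# A "multivariate image" unfolded to pixels x channels: 1000 pixels,
# 100 spectral channels, only 4 significant spectral factors.
A = rng.uniform(0, 1, (1000, 4)) @ rng.uniform(0, 1, (4, 100))
A += 0.01 * rng.standard_normal(A.shape)

# Factored representation from PCA/SVD, truncated to significant factors.
U, s, Vt = np.linalg.svd(A - A.mean(axis=0), full_matrices=False)
k = 4
scores, loadings = U[:, :k] * s[:k], Vt[:k]

# Downstream analysis can operate on the k-factor representation; here
# we just verify it reproduces the image to within the noise.
A_hat = scores @ loadings + A.mean(axis=0)
rel_err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
compression = A.size / (scores.size + loadings.size)
print(f"relative error {rel_err:.4f} at {compression:.0f}x fewer values")
```

Operating on the compact scores/loadings pair instead of the full matrix is what makes block operations on very large images tractable.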
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter
2013-11-15
Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study are provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (e-waste) is one of the fastest growing waste streams and encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which consequently increases the reliability of e-waste estimates compared with approaches that omit data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between estimation models arising from the use of data under different conditions, showing the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimates.
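The sales-stock-lifespan relationship that links the three IOA pillars can be sketched as a convolution of sales cohorts with a lifespan distribution. The sales series and Weibull parameters below are assumptions for illustration, not values from the Dutch case study.

```python
import numpy as np

# Illustrative sales time series (units put on the market per year).
years = np.arange(1990, 2013)
sales = np.linspace(1e4, 1e6, len(years))

# Assumed Weibull lifespan profile (shape/scale values are placeholders).
shape, scale = 2.0, 8.0
ages = np.arange(0, 30)
cdf = 1 - np.exp(-((ages / scale) ** shape))
discard_pdf = np.diff(cdf, prepend=0.0)        # P(discarded at age a)

# Waste generated in year t = sum over cohorts of sales x lifespan pdf;
# stock = cumulative sales minus cumulative waste (the three IOA pillars).
waste = np.array([sum(sales[i] * discard_pdf[t - i]
                      for i in range(len(years)) if 0 <= t - i < len(ages))
                  for t in range(len(years))])
stock = np.cumsum(sales) - np.cumsum(waste)
print(f"e-waste generated in {years[-1]}: {waste[-1]:,.0f} units")
```

Because sales, stock, and waste are tied together by these identities, a measurement of any one pillar constrains the others, which is what allows the multivariate consolidation step to repair gaps in the data.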
Spatial compression algorithm for the analysis of very large multivariate images
Keenan, Michael R.
2008-07-15
A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
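The core idea (wavelet-transform the image, keep the energy-concentrating subband, analyze the smaller matrix) can be sketched with a single-level 2-D Haar transform. This is a minimal illustration with a made-up smooth image, not the patented algorithm itself:

```python
import numpy as np

def haar2d(img):
    """Single-level 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    LL = (a + b + c + d) / 2.0   # low-pass "average" subband
    LH = (a - b + c - d) / 2.0
    HL = (a + b - c - d) / 2.0
    HH = (a - b - c + d) / 2.0
    return LL, LH, HL, HH

# A smooth 8x8 "image": stand-in for one channel of a multivariate datacube.
x, y = np.meshgrid(np.arange(8.0), np.arange(8.0))
img = x + y

LL, LH, HL, HH = haar2d(img)

def energy(m):
    return float(np.sum(m * m))

# For smooth images nearly all energy lands in LL, so downstream analysis
# can run on the 4x4 LL matrix in place of the 8x8 original.
ratio = energy(LL) / (energy(LL) + energy(LH) + energy(HL) + energy(HH))
```

Because the Haar transform is orthonormal, total energy is preserved across the four subbands, which is what makes discarding the small-coefficient subbands a controlled approximation.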
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
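The statistical machinery described here (propagate parameter uncertainties, build an empirical CDF of the margin ratio, read off the value exceeded at a given probability level) can be sketched with a toy hot-channel calculation. The factor names and sigmas below are illustrative, not NBSR values:

```python
import random, statistics

random.seed(42)

# Hypothetical hot-channel model: the margin ratio depends on a few uncertain
# multiplicative factors, each treated as an independent normal distribution.
def margin_ratio(nominal=2.0):
    power_factor = random.gauss(1.0, 0.05)   # local power uncertainty
    flow_factor  = random.gauss(1.0, 0.03)   # channel flow uncertainty
    hx_factor    = random.gauss(1.0, 0.04)   # heat-transfer uncertainty
    return nominal * flow_factor * hx_factor / power_factor

samples = sorted(margin_ratio() for _ in range(20000))

# Empirical CDF: the 5th percentile is the margin exceeded with 95% probability,
# analogous to reading the CHFR or OFIR requirement off the computed CDF.
p05 = samples[int(0.05 * len(samples))]
```

Reading other probability levels (90%, 99.9%) is just a different index into the sorted samples.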
The Multi-Isotope Process Monitor: Multivariate Analysis of Gamma Spectra
Orton, Christopher R.; Rutherford, Crystal E.; Fraga, Carlos G.; Schwantes, Jon M.
2011-10-30
The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of nuclear material are not diverted from these facilities. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). The time delay between sampling and subsequent DA provides a potential opportunity to divert the material out of the appropriate chemical stream. Leveraging new on-line nondestructive assay (NDA) techniques in conjunction with the traditional and highly precise DA methods may provide a more timely, cost-effective and resource efficient means for MC&A verification at such facilities. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including the Multi-Isotope Process (MIP) Monitor. The MIP Monitor uses gamma spectroscopy and pattern recognition software to identify off-normal conditions in process streams. Recent efforts have been made to explore the basic limits of using multivariate analysis techniques on gamma-ray spectra. This paper will provide an overview of the methods and report our on-going efforts to develop and demonstrate the technology.
Statistical analysis of random duration times
Engelhardt, M.E.
1996-04-01
This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, reliability function and cumulative hazard function. These results can be applied with either complete or censored data. Several models which are commonly used with time data are discussed, along with methods for model checking and goodness-of-fit tests. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations with repeated durations, such as repairable systems, are also discussed.
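A standard nonparametric estimate of the reliability function with right-censored data is the Kaplan-Meier (product-limit) estimator; a minimal sketch, with made-up durations, might look like:

```python
# Durations with censoring flags: event=True means the duration ended with the
# event of interest; event=False means the observation was right-censored.
data = [(2.0, True), (3.0, True), (3.0, False), (5.0, True),
        (7.0, False), (8.0, True), (9.0, True), (10.0, False)]

def kaplan_meier(data):
    """Return [(t, S(t))]: product-limit estimate of the reliability function."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for d, e in data if d == t and e)
        at_this_t = sum(1 for d, e in data if d == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_this_t   # censored observations leave the risk set
        i += at_this_t
    return curve

curve = kaplan_meier(data)
```

The cumulative hazard follows as H(t) = -ln S(t), so the same curve supports all three nonparametric estimates the report describes.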
Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring
Dayman, Kenneth J.; Coble, Jamie B.; Orton, Christopher R.; Schwantes, Jon M.
2014-01-01
The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively. The errors seen in this
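The two-stage scheme (classify reactor type first, then apply a type-specific regression for burnup) can be sketched with stand-ins: a nearest-centroid classifier in place of the paper's candidate algorithms, and ordinary least squares in place of partial least squares. All feature names and numbers are hypothetical:

```python
import random
random.seed(0)

# Synthetic "nuclide activity" features for two reactor types: the types
# differ in a signature ratio, and burnup drives a second activity feature.
def make_sample(rtype):
    burnup = random.uniform(10.0, 50.0)            # GWd/tU, illustrative
    sig = (0.5 if rtype == 0 else 1.5) + random.gauss(0, 0.05)
    act = burnup * (1.0 if rtype == 0 else 1.2) + random.gauss(0, 0.5)
    return (sig, act), rtype, burnup

train = [make_sample(t) for t in (0, 1) for _ in range(50)]

# Stage 1: nearest-centroid classifier on the signature ratio.
cent = {t: sum(x[0] for x, rt, _ in train if rt == t) / 50 for t in (0, 1)}
classify = lambda x: min((abs(x[0] - cent[t]), t) for t in (0, 1))[1]

# Stage 2: per-type least-squares line burnup ~ activity (stand-in for PLS).
def fit(t):
    pts = [(x[1], b) for x, rt, b in train if rt == t]
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sl = sum((p[0]-mx)*(p[1]-my) for p in pts) / sum((p[0]-mx)**2 for p in pts)
    return sl, my - sl * mx

models = {t: fit(t) for t in (0, 1)}

def predict(x):
    t = classify(x)
    sl, ic = models[t]
    return t, sl * x[1] + ic

(x, true_t, true_b) = make_sample(1)
pred_t, pred_b = predict(x)
```

The point the abstract makes carries over: a single pooled regression would average the two type-specific slopes, so routing samples through the classifier first improves burnup accuracy.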
Experimental control analysis of a fuel gas saturator. Final report. [Multivariable
Terwilliger, G.E.; Brower, A.S.; Baheti, R.S.; Smith, R.E.; Brown, D.H.
1985-01-01
The multivariable control of the clean fuel gas saturator of a coal gasification process has been demonstrated. First principle process models described the process dynamics from which linear models were generated and used for the actual control designs. The multivariable control was designed, its response to transients simulated and the controls were implemented in a computer controller for a fuel gas saturator. The test results obtained for the gas flow transients showed good correlation with the computer simulations, giving confidence in the ability of the simulation to predict the plant performance for other transients. In this study, both time and frequency domain multivariable design techniques were applied to provide the best possible design and to determine their relative effectiveness. No clear guidelines resulted; it appears that the selection may be made on the basis of personal preference, experience or the availability of computer-aided design tools, rather than inherent technical differences. This EPRI/GE fuel gas saturator control demonstration has shown that multivariable design techniques can be applied to a real process and that practical controls are developed. With suitable process models, presently available computer-aided control design software allows the control design, evaluation and implementation to be completed in a reasonable time period. The application of these techniques to power generation processes is recommended.
Statistical Analysis of Transient Cycle Test Results in a 40...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Analysis of Transient Cycle Test Results in a 40 ... of calibration and measurement methods deer09shade.pdf ... Evaluation of a Partial Flow Dilution System for Transient ...
A Divergence Statistics Extension to VTK for Performance Analysis.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-02-01
This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
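One common divergence of the kind described (the abstract does not say which statistic the VTK engine computes, so this is only a representative example) is the Kullback-Leibler divergence between an observed histogram and a theoretical model:

```python
import math

# Observed empirical distribution (histogram frequencies) vs a theoretical one.
observed    = [0.10, 0.20, 0.40, 0.20, 0.10]
theoretical = [0.20, 0.20, 0.20, 0.20, 0.20]   # "ideal" uniform model

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q): a distance-like discrepancy."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

d = kl_divergence(observed, theoretical)
```

Note the hedge "distance-like" in the abstract is apt: KL divergence is non-negative and zero only for identical distributions, but it is not symmetric and violates the triangle inequality, so it is not a true metric.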
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
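The report's closing example, a chi-square minimization fit, is language-agnostic; the same weighted straight-line fit it would express in R can be sketched as follows (Python here, with made-up data points rather than lattice correlators):

```python
# Chi-square minimization of a straight-line model y = a + b*x against data
# with per-point errors sigma: minimize sum(((y - a - b*x) / sigma)**2).
xs     = [1.0, 2.0, 3.0, 4.0]
ys     = [2.1, 3.9, 6.2, 7.8]
sigmas = [0.2, 0.2, 0.3, 0.3]

w   = [1.0 / s**2 for s in sigmas]
S   = sum(w)
Sx  = sum(wi * x for wi, x in zip(w, xs))
Sy  = sum(wi * y for wi, y in zip(w, ys))
Sxx = sum(wi * x * x for wi, x in zip(w, xs))
Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))

# Closed-form weighted least-squares solution.
delta = S * Sxx - Sx * Sx
b = (S * Sxy - Sx * Sy) / delta        # slope
a = (Sxx * Sy - Sx * Sxy) / delta      # intercept

# Minimized chi-square: compare to the number of degrees of freedom (n - 2).
chi2 = sum(wi * (y - (a + b * x))**2 for wi, x, y in zip(w, xs, ys))
```

A chi-square near the degrees of freedom (here 2) indicates the model and the quoted errors are mutually consistent.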
Statistical Software for spatial analysis of stratigraphic data sets
Energy Science and Technology Software Center (OSTI)
2003-04-08
Stratistics is a tool for statistical analysis of spatially explicit data sets and model output for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages. These include techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.
Feature-Based Statistical Analysis of Combustion Simulation Data
Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion
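A drastically simplified 1-D stand-in conveys the feature-statistics idea: connected regions above a threshold play the role of "features", each carrying its own statistics. (The framework's actual merge trees precompute this for all thresholds at once; the brute-force version below recomputes per threshold.)

```python
# 1-D stand-in for feature-based analysis: connected regions of a scalar
# field above a threshold are "features"; collect per-feature statistics.
field = [0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.75, 0.72, 0.1, 0.3]

def features_above(field, thr):
    feats, cur = [], []
    for i, v in enumerate(field):
        if v > thr:
            cur.append((i, v))      # extend the current connected region
        elif cur:
            feats.append(cur); cur = []
    if cur:
        feats.append(cur)
    # Per-feature statistics: size, maximum, mean.
    return [{"size": len(f),
             "max": max(v for _, v in f),
             "mean": sum(v for _, v in f) / len(f)}
            for f in feats]

stats_05 = features_above(field, 0.5)    # two features at this threshold
stats_085 = features_above(field, 0.85)  # only the strongest peak survives
```

The merge-tree representation makes exactly this threshold sweep cheap: features at any threshold are nodes of one precomputed tree, which is why the meta-data can be orders of magnitude smaller than the simulation yet support arbitrary thresholds interactively.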
Lifetime statistics of quantum chaos studied by a multiscale analysis
Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)
2012-04-30
In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.
Statistical Analysis of Abnormal Electric Power Grid Behavior
Ferryman, Thomas A.; Amidan, Brett G.
2010-10-30
Pacific Northwest National Laboratory is developing a technique to analyze Phasor Measurement Unit data to identify typical patterns, atypical events and precursors to a blackout or other undesirable event. The approach combines a data-driven multivariate analysis with an engineering-model approach. The method identifies atypical events, provides a plain-English description of the event, and offers drill-down graphics for detailed investigations. The tool can be applied to the entire grid, individual organizations (e.g. TVA, BPA), or specific substations (e.g., TVA_CUMB). The tool is envisioned for (1) event investigations, (2) overnight processing to generate a Morning Report that characterizes the previous day's activity relative to activity over the preceding 10-30 days, and (3) potentially near-real-time operation to support the grid operators. This paper presents the current status of the tool and illustrations of its application to real-world PMU data collected in three 10-day periods in 2007.
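A common data-driven way to score atypicality in multivariate measurements (the abstract does not specify PNNL's statistic, so this is only a representative sketch with synthetic data) is the Mahalanobis distance from the distribution of normal behavior:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for PMU measurements: rows are time samples, columns are signals
# (say, frequency and voltage deviations). Normal behavior: correlated noise.
normal = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=500)

mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def atypicality(x):
    """Squared Mahalanobis distance of one observation from typical behavior."""
    d = x - mu
    return float(d @ cov_inv @ d)

typical_score = atypicality(np.array([0.1, 0.1]))
event_score = atypicality(np.array([4.0, -4.0]))   # breaks the usual correlation
```

An observation that violates the learned correlation structure scores far higher than one that merely has large values along the correlated direction, which is what lets such a detector flag precursors that univariate limit checks miss.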
Multi-scale statistical analysis of coronal solar activity
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.
2016-07-08
Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
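POD of a sequence of maps reduces to an SVD of the snapshot matrix (columns are flattened maps at successive times); a minimal sketch with a synthetic one-mode "temperature" signal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: columns are flattened "temperature maps" at successive
# times. Synthetic data: one dominant spatial mode plus small noise.
n_space, n_time = 200, 30
mode = np.sin(np.linspace(0, np.pi, n_space))        # spatial structure
amps = np.cos(np.linspace(0, 4 * np.pi, n_time))     # temporal amplitude
snapshots = np.outer(mode, amps) + 0.01 * rng.standard_normal((n_space, n_time))

# POD: left singular vectors are orthonormal spatial modes; squared singular
# values rank them by the fraction of "energy" each mode captures.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy_frac = s**2 / np.sum(s**2)
```

The multi-scale statistics the abstract refers to come from comparing how this energy spectrum distributes across modes for active versus quiet regions.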
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
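The DC power-flow building block underlying such cascade models is a single linear solve: bus angles follow from the susceptance matrix and net injections, and line flows follow from angle differences. A minimal 3-bus sketch (illustrative numbers, not one of the IEEE test systems):

```python
import numpy as np

# Line data: (from bus, to bus, susceptance b). Bus 0 is the slack bus.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]
injections = np.array([0.0, 1.5, -1.5])   # generator at bus 1, load at bus 2

n = 3
B = np.zeros((n, n))
for f, t, b in lines:
    B[f, f] += b; B[t, t] += b
    B[f, t] -= b; B[t, f] -= b

# DC power flow: solve B_reduced * theta = P for non-slack buses
# (slack angle fixed at 0); line flow is b * (theta_from - theta_to).
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])
flows = {(f, t): b * (theta[f] - theta[t]) for f, t, b in lines}
```

In a cascade simulation, any line whose |flow| exceeds its capacity is removed, the reduced system is re-solved, and the loop repeats until flows are feasible or the grid islands.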
Multivariate Data EXplorer (MDX)
Energy Science and Technology Software Center (OSTI)
2012-08-01
The MDX toolkit facilitates exploratory data analysis and visualization of multivariate datasets. MDX provides an interactive graphical user interface to load, explore, and modify multivariate datasets stored in tabular forms. MDX uses an extended version of the parallel coordinates plot and scatterplots to represent the data. The user can perform rapid visual queries using mouse gestures in the visualization panels to select rows or columns of interest. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots. Users can also export selected data or reconfigure the visualization panel to explore relationships between columns and rows in the data.
ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization
Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; ,
2009-01-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements
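For analytes with all measurements above their MDCs, the basic UCL95 computation under an approximate-normality assumption is a one-sided Student-t bound on the mean. A sketch with made-up triplicate measurements (the t quantile for 2 degrees of freedom is hardcoded since the standard library has no t distribution):

```python
import math, statistics

# Triplicate measurements of one analyte concentration (made-up values).
measurements = [12.1, 12.7, 12.4]

n = len(measurements)
mean = statistics.mean(measurements)
sd = statistics.stdev(measurements)   # sample standard deviation

# One-sided 95% UCL on the mean: UCL95 = mean + t_{0.95, n-1} * s / sqrt(n).
T_95_DF2 = 2.920   # one-sided 95% Student-t quantile, 2 degrees of freedom
ucl95 = mean + T_95_DF2 * sd / math.sqrt(n)
```

The EPA guidance cited in the abstract recommends other UCL formulas (e.g. for skewed or censored data); this normal-theory bound is only the simplest case.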
Plutonium metal exchange program : current status and statistical analysis
Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.
2004-01-01
The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program will be discussed with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous values and/or censored values), and interesting features of the data and the results.
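One standard way to define a consensus value that resists anomalous results (the abstract does not disclose the program's actual method, so this is only an illustrative approach) is a median/MAD screen followed by averaging the retained values:

```python
import statistics

# Reported assay values for one analyte from participating labs (made-up),
# including one anomalous result.
reported = [99.1, 99.3, 99.2, 99.4, 95.0]

median = statistics.median(reported)
# Median absolute deviation, scaled to be consistent with a normal sigma.
mad = 1.4826 * statistics.median(abs(x - median) for x in reported)

# Robust consensus: keep values within 3 scaled-MADs of the median.
kept = [x for x in reported if abs(x - median) <= 3 * mad]
consensus = statistics.mean(kept)
```

Unlike a plain mean, the median/MAD screen is unaffected by the single outlying laboratory, which is the behavior one wants when defining consensus values "in the presence of anomalous values".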
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output as opposed to 7.44% from the forecasted solar irradiance.
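The headline metric is simple to state precisely: RMSE of forecast versus actual, divided by a normalization constant (here taken as plant capacity; the study's exact normalization choice is not given in the abstract). With made-up numbers:

```python
import math

forecast = [48.0, 50.5, 45.0, 30.0, 20.0]   # kW, made-up day-ahead values
actual   = [50.0, 51.0, 44.0, 33.0, 21.0]   # kW, made-up observed output

capacity = 51.0   # normalization constant (assumed: plant capacity)

rmse = math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / len(actual))
nrmse = rmse / capacity
```

Normalizing lets error levels be compared across plants of different sizes, which is why NRMSE rather than raw RMSE is the metric of choice in such studies.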
Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties
Chiang, K. Y.; Hu, L. W.; Forget, B.
2012-07-01
The MIT Research Reactor (MITR) is evaluating the conversion from highly enriched uranium (HEU) to low enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits via the use of engineering hot channel factors (EHCFs) was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis by statistically combining engineering uncertainties with an aim to eliminate unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, which is the avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were considered as normal distributions using Oracle Crystal Ball to calculate ONB. The LSSS power is determined with 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on core outlet coolant temperature of 60 deg. C, and primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs with same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using onset of flow instability (OFI) as the criterion, to verify that adequate safety margin exists between LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than LSSS. (authors)
Deep Data Analysis of Conductive Phenomena on Complex Oxide Interfaces...
Office of Scientific and Technical Information (OSTI)
This approach conjoins multivariate statistical analysis with physics-based ... local transport and other functional phenomena in other spatially inhomogeneous systems. ...
Statistical language analysis for automatic exfiltration event detection.
Robinson, David Gerald
2010-04-01
This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.
NEPA litigation 1988-1995: A detailed statistical analysis
Reinke, D.C.; Robitaille, P.
1997-08-01
The intent of this study was to identify trends and lessons learned from litigated NEPA documents and to compare and contrast these trends among Federal agencies. More than 350 NEPA cases were collected, reviewed, and analyzed. Of the NEPA cases reviewed, more than 170 were appeals or Supreme Court cases, mostly from the late 1980s through 1995. For this time period, the sampled documents represent the majority of the appeals court cases and all the Supreme Court cases. Additionally, over 170 district court cases were also examined as a representative sample of district court decisions on NEPA. Cases on agency actions found to need NEPA documentation (but that had no documentation) and cases on NEPA documents that were found to be inadequate were pooled and examined to determine the factors that were responsible for these findings. The inadequate documents were specifically examined to determine if there were any general trends. The results are shown in detailed statistical terms. Generally, when a Federal agency has some type of NEPA documentation (e.g., CX, EA, or EIS) and at least covers the basic NEPA procedural requirements, the agency typically wins the litigation. NEPA documents that lose generally have serious errors of omission. An awareness and understanding of the errors of omission can help Federal agencies win a greater percentage of the time.
Templates and Examples — Statistics and Search Log Analysis
Office of Energy Efficiency and Renewable Energy (EERE)
Here you will find custom templates and EERE-specific examples you can use to plan, conduct, and report on your usability and analysis activities. These templates are examples of forms you might use, but you are not required to use them for EERE products.
Zhan, Xianyuan; Aziz, H. M. Abdul; Ukkusuri, Satish V.
2015-11-19
Our study investigates the Multivariate Poisson-lognormal (MVPLN) model that jointly models crash frequency and severity accounting for correlations. The ordinary univariate count models analyze crashes of different severity levels separately, ignoring the correlations among severity levels. The MVPLN model is capable of incorporating the general correlation structure and accounts for the overdispersion in the data, which leads to a superior data fit. However, the traditional estimation approach for the MVPLN model is computationally expensive, which often limits the use of the MVPLN model in practice. In this work, a parallel sampling scheme is introduced to improve the original Markov Chain Monte Carlo (MCMC) estimation approach of the MVPLN model, which significantly reduces the model estimation time. Two MVPLN models are developed using the pedestrian vehicle crash data collected in New York City from 2002 to 2006, and the highway-injury data from Washington State (5-year data from 1990 to 1994). The Deviance Information Criteria (DIC) is used to evaluate the model fitting. The estimation results show that the MVPLN models provide a superior fit over univariate Poisson-lognormal (PLN), univariate Poisson, and Negative Binomial models. Moreover, the correlations among the latent effects of different severity levels were found to be significant in both datasets, which justifies the importance of jointly modeling crash frequency and severity accounting for correlations.
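The generative structure of the MVPLN model (correlated lognormal latent effects driving Poisson counts per severity level) is easy to simulate, and the simulation makes the overdispersion claim concrete. A sketch with illustrative parameters, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(123)

# Poisson-lognormal counts for two correlated severity levels: latent effects
# are bivariate normal on the log scale, counts are Poisson given the rate.
mu = np.array([1.0, 0.5])                  # log-scale means (illustrative)
cov = np.array([[0.3, 0.2], [0.2, 0.3]])   # positive cross-severity correlation

eps = rng.multivariate_normal(mu, cov, size=20000)
rates = np.exp(eps)                        # lognormal crash rates
counts = rng.poisson(rates)                # observed crash counts

# Overdispersion: marginal variance exceeds the mean, unlike plain Poisson,
# and the latent correlation induces correlated counts across severity levels.
means = counts.mean(axis=0)
variances = counts.var(axis=0)
corr = np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]
```

A plain Poisson model forces variance equal to the mean and zero cross-severity correlation; the simulation shows the MVPLN structure relaxes both restrictions, which is why it fits crash data better.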
U.S. Energy Information Administration (EIA)
EIA's crude-by-rail data. For the EIA Energy Conference, June 16, 2015, Washington, DC. By Mindi Farber-DeAnda, Biofuels and Emerging Technologies Team, Office of Petroleum, Natural Gas, and Biofuels Analysis. Takeaways: At the end of March, EIA published monthly crude-by-rail (CBR) data along with its monthly petroleum supply balances. EIA monthly data provides credible and publicly-available information on CBR movements, including historical monthly data starting in 2010. Inter-regional CBR
A Statistical Analysis of Bottom-Hole Temperature Data in the Hinton Area of West-Central Alberta
OpenEI Reference Library (Journal Article)
Analysis of compressive fracture in rock using statistical techniques
Blair, S.C.
1994-12-01
Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.
2015-10-30
The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
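The "variation about the mean" figure quoted above (26% to 42%) is a relative-spread summary. A minimal sketch of how such a percentage is computed, using hypothetical sensitivity values rather than IDCA data:

```python
from statistics import mean, stdev

def percent_variation(values):
    """Spread about the mean as a percentage (the coefficient of variation),
    the kind of summary used to compare between-participant scatter."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical impact-sensitivity results (J) from five labs -- not IDCA data:
h50 = [4.0, 5.5, 7.5, 3.5, 6.5]
```

For these made-up numbers the relative spread comes out near the low end of the range reported in the study.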
University of Illinois at Chicago; Montana State University; Bhardwaj, Chhavi; Cui, Yang; Hofstetter, Theresa; Liu, Suet Yi; Bernstein, Hans C.; Carlson, Ross P.; Ahmed, Musahid; Hanley, Luke
2013-04-01
Vacuum ultraviolet (VUV) photon energies from 7.87 to 10.5 eV were used in laser desorption postionization mass spectrometry (LDPI-MS) to analyze biofilms comprised of binary cultures of interacting microorganisms. The effect of photon energy was examined using both tunable synchrotron and laser sources of VUV radiation. Principal components analysis (PCA) was applied to the MS data to differentiate species in Escherichia coli-Saccharomyces cerevisiae coculture biofilms. PCA of LDPI-MS also differentiated individual E. coli strains in a biofilm comprised of two interacting gene deletion strains, even though these strains differed from the wild type K-12 strain by no more than four gene deletions each out of approximately 2000 genes. PCA treatment of 7.87 eV LDPI-MS data separated the E. coli strains into three distinct groups: two 'pure' groups and a mixed region. Furthermore, the 'pure' regions of the E. coli cocultures showed greater variance by PCA when analyzed at 7.87 eV than at 10.5 eV. Comparison of the 7.87 and 10.5 eV data is consistent with the expectation that the lower photon energy selects a subset of low ionization energy analytes while 10.5 eV is more inclusive, detecting a wider range of analytes. These two VUV photon energies therefore give different spreads via PCA, and their respective use in LDPI-MS constitutes an additional experimental parameter for differentiating strains and species.
Statistical Analysis of Transient Cycle Test Results in a 40 CFR Part 1065 Engine Dynamometer Test Cell
Office of Energy Efficiency and Renewable Energy (EERE)
Effects of "new" engine testing procedures (40 CFR Part 1065) with respect to repeatability of transient engine dynamometer tests were examined, as well as the effects of calibration and measurement methods.
STATISTICAL ANALYSIS OF CURRENT SHEETS IN THREE-DIMENSIONAL MAGNETOHYDRODYNAMIC TURBULENCE
Zhdankin, Vladimir; Boldyrev, Stanislav; Uzdensky, Dmitri A.; Perez, Jean C.
2013-07-10
We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.
Statistical planning and analysis for treatments of tar sand wastewater. Final report
Pirie, W.R.
1984-03-01
The first part of this report discusses the overall statistical planning, coordination and design for several tar sand wastewater treatment projects contracted by the Laramie Energy Technology Center (LETC) of the Department of Energy. A general discussion of the benefits of consistent statistical design and analysis for data-oriented projects is included, with recommendations for implementation. A detailed outline of the principles of general linear models design is followed by an introduction to recent developments in general linear models by ranks (GLMR) analysis and a comparison to standard analysis using Gaussian or normal theory (GLMN). A listing of routines contained in the VPI Nonparametric Statistics Package (NPSP), installed on the Cyber computer system at the University of Wyoming is included. Part 2 describes in detail the design and analysis for treatments by Gas Flotation, Foam Separation, Coagulation, and Ozonation, with comparisons among the first three methods. Rank methods are used for most analyses, and several detailed examples are included. For optimization studies, the powerful tools of response surface analysis (RSA) are employed, and several sections contain discussion on the benefits of RSA. All four treatment methods proved to be effective for removal of TOC and suspended solids from the wastewater. Because the processes and equipment designs were new, optimum removals were not achieved by these initial studies and reasons for that are discussed. Pollutant levels were nevertheless reduced to levels appropriate for recycling within the process, and for such reuses as steam generation, according to the DOE/LETC project officer. 12 refs., 8 figs., 21 tabs.
Statistical analysis of multipole components in the magnetic field of the RHIC arc regions
Beebe-Wang,J.; Jain, A.
2009-05-04
The existence of multipolar components in the dipole and quadrupole magnets is one of the factors limiting the beam stability in the RHIC operations. Therefore, the statistical properties of the non-linear fields are crucial for understanding the beam behavior and for achieving the superior performance in RHIC. In an earlier work [1], the field quality analysis of the RHIC interaction regions (IR) was presented. Furthermore, a procedure for developing non-linear IR models constructed from measured multipolar data of RHIC IR magnets was described. However, the field quality in the regions outside of the RHIC IR had not yet been addressed. In this paper, we present the statistical analysis of multipolar components in the magnetic fields of the RHIC arc regions. The emphasis is on the lower order components, especially the sextupole in the arc dipole and the 12-pole in the quadrupole magnets, since they are shown to have the strongest effects on the beam stability. Finally, the inclusion of the measured multipolar components data of RHIC arc regions and their statistical properties into tracking models is discussed.
HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Pettit, Frank K.; Bowie, James U. (DOE-Molecular Biology Institute)
HotPatch finds unusual patches on the surface of proteins and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temperature factors) are replaced by patch indices, with statistical scores given in the file's header remarks; and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]
Kareem, A.; Zhao, J.
1994-12-31
The nonlinearities in the wind and wave loadings of compliant offshore platforms, and in their structural characteristics, result in response statistics that deviate from a Gaussian distribution. This paper focuses on the analysis of the response of these structures to random nonlinear wind and wave loads. As an improvement over the commonly used linearization approach, an equivalent statistical quadratization (ESQ) and cubicization (ESC) approach is presented. The nonlinear loading or structural characteristics can be expressed in terms of an equivalent polynomial that contains terms up to quadratic or cubic order, depending on the type of nonlinearity. The response statistics and cumulants are based on Volterra theory. A direct integration scheme is utilized to evaluate the response cumulants. The results compare well with simulation. It is noted that ESQ provides an accurate description of systems with asymmetrical nonlinearities, whereas for symmetrical nonlinearities ESC provides a good representation. Based on the information on higher-order cumulants, the response pdf, crossing rates, and peak value distributions can be derived.
An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-11-01
In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operations of the statistics engines that require explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units
Ranjan, Niloo; Sanyal, Jibonananda; New, Joshua Ryan
2013-08-01
Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is a part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data, covering monthly, daily, hourly, and 15-minute time steps for each annual simulation, are produced. A total of 270 TB+ of data has been produced. In this project, the simulation data is statistically analyzed in-situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation, leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
Interactive statistical-distribution-analysis program utilizing numerical and graphical methods
Glandon, S. R.; Fields, D. E.
1982-04-01
The TERPED/P program is designed to facilitate the quantitative analysis of experimental data, determine the distribution function that best describes the data, and provide graphical representations of the data. This code differs from its predecessors, TEDPED and TERPED, in that a printer-plotter has been added for graphical output flexibility. The addition of the printer-plotter provides TERPED/P with a method of generating graphs that is not dependent on DISSPLA, Integrated Software Systems Corporation's confidential proprietary graphics package. This makes it possible to use TERPED/P on systems not equipped with DISSPLA. In addition, the printer plot is usually produced more rapidly than a high-resolution plot can be generated. Graphical and numerical tests are performed on the data in accordance with the user's assumption of normality or lognormality. Statistical analysis options include computation of the chi-squared statistic and its significance level and the Kolmogorov-Smirnov one-sample test confidence level for data sets of more than 80 points. Plots can be produced on a Calcomp paper plotter, a FR80 film plotter, or a graphics terminal using the high-resolution, DISSPLA-dependent plotter or on a character-type output device by the printer-plotter. The plots are of cumulative probability (abscissa) versus user-defined units (ordinate). The program was developed on a Digital Equipment Corporation (DEC) PDP-10 and consists of 1500 statements. The language used is FORTRAN-10, DEC's extended version of FORTRAN-IV.
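The Kolmogorov-Smirnov one-sample test mentioned above compares the empirical CDF of the data against a hypothesized distribution. A minimal sketch of the test statistic (not TERPED/P's FORTRAN implementation) against a normal CDF:

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the empirical CDF of the sample and the hypothesized CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # Gap just after and just before the empirical-CDF step at x.
        d = max(d, (i + 1) / n - f, f - i / n)
    return d
```

A small statistic indicates the sample is consistent with the hypothesized normal (or, after a log transform, lognormal) model; the significance level then follows from the K-S distribution for the given sample size.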
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
Kobilarov, R. G.
2014-11-18
Statistical analysis of the data set consisting of the activity concentrations of {sup 137}Cs in soils in the Bansko-Razlog region is carried out in order to establish the dependence of the deposition and migration of {sup 137}Cs on the soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the activity of {sup 137}Cs in soils were observed. After reducing the effects of outliers, the data set is divided into two parts depending on the soil type. A test of normality of the two new data sets shows that they have normal distributions. An ordinary kriging technique is used to characterize the spatial distribution of the activity of {sup 137}Cs over an area covering 40 km{sup 2} (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of {sup 137}Cs) can be used as a reference point for future studies on the assessment of radiological risk to the population and the erosion of soils in the study area.
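The positive skewness reported above is the standard moment-based summary; a sketch with hypothetical activity values (not the Bansko-Razlog data) also shows why a log transform is often applied before normality testing of such data:

```python
import math

def skewness(values):
    """Moment-based sample skewness; markedly positive values are the
    signature of a right-skewed (e.g. lognormal-like) distribution."""
    n = len(values)
    m = sum(values) / n
    s2 = sum((v - m) ** 2 for v in values) / n   # second central moment
    s3 = sum((v - m) ** 3 for v in values) / n   # third central moment
    return s3 / s2 ** 1.5

# Hypothetical activity concentrations; a log transform symmetrizes them:
activities = [1.1, 2.7, 7.4, 20.1, 54.6]
```

For these illustrative numbers the raw data are strongly right-skewed while their logarithms are nearly symmetric, the behavior expected of lognormal-like environmental data.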
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or loss of accuracy. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in the data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in the data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
Frome, EL
2005-09-20
Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
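For the lognormal case, the exceedance fraction mentioned above has a closed form: the probability that an exposure exceeds a limit L is 1 - Phi((ln L - ln GM) / ln GSD), where GM and GSD are the geometric mean and geometric standard deviation. A minimal sketch (point estimate only; the report's confidence limits for censored data are a separate matter):

```python
import math

def exceedance_fraction(gm, gsd, limit):
    """Fraction of a lognormal exposure distribution lying above `limit`,
    parameterized by geometric mean (gm) and geometric std. dev. (gsd)."""
    z = (math.log(limit) - math.log(gm)) / math.log(gsd)
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

When the limit equals the geometric mean, exactly half the distribution exceeds it; the fraction falls off as the limit moves into the upper tail.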
Eason, E.D.; Merton, A.A.; Wright, J.E.
1996-05-01
The effects of Li, pH, and H{sub 2} on primary water stress corrosion cracking (PWSCC) of Alloy 600 were investigated for temperatures between 320 and 330{degrees}C. Specimens included in the study were reverse U-bends (RUBs) made from several different heats of Alloy 600. The characteristic life, {eta}, which represents the time until 63.2% of the population initiates PWSCC, was computed using a modified Weibull statistical analysis algorithm and was analyzed for effects of the water chemistry variables previously mentioned. It was determined that the water chemistry variables are less influential than the metallurgical characteristics defined by the heat, heat treatment, and initial stress state of the specimen (diameter and style of RUB); the maximum impact of chemistry effects was 0.13 to 0.59 standard deviations, compared to a range of three (3) standard deviations for all variables. A first-order model was generated to estimate the effect of changes in pH, Li, and H{sub 2} concentrations on the characteristic life. The characteristic time to initiate cracks, {eta}, is not sensitive to Li and H{sub 2} concentrations in excess of 3.5 ppm and 25 ml/kg, respectively. Below these values, (1) {eta} decreases by {approximately}20% when [Li] is increased from 0.7 to 3.5 ppm; (2) {eta} decreases by {approximately}9% when [H{sub 2}] is increased from 13.1 to 25.0 ml/kg; and (3) {eta} decreases by {approximately}14% when pH is increased from 7.0 to 7.4, in each case holding the other two variables constant.
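The "63.2%" figure quoted above is a property of the Weibull distribution itself: at t equal to the characteristic life eta, the CDF equals 1 - e^(-1) ≈ 0.632 regardless of the shape parameter. A minimal sketch (not the study's modified analysis algorithm):

```python
import math

def weibull_cdf(t, eta, beta):
    """Fraction of specimens that have initiated cracking by time t under a
    two-parameter Weibull model; eta is the characteristic life, beta the
    shape parameter."""
    return 1.0 - math.exp(-((t / eta) ** beta))
```

Evaluating at t = eta (any hypothetical value, here 500 hours with shape 2.0) recovers the 63.2% initiation fraction that defines the characteristic life.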
Nazemi, A.H.; Smith, E.P.
1985-11-01
Metal loss from in-bed heat exchangers has been a persistent problem in FBC systems. As part of its program in FBC technology development, the US Department of Energy/Morgantown Energy Technology Center (DOE/METC) supports a number of projects directed toward providing both theoretical and experimental results which will guide solutions to the metal loss problem. As a part of this effort, METC and The MITRE Corporation began a project in 1984 to collect and analyze metal loss data from various experimental, pilot-scale, and full-scale coal-fired FBC systems worldwide. The specific objective of this effort is to investigate the effects of unit design parameters and operating conditions on metal loss through the use of regression and analysis of variance techniques. To date, forty-one FBC systems worldwide have been identified and most of the data sets which characterize the metal loss occurrences in those units have been developed. The results of MITRE's effort to date were reported earlier (Interim Report No. DOE/MC/21351-1930, August 1985). This report describes the statistical procedures that MITRE will follow to analyze FBC metal loss data. The data will be analyzed using several regression techniques to find variables related to metal loss. Correlation and single variable regressions will be used to indicate important relationships. The joint relationships between the explanatory variables and metal loss will be examined by building multiple regression models. In order to prevent erroneous conclusions, diagnostics will be performed based on partial residual plots, residual analysis, and multicollinearity statistics. 7 refs.
Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.
2009-01-01
Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.
Statistical Analysis of Baseline Load Models for Non-Residential Buildings
Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila
2008-11-10
Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
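The morning adjustment described above rescales the estimated baseline using load observed early on the event day, before the demand response event begins. A minimal sketch of one common multiplicative rule, with made-up hourly loads; actual programs differ in how many hours they use and whether the adjustment is capped:

```python
def adjusted_baseline(baseline, actual_morning):
    """Scale an estimated hourly baseline by the ratio of actual to
    predicted load over the pre-event morning hours (a simple
    multiplicative 'morning adjustment'; the paper's exact rule may differ)."""
    n = len(actual_morning)
    factor = sum(actual_morning) / sum(baseline[:n])
    return [factor * b for b in baseline]
```

If the building runs 20% hotter than predicted in the morning, the whole baseline profile is shifted up by 20%, which is why the adjustment improves accuracy for weather-sensitive loads.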
STAC -- a new Swedish code for statistical analysis of cracks in SG-tubes
Poern, K.
1997-02-01
Steam generator (SG) tubes in pressurized water reactor plants are exposed to various types of degradation processes, among which stress corrosion cracking in particular has been observed. To be able to evaluate the safety importance of such cracking of SG-tubes, one has to have good, empirically founded knowledge about the scope and size of the cracks as well as the rate of their continuous growth. The basis of experience is to a large extent constituted by the annually performed SG-inspections and crack sizing procedures. On the basis of this experience one can estimate the distribution of existing crack lengths, and modify this distribution with regard to maintenance (plugging) and the predicted rate of crack propagation. Finally, one can calculate the rupture probability of SG-tubes as a function of a given critical crack length. On behalf of the Swedish Nuclear Power Inspectorate, an introductory study has been performed in order to get a survey of what has been done elsewhere in this field. The study resulted in a proposal for a computerizable model to estimate the distribution of true cracks, to modify this distribution due to crack growth, and to compute the probability of tube rupture. The model has now been implemented in a computer code called STAC (STatistical Analysis of Cracks). This paper aims to give a brief outline of the model to facilitate understanding of the possibilities and limitations associated with it.
Mathematical and statistical analysis of the effect of boron on yield parameters of wheat
Rawashdeh, Hamzeh; Sala, Florin; Boldea, Marius
2015-03-10
The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on yield and yield parameters of wheat. The contribution of boron to the yield parameters is described by second-degree polynomial equations with high statistical confidence (p<0.01; F theoretical < F calculated, according to the ANOVA test, for alpha = 0.05). Regression analysis, based on the R{sup 2} values obtained, made it possible to evaluate the particular contribution of boron to the realization of yield parameters. This was lower for spike length (R{sup 2} = 0.812) and thousand-seed weight (R{sup 2} = 0.850), and higher for the number of spikelets (R{sup 2} = 0.936) and the number of seeds per spike (R{sup 2} = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to flower fertilization is well known. With regard to productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R{sup 2} = 0.868.
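The paper's second-degree polynomial fits and R{sup 2} values can be reproduced in outline as follows (the dose-response numbers are invented for illustration; only the method, a quadratic least-squares fit plus the coefficient of determination, follows the abstract):

```python
import numpy as np

# Invented dose-response data: foliar boron rate vs. seeds per spike.
boron = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
seeds = np.array([38.0, 42.5, 46.0, 48.2, 49.1, 48.8, 47.3])

# Second-degree polynomial fit, as in the paper's yield-response equations.
coeffs = np.polyfit(boron, seeds, deg=2)
predicted = np.polyval(coeffs, boron)

# Coefficient of determination R^2.
ss_res = np.sum((seeds - predicted) ** 2)
ss_tot = np.sum((seeds - seeds.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A negative leading coefficient gives the concave dose-response curve typical of such trials, with yield peaking at an intermediate application rate.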
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.
2009-01-01
Until recently, microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study, where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Sciences provides statistical reasoning and rigor to multidisciplinary scientific investigations, and development, application, and communication of cutting-edge statistical sciences research.
Widen, Joakim; Waeckelgaard, Ewa; Paatero, Jukka; Lund, Peter
2010-03-15
The trend of increasing application of distributed generation with solar photovoltaics (PV-DG) suggests that a widespread integration in existing low-voltage (LV) grids is possible in the future. With massive integration in LV grids, a major concern is the possible negative impacts of excess power injection from on-site generation. For power-flow simulations of such grid impacts, an important consideration is the time resolution of demand and generation data. This paper investigates the impact of time averaging on high-resolution data series of domestic electricity demand and PV-DG output and on voltages in a simulated LV grid. Effects of 10-minutely and hourly averaging on descriptive statistics and duration curves were determined. Although time averaging has a considerable impact on statistical properties of the demand in individual households, the impact is smaller on aggregate demand, already smoothed from random coincidence, and on PV-DG output. Consequently, the statistical distribution of simulated grid voltages was also robust against time averaging. The overall judgement is that statistical investigation of voltage variations in the presence of PV-DG does not require higher resolution than hourly. (author)
Time varying, multivariate volume data reduction
Ahrens, James P; Fout, Nathaniel; Ma, Kwan - Liu
2010-01-01
Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. In this paper we present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to acrueve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs. temporal compression is data dependent, although temporal compression has the
Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; et al
2014-11-17
In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were, obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. A distribution which provides the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best-fitted by triangular, normal, or logistic distributions; the lengths were best-fitted by PearsonVI, PearsonV, lognormal2, or extreme-value distributions; and the spacing data were best-fitted by lognormal2, PearsonVI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
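The best-fit ranking described above — fit several hypothesized distributions and rank them by a chi-square goodness-of-fit value — might be sketched like this (the three-candidate list and the synthetic fracture-length sample are assumptions; the study tested twenty-five candidate distributions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical fracture-length sample (arbitrary units), lognormally distributed.
lengths = rng.lognormal(mean=0.5, sigma=0.4, size=300)

def chi_square_gof(data, dist, n_bins=10):
    """Chi-square goodness-of-fit value for a distribution fitted to the data."""
    params = dist.fit(data)
    edges = np.quantile(data, np.linspace(0.0, 1.0, n_bins + 1))  # equal-count bins
    observed, _ = np.histogram(data, bins=edges)
    expected = len(data) * np.diff(dist.cdf(edges, *params))
    return float(np.sum((observed - expected) ** 2 / expected))

# Rank the candidates: the lowest chi-square value is taken as the best fit.
candidates = {"lognormal": stats.lognorm, "normal": stats.norm,
              "exponential": stats.expon}
scores = {name: chi_square_gof(lengths, dist) for name, dist in candidates.items()}
best = min(scores, key=scores.get)
```

With lognormally generated lengths, the lognormal candidate should score far better than a clearly wrong family such as the exponential, mirroring how the report selects its best-fit length and spacing distributions.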
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D
2014-06-01
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover; the same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, the respiratory signals show determinism and nonlinear stationarity, and statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity; the nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window; a longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features. Dynamics of breathing, so crucial for
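The sample entropy measure used above can be sketched as follows (a simplified template-count convention; the tolerance r and embedding length m are typical defaults, not values from the study, and the signals are synthetic):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal; r is a fraction of the
    signal's standard deviation. Uses a simplified template-count convention."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matched_pairs(length):
        n = len(x) - length + 1
        templates = np.array([x[i:i + length] for i in range(n)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(dist <= tol) - n) / 2  # drop self-matches, count pairs once

    b = matched_pairs(m)      # template matches of length m
    a = matched_pairs(m + 1)  # template matches of length m + 1
    return -np.log(a / b)

rng = np.random.default_rng(2)
t = np.linspace(0, 20 * np.pi, 600)
regular = np.sin(t)                              # highly regular signal
noisy = np.sin(t) + rng.normal(0, 0.5, t.size)   # same signal plus noise
```

A regular signal yields a lower SampEn than a noisy one, which is exactly how SE serves as a complexity gauge for the respiratory tracks.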
Multivariate Spatio-Temporal Clustering of Times-Series Data: An Approach for Diagnosing Cloud Properties and Understanding ARM Site Representativeness F. M. Hoffman and W. W. Hargrove Oak Ridge National Laboratory Oakridge, Tennessee A. D. Del Genio National Aeronautics and Space Administration Goddard Institute for Space Studies Columbia University, New York Multivariate Clustering A multivariate statistical clustering technique-based on the iterative k-means algorithm of Hartigan (Hartigan
Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics
Greene, David L
2010-01-01
U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested and similar results are obtained.
STATISTICAL ANALYSIS OF INTERFEROMETRIC MEASUREMENTS OF AXIS RATIOS FOR CLASSICAL Be STARS
Cyr, R. P.; Jones, C. E.; Tycner, C.
2015-01-20
This work presents a novel method to estimate the effective opening angle of CBe star disks from projected axis-ratio measurements obtained by interferometry, using a statistical approach. A Monte Carlo scheme was used to generate a large set of theoretical axis ratios from disk models using different distributions of disk densities and opening angles. These theoretical samples were then compared to observational samples, using a two-sample Kolmogorov-Smirnov test, to determine which theoretical distribution best reproduces the observations. The results suggest that the observed ratio distributions in the K, H, and N bands can best be explained by the presence of thin disks, with opening half-angles of the order of 0.°15-4.°0. Results for measurements over the Hα line point toward slightly thicker disks, 3.°7-14°, which is consistent with the flaring disk predicted by the viscous disk model.
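The model-selection step — comparing Monte Carlo samples of theoretical axis ratios against observations with a two-sample Kolmogorov-Smirnov test — can be sketched like this (the oblate-spheroid projection formula is a simplified stand-in for the paper's disk-density models, and all samples are synthetic):

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)

def apparent_ratios(q0, n, rng):
    """Apparent axis ratios of randomly oriented oblate spheroids with
    intrinsic thickness-to-diameter ratio q0 (a simplified disk model)."""
    cos_i = rng.uniform(0, 1, n)  # isotropic viewing angles
    return np.sqrt(cos_i**2 + q0**2 * (1 - cos_i**2))

# "Observed" sample drawn from a thin disk, plus two candidate model samples.
observed = apparent_ratios(0.05, 200, rng)
thin = apparent_ratios(0.05, 5000, rng)
thick = apparent_ratios(0.5, 5000, rng)

# Two-sample KS statistic: the smaller value marks the better-matching model.
d_thin = ks_2samp(observed, thin).statistic
d_thick = ks_2samp(observed, thick).statistic
```

The thin-disk model reproduces the "observed" distribution and scores a much smaller KS statistic than the thick-disk alternative, which is the selection logic the paper applies across its grid of opening angles.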
Analysis of hyper-spectral data derived from an imaging Fourier transform: A statistical perspective
Sengupta, S.K.; Clark, G.A.; Fields, D.J.
1996-01-10
Fourier transform spectrometers (FTS) using optical sensors are increasingly being used in various branches of science. Typically, a FTS generates a three-dimensional data cube with two spatial dimensions and one frequency/wavelength dimension. The number of frequency dimensions in such data cubes is generally very large, often in the hundreds, making data analytical procedures extremely complex. In the present report, the problem is viewed from a statistical perspective. A set of procedures based on the high degree of inter-channel correlation structure often present in such hyper-spectral data, has been identified and applied to an example data set of dimension 100 x 128 x 128 comprising 128 spectral bands. It is shown that in this case, the special eigen-structure of the correlation matrix has allowed the authors to extract just a few linear combinations of the channels (the significant principal vectors) that effectively contain almost all of the spectral information contained in the data set analyzed. This in turn, enables them to segment the objects in the given spatial frame using, in a parsimonious yet highly effective way, most of the information contained in the data set.
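The eigen-structure argument above — a few principal vectors of the inter-channel correlation matrix carrying nearly all of the spectral information — can be illustrated on a synthetic cube (the cube dimensions and the two-component construction are assumptions made for the sketch, not the 100 x 128 x 128 data set of the report):

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic hyperspectral cube: 32 x 32 pixels x 16 channels, built from two
# underlying spectral components so that the channels are highly correlated.
h, w, bands = 32, 32, 16
abundances = rng.random((h * w, 2))
spectra = 0.5 + 0.5 * rng.random((2, bands))
cube = abundances @ spectra + 0.01 * rng.normal(size=(h * w, bands))

# Standardize each channel, then eigen-decompose the correlation matrix.
z = (cube - cube.mean(axis=0)) / cube.std(axis=0)
corr = (z.T @ z) / len(z)
eigvals, eigvecs = np.linalg.eigh(corr)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending

# The few leading principal vectors capture nearly all spectral variance.
explained = np.cumsum(eigvals) / eigvals.sum()
scores = z @ eigvecs[:, :2]  # per-pixel coordinates on the top two components
```

Because the cube has only two underlying spectral components plus weak noise, the top two principal vectors explain essentially all of the variance, and the per-pixel scores provide the parsimonious representation used for segmentation.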
Wolfrum, E. J.; Sluiter, A. D.
2009-01-01
We have studied rapid calibration models to predict the composition of a variety of biomass feedstocks by correlating near-infrared (NIR) spectroscopic data to compositional data produced using traditional wet chemical analysis techniques. The rapid calibration models are developed using multivariate statistical analysis of the spectroscopic and wet chemical data. This work discusses the latest versions of the NIR calibration models for corn stover feedstock and dilute-acid pretreated corn stover. Measures of the calibration precision and uncertainty are presented. No statistically significant differences (p = 0.05) are seen between NIR calibration models built using different mathematical pretreatments. Finally, two common algorithms for building NIR calibration models are compared; no statistically significant differences (p = 0.05) are seen for the major constituents glucan, xylan, and lignin, but the algorithms did produce different predictions for total extractives. A single calibration model combining the corn stover feedstock and dilute-acid pretreated corn stover samples gave less satisfactory predictions than the separate models.
High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions
Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A
2009-03-23
We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20{sup 3} x 128 with an anisotropy parameter {zeta} = b{sub s}/b{sub t} = 3.5, a spatial lattice spacing of b{sub s} = 0.1227 {+-} 0.0008 fm, and pion mass of M{sub {pi}} {approx} 390 MeV. Ground-state baryon masses are extracted with fully quantified uncertainties that are at or below the {approx} 0.2%-level in lattice units. The lowest-lying negative-parity states are also extracted, albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the N{pi} threshold and, therefore, the isospin-1/2 {pi}N s-wave scattering phase-shift can be extracted using Luescher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time-slices. This is due to backward-propagating states arising from the anti-periodic boundary conditions imposed on the quark propagators in the time direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single-baryon observables.
Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems
Liou, T.S.
1999-12-01
Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subjected to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces, and hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media that are characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to consider specifically the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures by the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of the flow simulations showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. The seepage pattern is dominated by the fraction of asperity contacts, and their shape, size, and spatial correlation. However, the correlation structure of the permeability field is less important.
Kelly named Fellow of the American Statistical Association August 2, 2016 The American Statistical Association (ASA) has honored Elizabeth Kelly of the Lab's Statistical Sciences group with the title of Fellow. The ASA recognized her for providing exemplary statistical leadership of and collaboration on multidisciplinary teams dealing with environmental restoration, weapon quality, and nuclear materials storage to ensure the safety and security of the Nation. She will receive the Fellow award at
Davis, Adam Christopher; Booth, Steven Richard
2015-08-20
Voluntary Protection Program (VPP) surveys were conducted in 2013 and 2014 to assess the degree to which workers at Los Alamos National Laboratory feel that their safety is valued by their management and peers. The goal of this analysis is to determine whether the difference between the VPP survey scores in 2013 and 2014 is significant, and to present the data in a way such that it can help identify either positive changes or potential opportunities for improvement. Data for several questions intended to identify the demographic groups of the respondent are included in both the 2013 and 2014 VPP survey results. These can be used to identify any significant differences among groups of employees as well as to identify any temporal trends in these cohorts.
Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2014-05-01
As part of the High Temperature Reactors (HTR) R&D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.
Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2012-10-01
As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in the model calibration effort to reduce the inherent uncertainty of the simulation results. This paper is focused on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work pursues the idea that the abnormal trends in measured data observed in the statistical analysis may be caused either by measuring-instrument deterioration or by physical mechanisms in the capsules that may have shifted the system's thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5 revealed by the capsule metrology measurements in PIE helps attribute the reduction in TC readings to a physical mechanism rather than to TC drift. This in turn prompts a modification of the thermal model to better fit the experimental data, which increases confidence in, and reduces the uncertainty of, the thermal simulation results of the AGR-1 test.
Brandt, Timothy D.; Spiegel, David S.; McElwain, Michael W.; Grady, C. A.; Turner, Edwin L.; Mede, Kyle; Kuzuhara, Masayuki; Schlieder, Joshua E.; Brandner, W.; Feldt, M.; Wisniewski, John P.; Abe, L.; Biller, B.; Carson, J.; Currie, T.; Egner, S.; Golota, T.; Guyon, O.; Goto, M.; Hashimoto, J.; and others
2014-10-20
We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ~60 M{sub J} brown dwarf companions in the Pleiades, PZ Tel B, and CD-35 2722 B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ~30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ~5 M{sub J}, with a single power-law distribution. We find that p(M, a) ∝ M{sup -0.65 ± 0.60} a{sup -0.85 ± 0.39} (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M{sub J} companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
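The levelized-price comparison above rests on a standard calculation: discounted contract revenues divided by discounted energy. A minimal sketch, with invented contract terms (the escalation rate, discount rate, and output profile are assumptions, not figures from the paper's sample):

```python
def levelized_price(annual_prices, annual_energy_kwh, discount_rate):
    """Levelized price ($/kWh): discounted revenues over discounted energy."""
    pv_revenue = sum(p * e / (1.0 + discount_rate) ** t
                     for t, (p, e) in enumerate(zip(annual_prices, annual_energy_kwh), 1))
    pv_energy = sum(e / (1.0 + discount_rate) ** t
                    for t, e in enumerate(annual_energy_kwh, 1))
    return pv_revenue / pv_energy

# Invented 20-year contract: price escalating 3%/yr from $0.06/kWh,
# constant annual output of 1 GWh, 8% discount rate.
prices = [0.06 * 1.03 ** t for t in range(20)]
energy = [1.0e6] * 20
lp = levelized_price(prices, energy, discount_rate=0.08)
```

The levelized price is an energy-weighted average of the annual prices, tilted toward early years by discounting, which is what makes contracts with different escalation profiles comparable on a single $/kWh figure.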
Kurtz, S.E.; Fields, D.E.
1983-10-01
This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistic is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of TERPED described herein contains algorithms that allow computation of the KS statistic and significance level for data sets of as few as three points, if the user wishes. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
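The KS one-sample statistic that TERPED computes is straightforward to evaluate even for very small samples. A sketch against a normal hypothesis (the three-point sample is invented; TERPED's own significance-level algorithm is not reproduced here):

```python
import math

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic D_n against a hypothesized CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # Compare the hypothesized CDF to the empirical CDF on both sides of each step.
        d = max(d, f - i / n, (i + 1) / n - f)
    return d

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Works even for very small samples -- down to the report's three-point minimum.
sample = [-0.5, 0.1, 0.7]
d = ks_statistic(sample, normal_cdf)  # D_3 ≈ 0.309 for this sample
```

Here the supremum gap occurs at the first order statistic, where the hypothesized CDF, Φ(-0.5) ≈ 0.309, sits well above the empirical CDF's value of 0.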
Klenzing, J. H.; Earle, G. D.; Heelis, R. A.; Coley, W. R. [William B. Hanson Center for Space Sciences, University of Texas at Dallas, 800 W. Campbell Rd. WT15, Richardson, Texas 75080 (United States)
2009-05-15
The use of biased grids as energy filters for charged particles is common in satellite-borne instruments such as a planar retarding potential analyzer (RPA). Planar RPAs are currently flown on missions such as the Communications/Navigation Outage Forecast System and the Defense Meteorological Satellites Program to obtain estimates of geophysical parameters including ion velocity and temperature. It has been shown previously that the use of biased grids in such instruments creates a nonuniform potential in the grid plane, which leads to inherent errors in the inferred parameters. A simulation of ion interactions with various configurations of biased grids has been developed using a commercial finite-element analysis software package. Using a statistical approach, the simulation calculates collected flux from Maxwellian ion distributions with three-dimensional drift relative to the instrument. Perturbations in the performance of flight instrumentation relative to expectations from the idealized RPA flux equation are discussed. Both single grid and dual-grid systems are modeled to investigate design considerations. Relative errors in the inferred parameters for each geometry are characterized as functions of ion temperature and drift velocity.
Extracting bb Higgs Decay Signals using Multivariate Techniques
Smith, W Clarke; /George Washington U. /SLAC
2012-08-28
For low-mass Higgs boson production at ATLAS at √s = 7 TeV, the hard subprocess gg → h⁰ → bb̄ dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in bb̄-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics; that is, the algorithms cannot identify every jet that results from bb̄ Higgs decay. We combine these algorithms using neural net (NN) and boosted regression tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg → h⁰ → bb̄ are generated using Monte Carlo methods with Pythia, and the Toolkit for Multivariate Analysis (TMVA) is then used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the m_h⁰-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow more advanced m_h⁰ reconstructions to be created in the future using multivariate methods.
ORISE: Statistical Analyses of Worker Health
appropriate methods of statistical analysis to a variety of problems in occupational health and other areas. Our expertise spans a range of capabilities essential for statistical...
Smith, Steven G.; Skalski, John R.; Schelechte, J. Warren
1994-12-01
Program SURPH is the culmination of several years of research to develop a comprehensive computer program to analyze survival studies of fish and wildlife populations. Development of this software was motivated by the advent of the PIT-tag (Passive Integrated Transponder) technology that permits the detection of salmonid smolts as they pass through hydroelectric facilities on the Snake and Columbia Rivers in the Pacific Northwest. Repeated detections of individually tagged smolts and analysis of their capture histories permit estimates of downriver survival probabilities. Eventual installation of detection facilities at adult fish ladders will also permit estimation of ocean survival and upstream survival of returning salmon using the statistical methods incorporated in SURPH.1. However, the utility of SURPH.1 extends well beyond the analysis of salmonid tagging studies. Release-recapture and radiotelemetry studies from a wide range of terrestrial and aquatic species have been analyzed using SURPH.1 to estimate discrete-time survival probabilities and investigate survival relationships. The interactive computing environment of SURPH.1 was specifically developed to allow researchers to investigate the relationship between survival and capture processes and environmental, experimental and individual-based covariates. Program SURPH.1 represents a significant advancement in the ability of ecologists to investigate the interplay between morphologic, genetic, environmental and anthropogenic factors on the survival of wild species. It is hoped that this better understanding of risk factors affecting survival will lead to greater appreciation of the intricacies of nature and to improvements in the management of wild resources. This technical report is an introduction to SURPH.1 and provides a user guide for both the UNIX and MS-Windows® applications of the SURPH software.
Chang, Wen-Kuei; Hong, Tianzhen
2013-01-01
Occupancy profile is one of the driving factors behind discrepancies between the measured and simulated energy consumption of buildings. The frequencies of occupants leaving their offices and the corresponding durations of absences have significant impact on energy use and the operational controls of buildings. This study used statistical methods to analyze the occupancy status, based on measured lighting-switch data in five-minute intervals, for a total of 200 open-plan (cubicle) offices. Five typical occupancy patterns were identified based on the average daily 24-hour profiles of the presence of occupants in their cubicles. These statistical patterns were represented by a one-square curve, a one-valley curve, a two-valley curve, a variable curve, and a flat curve. The key parameters that define the occupancy model are the average occupancy profile together with probability distributions of absence duration, and the number of times an occupant is absent from the cubicle. The statistical results also reveal that the number of absence occurrences decreases as total daily presence hours decrease, and the duration of absence from the cubicle decreases as the frequency of absence increases. The developed occupancy model captures the stochastic nature of occupants moving in and out of cubicles, and can be used to generate a more realistic occupancy schedule. This is crucial for improving the evaluation of the energy saving potential of occupancy based technologies and controls using building simulations. Finally, to demonstrate the use of the occupancy model, weekday occupant schedules were generated and discussed.
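A stochastic occupancy schedule of the kind described can be generated, for illustration, with a toy two-state model. The profile values, the 30-minute mean absence, and the leave/return rates below are invented for the demo, not the paper's calibrated parameters:

```python
import random

def generate_occupancy(profile, step_min=5, mean_absence_min=30, rng=None):
    """Toy stochastic occupancy schedule. `profile` gives a target
    presence probability for each of 24 hours; absence durations are
    geometric with mean `mean_absence_min` minutes."""
    rng = rng or random.Random(0)
    steps_per_hour = 60 // step_min
    p_return = step_min / mean_absence_min   # chance an absence ends each step
    schedule, present = [], False
    for hour in range(24):
        p = profile[hour]
        if p == 0.0:                          # outside working hours
            present = False
            schedule.extend([0] * steps_per_hour)
            continue
        # leave-rate chosen so the long-run presence fraction is about p
        p_leave = min(1.0, p_return * (1 - p) / p)
        for _ in range(steps_per_hour):
            if present and rng.random() < p_leave:
                present = False
            elif not present and rng.random() < p_return:
                present = True
            schedule.append(1 if present else 0)
    return schedule

# A crude "one-valley" weekday: present 08:00-17:00 with a lunch dip
profile = [0.0] * 8 + [0.9] * 4 + [0.3] + [0.9] * 4 + [0.0] * 7
day = generate_occupancy(profile)   # 288 five-minute slots of 0/1
```

Averaging many such simulated days recovers the input profile, while each individual day retains the stochastic move-in/move-out behavior the paper argues matters for control evaluation.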
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy
Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.
2013-03-01
NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.
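As an illustration of NIR calibration by multivariate statistics, the sketch below fits a least-squares calibration to synthetic two-component spectra. The band shapes, noise level, and ridge term are invented for the demo; NREL's actual models are PLS-type and far more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(0, 1, 50)
# Two invented component signatures standing in for real NIR bands
sig_a = np.exp(-((wavelengths - 0.3) ** 2) / 0.01)
sig_b = np.exp(-((wavelengths - 0.7) ** 2) / 0.02)

comp = rng.uniform(0.1, 0.9, size=(40, 2))   # "true" compositions
spectra = comp @ np.vstack([sig_a, sig_b]) + 0.01 * rng.standard_normal((40, 50))

# Calibration: regress composition on spectra (ridge-stabilized least squares)
X, Y = spectra, comp
B = np.linalg.solve(X.T @ X + 1e-6 * np.eye(50), X.T @ Y)
pred = X @ B
rmse = np.sqrt(((pred - Y) ** 2).mean())
print(rmse)  # small on the training set
```

A real calibration exercise would of course evaluate RMSE on held-out cultivars, which is where the robustness of the calibration population matters.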
Arrowood, L.F.; Tonn, B.E.
1992-02-01
This report presents recommendations relative to the use of expert systems and machine learning techniques by the Bureau of Labor Statistics (BLS) to substantially automate product substitution decisions associated with the Consumer Price Index (CPI). Thirteen commercially available, PC-based expert system shells have received in-depth evaluations. Various machine learning techniques were also reviewed. Two recommendations are given: (1) BLS should use the expert system shell LEVEL5 OBJECT and establish a software development methodology for expert systems; and (2) BLS should undertake a small study to evaluate the potential of machine learning techniques to create and maintain the approximately 350 ELI-specific knowledge bases to be used in CPI product substitution review.
Intrinsic alignments of galaxies in the MassiveBlack-II simulation: Analysis of two-point statistics
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Tenneti, Ananth; Singh, Sukhdeep; Mandelbaum, Rachel; Matteo, Tiziana Di; Feng, Yu; Khandai, Nishikanta
2015-03-11
The intrinsic alignment of galaxies with the large-scale density field is an important astrophysical contaminant in upcoming weak lensing surveys. We present detailed measurements of the galaxy intrinsic alignments and associated ellipticity-direction (ED) and projected shape (wg₊) correlation functions for galaxies in the cosmological hydrodynamic MassiveBlack-II (MB-II) simulation. We carefully assess the effects on galaxy shapes, misalignment of the stellar component with the dark matter shape, and two-point statistics of iteratively weighted (by mass and luminosity) definitions of the (reduced and unreduced) inertia tensor. We find that iterative procedures must be adopted for a reliable measurement of the reduced tensor, but that luminosity versus mass weighting has only negligible effects. Both ED and wg₊ correlations increase in amplitude with subhalo mass (in the range 10¹⁰–6.0 × 10¹⁴ h⁻¹ M⊙), with a weak redshift dependence (from z = 1 to z = 0.06) at fixed mass. At z ~ 0.3, we predict a wg₊ that is in reasonable agreement with SDSS LRG measurements and that decreases in amplitude by a factor of ~5–18 for galaxies in the LSST survey. We also compare the intrinsic alignment of centrals and satellites, with clear detection of satellite radial alignments within the host halos. Finally, we show that wg₊ (using subhalos as tracers of density) and wδ (using dark matter density) predictions from the simulations agree with those of the non-linear alignment (NLA) model at scales where the 2-halo term dominates the correlations (and we tabulate the associated NLA fitting parameters). The 1-halo term induces a scale-dependent bias at small scales that is not modeled in the NLA model.
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Belu, Radian; Koracin, Darko
2013-01-01
The main objective of the study was to investigate spatial and temporal characteristics of the wind speed and direction in complex terrain that are relevant to wind energy assessment and development, as well as to wind energy system operation, management, and grid integration. Wind data from five tall meteorological towers located in Western Nevada, USA, operated from August 2003 to March 2008, were used in the analysis. The multiannual average wind speeds did not show a significant increasing trend with elevation, while the turbulence intensity slowly decreased with increasing average wind speed. The wind speed and direction were modeled using the Weibull and the von Mises distribution functions. The correlations show a strong coherence between the wind speed and direction, with slowly decreasing amplitude of the multiday periodicity at increasing lag periods. The spectral analysis shows significant annual periodicity with similar characteristics at all locations. The relatively high correlations between the towers and the small range of the computed turbulence intensity indicate that wind variability is dominated by regional synoptic processes. Knowledge and information about daily, seasonal, and annual wind periodicities are very important for wind energy resource assessment, wind power plant operation, management, and grid integration.
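Fitting a Weibull distribution to wind speeds, as done in the study, can be sketched by maximum likelihood; the Newton iteration and synthetic sample below are illustrative assumptions, not the authors' procedure:

```python
import math
import random

def fit_weibull(speeds, iters=100):
    """Fit Weibull shape k and scale c to wind speeds by maximum
    likelihood: Newton iteration on the standard MLE equation for k,
    then the closed-form c given k."""
    n = len(speeds)
    logs = [math.log(v) for v in speeds]
    mean_log = sum(logs) / n
    k = 2.0  # typical starting guess for wind regimes
    for _ in range(iters):
        num = sum(v ** k * lv for v, lv in zip(speeds, logs))
        den = sum(v ** k for v in speeds)
        g = num / den - 1.0 / k - mean_log           # MLE equation g(k) = 0
        num2 = sum(v ** k * lv * lv for v, lv in zip(speeds, logs))
        dg = (num2 * den - num * num) / den ** 2 + 1.0 / k ** 2
        k -= g / dg
    c = (sum(v ** k for v in speeds) / n) ** (1.0 / k)
    return k, c

# Sanity check on synthetic Weibull(k=2, c=8 m/s) wind speeds
rng = random.Random(1)
sample = [8.0 * (-math.log(1 - rng.random())) ** 0.5 for _ in range(5000)]
k_hat, c_hat = fit_weibull(sample)
```

The von Mises fit for direction is analogous in spirit (a concentration parameter plays the role of the shape), but needs circular statistics and is omitted here.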
Computing contingency statistics in parallel.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-09-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
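The map-reduce flavor of the computation can be sketched serially: local tabulation, an additive merge (the step whose communication cost grows with table size), and a χ² statistic derived from the merged table. This is a toy stand-in for the parallel implementation, not its actual code:

```python
from collections import Counter

def local_contingency(pairs):
    """Map step: each process tabulates its own (x, y) observations."""
    return Counter(pairs)

def merge(tables):
    """Reduce step: contingency tables merge by addition, so the
    tabulation parallelizes cleanly; the communicated payload is the
    table itself, which grows with the diversity of the data."""
    total = Counter()
    for t in tables:
        total.update(t)
    return total

def chi2(table):
    """Pearson chi-square independence statistic from a joint table."""
    n = sum(table.values())
    rows, cols = Counter(), Counter()
    for (x, y), c in table.items():
        rows[x] += c
        cols[y] += c
    return sum((table.get((x, y), 0) - rows[x] * cols[y] / n) ** 2
               / (rows[x] * cols[y] / n)
               for x in rows for y in cols)

# Two "processes" each tabulate half the data, then merge
data = [("a", 0), ("a", 0), ("b", 1), ("b", 1), ("a", 1), ("b", 0)]
joint = merge([local_contingency(data[:3]), local_contingency(data[3:])])
```

On quasi-diffuse data (nearly every pair distinct) the merged table approaches the data size, which is exactly the regime where the paper reports degraded parallel speedup.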
Multivariate volume visualization through dynamic projections
Liu, Shusen; Wang, Bei; Thiagarajan, Jayaraman J.; Bremer, Peer -Timo; Pascucci, Valerio
2014-11-01
We propose a multivariate volume visualization framework that tightly couples dynamic projections with a high-dimensional transfer function design for interactive volume visualization. We assume that the complex, high-dimensional data in the attribute space can be well-represented through a collection of low-dimensional linear subspaces, and embed the data points in a variety of 2D views created as projections onto these subspaces. Through dynamic projections, we present animated transitions between different views to help the user navigate and explore the attribute space for effective transfer function design. Our framework not only provides a more intuitive understanding of the attribute space but also allows the design of the transfer function under multiple dynamic views, which is more flexible than being restricted to a single static view of the data. For large volumetric datasets, we maintain interactivity during the transfer function design via intelligent sampling and scalable clustering. As a result, using examples in combustion and climate simulations, we demonstrate how our framework can be used to visualize interesting structures in the volumetric space.
Demkin, V. P.; Mel'nichuk, S. V.
2014-09-15
In the present work, we present results of investigations into the dynamics of secondary electrons interacting with helium atoms in the presence of the reverse electric field that arises in the flare of a high-voltage pulsed beam-type discharge and leads to degradation of the primary electron beam. The electric field in a discharge of this type at moderate pressures can reach several hundred V/cm and leads to considerable changes in the kinetics of secondary electrons created during propagation of the electron beam generated in the accelerating gap with a grid anode. Moving in the accelerating electric field toward the anode, secondary electrons create the so-called compensating current to the anode. The character of electron motion and the compensating current itself are determined by the ratio of the field strength to the concentration of atoms (E/n). The energy and angular spectra of secondary electrons are calculated by the Monte Carlo method for different values of the ratio E/n of the electric field strength to the helium atom concentration. The motion of secondary electrons with threshold energy is studied for inelastic collisions with helium atoms, and a differential analysis is carried out of the collisional processes causing electron energy losses in helium for different E/n values. The mechanism of creation and accumulation of slow electrons as a result of inelastic collisions of secondary electrons with helium atoms and selective population of metastable states of helium atoms is considered. It is demonstrated that over a wide range of E/n values the motion of secondary electrons in the beam-type discharge flare has the character of drift. At E/n values characteristic of a discharge of this type, the drift velocity of these electrons is calculated and compared with the available experimental data.
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
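The auto-correlative statistic itself can be sketched directly; note that each lag is computed independently, which is what makes the engine amenable to parallelization. This Python sketch is purely illustrative, since the actual engine is C++/VTK:

```python
import math

def autocorrelation(series, max_lag):
    """Sample autocorrelation r(lag): the sum of lagged deviation
    products over the total squared deviation. Each lag's sum is
    independent of the others, so lags can be computed in parallel."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    var = sum(d * d for d in dev)
    return [sum(dev[t] * dev[t + lag] for t in range(n - lag)) / var
            for lag in range(max_lag + 1)]

# A signal of period 20 shows peaks at lags 0 and 20 and a trough at 10
signal = [math.sin(2 * math.pi * t / 20) for t in range(200)]
r = autocorrelation(signal, 20)
```

In a distributed setting the per-lag sums would be accumulated locally and merged by addition, the same map-reduce pattern the companion contingency report describes.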
Deep Data Analysis of Conductive Phenomena on Complex Oxide Interfaces: Physics from Data Mining
Strelcov, Evgheni; Belianinov, Alex; Hsieh, Ying-Hui; Jesse, Stephen; Baddorf, Arthur P; Chu, Ying Hao; Kalinin, Sergei V
2014-01-01
Spatial variability of electronic transport in BiFeO3-CoFe2O4 (BFO-CFO) self-assembled heterostructures is explored using spatially resolved first order reversal curve (FORC) current-voltage (IV) mapping. Multivariate statistical analysis of FORC-IV data classifies statistically significant behaviors and maps characteristic responses spatially. In particular, regions of grain, matrix, and grain boundary responses are clearly identified. K-means and Bayesian demixing analysis suggests that the characteristic response can be separated into four components, with hysteretic-type behavior localized at the BFO-CFO tubular interfaces. The conditions under which Bayesian components allow direct physical interpretation are explored, and transport mechanisms at the grain boundaries and individual phases are analyzed. This approach conjoins multivariate statistical analysis with physics-based interpretation, actualizing a robust, universal, data-driven approach to problem solving, which can be applied to exploration of local transport and other functional phenomena in other spatially inhomogeneous systems.
ARMFacility Statistics 2015 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Historical Statistics Field Campaigns Operational...
Office of Energy Efficiency and Renewable Energy (EERE)
EERE uses Google Analytics to capture statistics on its websites. These statistics help website managers measure and report on users, sessions, most visited pages, and more.
Environment/Health/Safety (EHS): Monthly Accident Statistics
Personal Protective Equipment (PPE) Injury Review & Analysis Worker Safety and Health Program: PUB-3851 Monthly Accident Statistics Latest Accident Statistics Accident...
Automated Multivariate Optimization Tool for Energy Analysis: Preprint
Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.
2006-07-01
Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis
Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.
2011-10-17
Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention on ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to "break" the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
Visualization of Multivariate Data --- Inventor: Eliot Feibush | Princeton
Plasma Physics Lab Visualization of Multivariate Data --- Inventor: Eliot Feibush The invention is a process for displaying multivariate data while eliminating ambiguity. It clarifies and shows more dimensions of data than existing methods such as scatter plots and icon plots where multiple samples can occupy the same location in a drawing. The display technique is not affected by the sequential order of the data. The invented process solves these problems by mapping the sample totals of
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
... The t-statistics test significant and those t-statistics are to the actual regression ... on the NEMS, you've got to test this with real live data eventually to see how it's doing. ...
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectrometry (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of principal component analysis (PCA) and orthogonal factor rotation, followed by refinement with multivariate curve resolution-alternating least squares (MCR-ALS), to yield highly interpretable results.
Forecasting of municipal solid waste quantity in a developing country using multivariate grey models
Intharathirat, Rotchana; Abdul Salam, P.; Kumar, S.; Untong, Akarapong
2015-05-15
Highlights: • Grey models can forecast MSW quantity accurately with limited data. • Prediction intervals effectively address the uncertainty of MSW forecasts. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment and household size play a role in MSW quantity. - Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to obtain reliable estimates using existing models because of the limited data available in developing countries. This study aims to forecast MSW collected in Thailand, with prediction intervals, over a long-term period by using an optimized multivariate grey model, a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year, from 43,435–44,994 tonnes per day in 2013 to 55,177–56,735 tonnes per day in 2030. The model also illustrates that population density is the most important factor affecting MSW collected, followed by urbanization, proportion of employment and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. The results can help decision makers develop long-term waste management measures and policies.
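For orientation, the univariate GM(1,1) core of grey forecasting can be sketched as below. The paper's GMC(1,5) model additionally incorporates four driving series and a convolution integral, which this sketch omits, and the growth rate in the demo series is taken from the abstract's 1.4%/year figure purely for illustration:

```python
import math

def gm11_forecast(x0, steps):
    """Classic univariate GM(1,1) grey model: accumulate the series,
    fit the whitened equation x0(k) = -a*z(k) + b by least squares,
    and difference the exponential response to forecast."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                # accumulation (AGO)
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]   # mean sequence
    m = n - 1
    sa = sum(zi * zi for zi in z)
    sb = sum(z)
    sy = sum(x0[1:])
    szy = sum(zi * yi for zi, yi in zip(z, x0[1:]))
    det = sa * m - sb * sb
    a = (sb * sy - m * szy) / det      # development coefficient
    b = (sa * sy - sb * szy) / det     # grey input
    xhat1 = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [xhat1(k) - xhat1(k - 1) for k in range(n, n + steps)]

# A series growing ~1.4%/year, echoing the abstract; forecast 2 steps ahead
history = [100.0 * 1.014 ** t for t in range(5)]
forecast = gm11_forecast(history, 2)
```

The appeal for developing-country waste data is visible here: five observations suffice to identify the two parameters, which is exactly the small-sample regime where grey models are pitched.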
Multivariate classification of infrared spectra of cell and tissue samples
Haaland, David M.; Jones, Howland D. T.; Thomas, Edward V.
1997-01-01
Multivariate classification techniques are applied to spectra from cell and tissue samples irradiated with infrared radiation to determine whether the samples are normal or abnormal (cancerous). Mid- and near-infrared radiation can be used for in vivo and in vitro classifications using a small number of different wavelengths.
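A minimal Fisher linear discriminant of the kind usable for such two-class problems can be sketched on synthetic data; the class means, spread, and two-band feature space below are invented for the demo and do not reflect the patent's trained classifiers:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "spectral intensities" at two wavelengths per sample
normal = rng.normal([1.0, 2.0], 0.2, size=(50, 2))     # class 1
abnormal = rng.normal([1.5, 1.6], 0.2, size=(50, 2))   # class 2

m1, m2 = normal.mean(axis=0), abnormal.mean(axis=0)
Sw = np.cov(normal.T) + np.cov(abnormal.T)             # within-class scatter
w = np.linalg.solve(Sw, m1 - m2)                       # discriminant direction
threshold = w @ (m1 + m2) / 2                          # midpoint decision rule

def classify(x):
    return "normal" if w @ x > threshold else "abnormal"

correct = sum(classify(x) == "normal" for x in normal) + \
          sum(classify(x) == "abnormal" for x in abnormal)
```

Real spectra have hundreds of correlated wavelengths, so in practice the discriminant is built on a compressed representation (e.g. PCA or PLS scores) rather than raw channels.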
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring). Contents: Summary; Mandatory Funding; Energy Supply; Non-Defense Site Acceleration.
ARM - Historical Visitor Statistics
Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Visitor Statistics As a national user facility, ARM is required to report...
International Energy Statistics - EIA
Gasoline and Diesel Fuel Update (EIA)
International > International Energy Statistics International Energy Statistics Petroleum Production | Annual Monthly/Quarterly Consumption | Annual Monthly/Quarterly Capacity | Bunker Fuels | Stocks | Annual Monthly/Quarterly Reserves | Imports | Annual Monthly/Quarterly Exports | CO2 Emissions | Heat Content Natural Gas All Flows | Production | Consumption | Reserves | Imports | Exports | Carbon Dioxide Emissions | Heat Content Coal All Flows | Production | Consumption | Reserves | Imports
Studying Resist Stochastics with the Multivariate Poisson Propagation Model
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Naulleau, Patrick; Anderson, Christopher; Chao, Weilun; Bhattarai, Suchit; Neureuther, Andrew
2014-01-01
Progress in the ultimate performance of extreme ultraviolet resists has arguably decelerated in recent years, suggesting an approach to stochastic limits in both photon counts and material parameters. Here we report on the performance of a variety of leading extreme ultraviolet resists, both with and without chemical amplification. The measured performance is compared to stochastic modeling results using the Multivariate Poisson Propagation Model. The results show that the best materials are indeed nearing modeled performance limits.
Various forms of indexing HDMR for modelling multivariate classification problems
Aksu, a?r?; Tunga, M. Alper
2014-12-10
The Indexing HDMR method was recently developed for modelling multivariate interpolation problems. The method uses the Plain HDMR philosophy, partitioning the given multivariate data set into lower-variate data sets and then constructing an analytical structure through these partitioned data sets to represent the given multidimensional problem. Indexing HDMR makes HDMR applicable to classification problems with real-world data. In most cases we do not know all possible class values in the domain of the given problem; that is, we have a non-orthogonal data structure. However, Plain HDMR requires an orthogonal data structure in the problem to be modelled. The main idea of this work is therefore to offer various forms of Indexing HDMR that successfully model these real-life classification problems. To test the different forms, several well-known multivariate classification problems from the UCI Machine Learning Repository were used, and the observed accuracies lie between 80% and 95%, which is very satisfactory.
Statistical design of a uranium corrosion experiment
Wendelberger, Joanne R; Moore, Leslie M
2009-01-01
This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
Scalable k-means statistics with Titan.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2009-11-01
This report summarizes the existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
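For readers unfamiliar with the statistic itself, a minimal serial k-means (Lloyd's algorithm) can be sketched as follows; this Python illustration stands in for the report's C++/VTK engines and uses a deliberately simple deterministic seeding:

```python
import numpy as np

def kmeans(data, k, init_idx, iters=100):
    """Plain Lloyd's algorithm: alternate nearest-centroid assignment and
    centroid update until the centroids stop moving."""
    centroids = data[list(init_idx)].copy()  # simple deterministic seeding;
                                             # k-means++ is the usual choice
    for _ in range(iters):
        # Distance of every point to every centroid, then nearest assignment
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([data[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Two well-separated 2-D blobs
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
                 rng.normal(5.0, 0.5, (50, 2))])
centers, labels = kmeans(pts, k=2, init_idx=[0, 50])
```

The parallel engines in the report distribute the assignment step and merge per-process centroid sums; the serial core is the loop above.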
Candidate Assembly Statistical Evaluation
Energy Science and Technology Software Center (OSTI)
1998-07-15
The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Statistical methods for environmental pollution monitoring
Gilbert, R.O.
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books: for example, methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
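As an illustration of the Section 13.2 topic, a confidence interval for the mean of a lognormal distribution can be approximated with Cox's method; this sketch is one standard approximation and not necessarily the exact procedure the book presents:

```python
import math
import numpy as np

def lognormal_mean_ci(x, z=1.96):
    """Approximate 95% CI for the mean of a lognormal population (Cox's method).

    Works on the log scale: for Y = ln(X), E[X] = exp(mu + sigma^2/2), so the
    interval is built around ybar + s^2/2 and then exponentiated.
    """
    y = np.log(np.asarray(x, dtype=float))
    n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)
    center = ybar + s2 / 2
    half = z * math.sqrt(s2 / n + s2**2 / (2 * (n - 1)))
    return math.exp(center), math.exp(center - half), math.exp(center + half)

# Simulated pollution-like data: lognormal with mu=1.0, sigma=0.5,
# so the true mean is exp(1 + 0.125) ~ 3.08
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=1.0, sigma=0.5, size=200)
est, lo, hi = lognormal_mean_ci(sample)
```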
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
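The transmission of input variation through a response function is often approximated to first order by the delta method, Var[f(X)] ≈ (f'(μ))² Var[X]. A small sketch comparing that approximation against Monte Carlo (a generic illustration, not the presentation's own example):

```python
import numpy as np

# Response function and an input variable with known mean and spread
f = lambda x: x**2 + 3 * x
mu, sigma = 2.0, 0.1

# First-order (delta-method) approximation of the transmitted variance:
# Var[f(X)] ~ (f'(mu))^2 * sigma^2, with f'(x) = 2x + 3
fprime = 2 * mu + 3
var_delta = fprime**2 * sigma**2          # = 49 * 0.01 = 0.49

# Monte Carlo check of how much variance actually reaches the output
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, 100_000)
var_mc = f(x).var()
```

For this nearly linear case the two agree closely; strongly curved responses or large input spread require higher-order terms.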
Statistically significant relational data mining
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs: to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparing community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model; statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Neels, J.K.; Brown, F.C.
1992-09-01
The economic benefits of supplemental fuel injections depend, in part, on the coke replacement ratio. An assessment of the accuracy with which blast furnace coke rate may be measured and a determination of the key drivers of coke rate uncertainty are offered, to provide guidance for experiments in high-rate gas injection. Using statistical analysis tools, an expression for the measurement error associated with the various terms of blast furnace carbon balance is developed. Coke rate calculations based on the material balance are most sensitive to coke carbon content and to proper tracking of hot metal tapping schedule.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Columns: FY 2004 Comparable Approp / FY 2005 Comparable Approp / FY 2006 Request to Congress / FY 2006 vs. FY 2005. Discretionary Summary By Appropriation, Energy And Water Development. Appropriation Summary: Energy Programs, Energy supply. Operation and maintenance: 787,941 / 909,903 / 862,499 / -47,404 / -5.2%. Construction: 6,956
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Columns: FY 2011 Current Approp. / FY 2012 Enacted Approp. / FY 2013 Congressional Request / $ / %. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies. Appropriation Summary: Energy Programs. Energy efficiency and renewable energy: 1,771,721 / 1,809,638 / 2,337,000 / +527,362 / +29.1%. Electricity delivery and energy reliability:
Gunst, R. F.
2013-05-01
Phase 3 of the EPAct/V2/E-89 Program investigated the effects of 27 program fuels and 15 program vehicles on exhaust emissions and fuel economy. All vehicles were tested over the California Unified Driving Cycle (LA-92) at 75 degrees F. The program fuels differed in T50, T90, ethanol, Reid vapor pressure, and aromatics. The vehicles tested were new, low-mileage 2008 model year Tier 2 vehicles. A total of 956 test runs were made. Comprehensive statistical modeling and analyses were conducted on methane, carbon dioxide, carbon monoxide, fuel economy, non-methane hydrocarbons, non-methane organic gases, oxides of nitrogen, particulate matter, and total hydrocarbons. In general, the model fits determined that emissions and fuel economy were complicated functions of the five fuel parameters. An extensive evaluation of alternative model fits produced a number of competing models. Many of these alternatives produce similar estimates of mean emissions for the 27 program fuels but should be carefully evaluated before use with emerging fuels whose combinations of fuel parameters were not included here. The program includes detailed databases on each of the 27 program fuels run on each of the 15 vehicles.
Derrien, H.; Harvey, J.A.; Larson, N.M.; Leal, L.C.; Wright, R.Q.
2000-05-01
The average {sup 235}U neutron total cross sections were obtained in the energy range 2 keV to 330 keV from high-resolution transmission measurements of a 0.033 atom/b sample [1]. The experimental data were corrected for the contribution of isotope impurities and for resonance self-shielding effects in the sample. The results are in very good agreement with the experimental data of Poenitz et al. [4] in the energy range 40 keV to 330 keV and are the only accurate experimental data available in the energy range 2 keV to 40 keV; the ENDF/B-VI evaluated data are 1.7% larger. The SAMMY/FITACS code [2] was used for a statistical model analysis of the total cross section and selected fission cross section data in the energy range 2 keV to 200 keV. SAMMY/FITACS is an extended version of SAMMY that allows consistent analysis of the experimental data in the resolved and unresolved resonance regions. The Reich-Moore resonance parameters were obtained [3] from SAMMY Bayesian fits of high-resolution experimental neutron transmission and partial cross section data below 2.25 keV, and the corresponding average parameters and covariance data were used in the present work as input for the statistical model analysis of the high-energy range of the experimental data. The analysis shows that the average resonance parameters obtained in the unresolved resonance region are consistent with those obtained in the resolved energy region. Another important result is that the ENDF/B-VI capture cross section could be too small by more than 10% in the energy range 10 keV to 200 keV.
Computer, Computational, and Statistical Sciences
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
CCS Computer, Computational, and Statistical Sciences Computational physics, computer science, applied mathematics, statistics and the integration of large data streams are central ...
Topology for statistical modeling of petascale data.
Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice
2011-07-01
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
ORISE: Statistical Analyses of Worker Health
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Analyses Statistical analyses at the Oak Ridge Institute for Science and Education (ORISE) support ongoing programs involving medical surveillance of workers and other populations, as well as occupational epidemiology and research. ORISE emphasizes insightful and accurate analysis, practical interpretation of results and clear, easily read reports. All analyses are preceded by extensive data scrubbing and verification. ORISE's approach relies on applying appropriate methods of
Key World Energy Statistics-2010 | Open Energy Information
World Energy Statistics-2010. Agency/Company/Organization: International Energy Agency. Sector: Energy. Topics: Market analysis. Resource Type: Dataset, Maps. Website: www.iea.org...
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques that detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Primout, M.; Babonneau, D.; Jacquet, L.; Gilleron, F.; Peyrusse, O.; Fournier, K. B.; Marrs, R.; May, M. J.; Heeter, R. F.; Wallace, R. J.
2015-11-10
We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the N_{e}, T_{e} and T_{i} characteristics of the target plasma from the spectral analysis (ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.
International petroleum statistics report
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
AMERICAN STATISTICAL ASSOCIATION (ASA)
U.S. Energy Information Administration (EIA) Indexed Site
... team to EIA products and services, data analysis, forecast, education, and documentation. ... the use or misuse of review recommendations that should be considered in the project plan? ...
Broader source: Energy.gov [DOE]
Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators presentation at the April 2013 peer review meeting held in Denver, Colorado.
Statistical physics "Beyond equilibrium"
Ecke, Robert E
2009-01-01
The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.
Statistical methods for environmental pollution monitoring
Gilbert, R.O.
1986-01-01
This volume covers planning, design, and data analysis. It offers statistical methods for designing environmental sampling and monitoring programs, as well as for analyzing the resulting data. Statistical sample survey methods are applied in detail to problems of estimating average and total amounts of environmental pollution. The book also provides a broad array of statistical analysis methods for many purposes...numerous examples...three case studies...end-of-chapter questions...computer codes (showing what output looks like along with its interpretation)...a discussion of kriging methods for estimating pollution concentration contours over space and/or time...nomographs for determining the number of samples required to detect hot spots with specified confidence...and a description and tables for conducting Rosner's test to identify outlying (usually large) pollution measurements in a data set.
Exploratory Data analysis ENvironment eXtreme scale (EDENx)
Steed, Chad Allen
2015-07-01
EDENx is a multivariate data visualization tool that allows interactive user driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detail data investigations.
International petroleum statistics report
1996-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. Word oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report
1995-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1997-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report
1995-07-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1996-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.
Independent Statistics & Analysis Drilling Productivity Report
Gasoline and Diesel Fuel Update (EIA)
with +/- signs and color-coded arrows to highlight the growth or decline in oil (brown) or natural gas (blue). New-well oil/gas production per rig. Charts present historical...
Big-Data RHEED analysis for understanding epitaxial film growth processes
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V
2014-10-28
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite its widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for the growth of La{sub x}Ca{sub 1-x}MnO{sub 3} films grown on etched (001) SrTiO{sub 3} substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors and the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big-data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, they open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
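The PCA step of such an analysis can be sketched on a synthetic stand-in for a RHEED image sequence; the data below are simulated noise patterns with a mid-sequence regime change, not real diffraction images, and the authors' actual pipeline is more involved:

```python
import numpy as np

# Synthetic stand-in for a RHEED sequence: 200 frames of 16x16 "images"
# (flattened to 256 pixels) whose mean pattern switches halfway through,
# mimicking a change of growth mode.
rng = np.random.default_rng(0)
pattern_a = rng.normal(0, 1, 256)
pattern_b = rng.normal(0, 1, 256)
frames = np.vstack([pattern_a + rng.normal(0, 0.3, (100, 256)),
                    pattern_b + rng.normal(0, 0.3, (100, 256))])

# PCA via SVD of the mean-centered frame matrix (frames x pixels)
X = frames - frames.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * S[:2]      # each frame projected onto the top 2 components

# A simple split on the first component recovers the regime change in time
# (a stand-in for the k-means clustering used in the paper)
labels = (scores[:, 0] > scores[:, 0].mean()).astype(int)
```

In the real analysis, `Vt` rows play the role of eigen-images and the score time series reveal when the growth behavior changes.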
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
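One of the Part 1 topics, comparison of two populations, can be illustrated in a few lines. The data, sample sizes, and 5% level below are illustrative, and Welch's t-test is one standard choice rather than necessarily the report's procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Two simulated measurement populations (e.g. readings from two instruments);
# the means, spreads, and sizes are made-up values.
a = rng.normal(10.0, 1.0, 80)
b = rng.normal(11.0, 1.0, 80)

# Welch's t-test: compares the two means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
reject = p_value < 0.05   # reject H0 (equal means) at the 5% level
```

With a one-sigma separation and 80 samples per group, the test comfortably rejects equality of means.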
ARM - Historical Field Campaign Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Field Campaign Statistics ARM Climate Research Facility users regularly conduct...
Angular-momentum nonclassicality by breaking classical bounds on statistics
Luis, Alfredo; Rivas, Angel
2011-10-15
We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimum and definite conclusions are obtained without evaluation of moments, or any other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.
Design and performance of a scalable, parallel statistics toolkit.
Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre
2010-11-01
Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last three years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics clearer. Furthermore, it employs a design pattern specifically targeted for distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question is continuous or quasi-diffuse, but do scale well when the data is discrete and compact. We confirm and extend our earlier results by establishing near-optimal scalability with up to 10,000 processes.
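The kind of single-pass, distributed-memory moment update such frameworks rely on can be sketched as a pairwise combine of per-process summaries. This is the standard update of Chan et al., not the toolkit's own code; the 8-way split stands in for MPI ranks:

```python
import numpy as np

def partial_moments(x):
    """Per-process summary: count, mean, and sum of squared deviations (M2)."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    return x.size, mean, np.sum((x - mean) ** 2)

def combine(a, b):
    """Merge two partial summaries (the reduction step in an MPI-style tree)."""
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta**2 * na * nb / n
    return n, mean, m2

rng = np.random.default_rng(1)
data = rng.normal(5.0, 2.0, 10_000)
chunks = np.array_split(data, 8)          # pretend: 8 processes
total = partial_moments(chunks[0])
for c in chunks[1:]:
    total = combine(total, partial_moments(c))
n, mean, m2 = total
variance = m2 / (n - 1)
```

The combined result matches a serial computation to floating-point precision, which is what makes the reduction safe to apply in any tree order.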
Moore named an American Statistical Association Fellow
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Moore named an American Statistical Association Fellow. The ASA inducted Leslie (Lisa) Moore as a Fellow at the 2014 Joint Statistical...
Lupoi, Jason S.; Smith-Moritz, Andreia; Singh, Seema; McQualter, Richard; Scheller, Henrik V.; Simmons, Blake A.; Henry, Robert J.
2015-07-10
Background: Slow-degrading, fossil fuel-derived plastics can have deleterious effects on the environment, especially marine ecosystems. The production of bio-based, biodegradable plastics from or in plants can assist in supplanting those manufactured using fossil fuels. Polyhydroxybutyrate (PHB) is one such biodegradable polyester that has been evaluated as a possible candidate for relinquishing the use of environmentally harmful plastics. Results: PHB, possessing similar properties to polyesters produced from non-renewable sources, has been previously engineered in sugarcane, thereby creating a high-value co-product in addition to the high biomass yield. This manuscript illustrates the coupling of a Fourier-transform infrared microspectrometer, equipped with a focal plane array (FPA) detector, with multivariate imaging to successfully identify and localize PHB aggregates. Principal component analysis imaging facilitated the mining of the abundant quantity of spectral data acquired using the FPA for distinct PHB vibrational modes. PHB was measured in the chloroplasts of mesophyll and bundle sheath cells, acquiescent with previously evaluated plant samples. Conclusion: This study demonstrates the power of IR microspectroscopy to rapidly image plant sections to provide a snapshot of the chemical composition of the cell. While PHB was localized in sugarcane, this method is readily transferable to other value-added co-products in different plants.
High Performance Multivariate Visual Data Exploration for Extremely Large Data
Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,
2008-08-22
One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.
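The histogram-driven subsetting idea (answer a range query by selecting whole bins rather than scanning raw rows, as index/query systems such as FastBit do) can be sketched with plain numpy. The particle variables and threshold are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
# Mock particle data from an accelerator simulation (synthetic values):
# transverse position x and longitudinal momentum px per particle.
px = rng.exponential(1.0, 100_000)
x = rng.normal(0.0, 1.0, 100_000)

# Build a histogram index over the momentum axis once...
counts, edges = np.histogram(px, bins=64)
bin_index = np.digitize(px, edges[1:-1])    # bin id (0..63) per particle

# ...then answer a range query ("high-momentum beam particles") by
# selecting whole bins, mimicking index/query-accelerated subsetting.
threshold = 4.0
hot_bins = np.nonzero(edges[:-1] >= threshold)[0]   # bins entirely above cut
mask = np.isin(bin_index, hot_bins)
subset = x[mask]
```

Selecting only bins whose left edge clears the cut guarantees no false positives at the cost of bin-granularity false negatives near the threshold, the usual trade-off of bin-level indexes.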
Topological Cacti: Visualizing Contour-based Statistics
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2011-05-26
Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology, often represented in the form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as the width of the cactus trunk and the length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.
2012-02-01
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasing complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
Key China Energy Statistics 2011
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-01-15
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Key China Energy Statistics 2012
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-05-01
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Exploratory Data analysis ENvironment eXtreme scale (EDENx)
Energy Science and Technology Software Center (OSTI)
2015-07-01
EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.
QUANTUM MECHANICS WITHOUT STATISTICAL POSTULATES
Geiger, G.; et al.
2000-11-01
The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way without a reduction postulate. Due to the chaotic motion of the hidden classical particle all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single system theory.
Ideas for Effective Communication of Statistical Results
Anderson-Cook, Christine M.
2015-03-01
Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.
IEA Energy Statistics | Open Energy Information
Tool Summary - Name: IEA Energy Statistics; Agency/Company/Organization: International Energy Agency; Sector: Energy; Topics: GHG...
VTPI-Transportation Statistics | Open Energy Information
Area: Transportation; Resource Type: Dataset; Website: www.vtpi.org/tdm/tdm80.htm; Cost: Free. References: VTPI-Transportation Statistics...
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE...
Office of Scientific and Technical Information (OSTI)
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014
Statistical Transmutation in Floquet Driven Optical Lattices...
Office of Scientific and Technical Information (OSTI)
Statistical Transmutation in Floquet Driven Optical Lattices. This content will become publicly available on November 3, 2016.
ARM - Lesson Plans: Historical Climate Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Lesson Plans: Historical Climate Statistics. Objective: The ...
On the ability of Order Statistics to distinguish different models for continuum gamma decay
Sandoval, J. J.; Cristancho, F.
2007-10-26
A simulation procedure to calculate some parameters important to the application of Order Statistics in the analysis of continuum gamma decay is presented.
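As a small worked example of the order-statistics machinery simulated here: the k-th smallest of n independent uniform draws follows a Beta(k, n+1-k) law, with mean k/(n+1), which a quick Monte Carlo check reproduces (the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, trials = 10, 3, 20_000

# Draw `trials` samples of size n from U(0,1) and keep the k-th smallest.
samples = np.sort(rng.random((trials, n)), axis=1)
kth = samples[:, k - 1]

# Theory: the k-th order statistic of n uniforms is Beta(k, n+1-k),
# whose mean is k / (n + 1).
empirical_mean = kth.mean()
theoretical_mean = k / (n + 1)
```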
Moore honored with American Statistical Association award
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Lisa Moore is the recipient of the 2013 Don Owen Award, presented by the American Statistical Association, San Antonio Chapter. May 24, 2013. The American Statistical Association (ASA) is the world's largest community of statisticians. It was founded in Massachusetts in 1839. Leslie "Lisa" Moore of the Laboratory's Statistical
Office of Survey Development and Statistical Integration
U.S. Energy Information Administration (EIA) Indexed Site
Steve Harvey, April 27, 2011, Washington, D.C. Tough Choices in U.S. EIA's Data Programs. Agenda: Office of Oil, Gas, and Coal Supply Statistics; Office of Petroleum and Biofuels Statistics; Office of Electricity, Renewables, and Uranium Statistics; Office of Energy Consumption and Efficiency Statistics; Office of Survey Development and Statistical Integration. Coal Data Collection Program: James Kendell, Washington, DC, April 27,
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics, and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to the economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
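The lectures' forward/inverse contrast can be made concrete in a few lines: an a priori dice probability versus a maximum-likelihood estimate from observed rolls. The roll data below are invented for illustration:

```python
from fractions import Fraction
from itertools import product

# Forward problem (probability): a priori chance that two fair dice sum to 7.
outcomes = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))

# Inverse problem (statistics): given observed rolls of a possibly unfair
# die, infer the die. The maximum-likelihood estimate of each face's
# probability is simply its observed frequency.
rolls = [1, 3, 3, 6, 3, 2, 3, 5, 3, 4]
mle = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
```

Here the forward answer is exact (1/6), while the inverse answer is an estimate whose quality depends on the amount of data, which is precisely why the lectures call the inverse problem harder.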
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators. Project Officer: Eric Hass. Total Project Funding: $999,000. April 24, 2013. Craig Cooper, Larry Hull, Idaho National Laboratory. This presentation does not contain any proprietary, confidential, or otherwise restricted information. Relevance/Impact of Research: Geothermometry enables estimation of
A new subgrid-scale representation of hydrometeor fields using a multivariate PDF
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Griffin, Brian M.; Larson, Vincent E.
2016-06-03
The subgrid-scale representation of hydrometeor fields is important for calculating microphysical process rates. In order to represent subgrid-scale variability, the Cloud Layers Unified By Binormals (CLUBB) parameterization uses a multivariate probability density function (PDF). In addition to vertical velocity, temperature, and moisture fields, the PDF includes hydrometeor fields. Previously, hydrometeor fields were assumed to follow a multivariate single lognormal distribution. Now, in order to better represent the distribution of hydrometeors, two new multivariate PDFs are formulated and introduced. The new PDFs represent hydrometeors using either a delta-lognormal or a delta-double-lognormal shape. The two new PDF distributions, plus the previous single lognormal shape, are compared to histograms of data taken from large-eddy simulations (LESs) of a precipitating cumulus case, a drizzling stratocumulus case, and a deep convective case. Finally, the warm microphysical process rates produced by the different hydrometeor PDFs are compared to the same process rates produced by the LES.
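A delta-lognormal hydrometeor PDF of the kind introduced here can be sampled directly: a point mass at zero for the precipitation-free fraction plus a lognormal within precipitation. The parameter values below are illustrative, not taken from CLUBB:

```python
import numpy as np

rng = np.random.default_rng(11)

# Delta-lognormal model for a hydrometeor field (e.g. rain water mixing
# ratio): a point mass at zero plus a lognormal inside precipitation.
f_precip = 0.3          # precipitating fraction of the grid box (assumed)
mu, sigma = -9.0, 1.2   # lognormal parameters of log mixing ratio (assumed)

n = 100_000
is_precip = rng.random(n) < f_precip
q = np.where(is_precip, rng.lognormal(mu, sigma, n), 0.0)

# The mixture mean has the closed form f * exp(mu + sigma^2 / 2), which is
# the kind of analytic moment microphysics schemes exploit.
analytic_mean = f_precip * np.exp(mu + sigma**2 / 2)
sample_mean = q.mean()
```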
Tomography and weak lensing statistics
Munshi, Dipak; Coles, Peter; Kilbinger, Martin
2014-04-01
We provide generic predictions for the lower order cumulants of weak lensing maps, and their correlators, for tomographic bins as well as in three dimensions (3D). Using the small-angle approximation, we derive the corresponding one- and two-point probability distribution functions for the tomographic maps from different bins and for 3D convergence maps. The modelling of weak lensing statistics is obtained by adopting a detailed prescription for the underlying density contrast that involves a hierarchical ansatz and a lognormal distribution. We study the dependence of our results on cosmological parameters and on source distributions corresponding to realistic surveys such as LSST and DES. We briefly outline how photometric redshift information can be incorporated in our results. We also show how topological properties of convergence maps can be quantified using our results.
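The lower-order cumulants in question reduce to combinations of central moments. A sketch on a mock lognormal "convergence" field (shifted to zero mean, echoing the lognormal prescription; all parameters are illustrative) shows the estimators:

```python
import numpy as np

rng = np.random.default_rng(5)

# Mock non-Gaussian convergence field: a shifted lognormal with zero mean.
kappa = rng.lognormal(0.0, 0.5, 500_000)
kappa -= kappa.mean()

# Lower-order connected cumulants from central moments:
#   k2 = m2,  k3 = m3,  k4 = m4 - 3*m2^2.
m2 = np.mean(kappa**2)
m3 = np.mean(kappa**3)
m4 = np.mean(kappa**4)
k2, k3, k4 = m2, m3, m4 - 3.0 * m2**2

# Reduced skewness S3 = k3 / k2^2, a standard weak-lensing statistic.
S3 = k3 / k2**2
```

For a lognormal field both the third and fourth connected cumulants are positive, which the estimates reflect.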
EERE Statistics Archive | Department of Energy
Broader source: Energy.gov (indexed) [DOE]
This page provides EERE Web statistics for all office and corporate websites that opted to use EERE's analytics account. Webtrends statistics for Fiscal Year 2009 (FY09) to FY11 ...
DOE - NNSA/NFO -- FOIA Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
FOIA Statistics. U.S. DOE/NNSA - Nevada Field Office. The FOIA has become a useful tool for researchers, news media, and the general public. In ...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
I/O Statistics Last 30 Days. These plots show the daily statistics for the last 30 days for the storage systems at NERSC in terms of the amount of data transferred and the number of files transferred. Daily I/O Volume. Daily I/O Count.
STORM: A STatistical Object Representation Model
Rafanelli, M. ); Shoshani, A. )
1989-11-01
In this paper we explore the structure and semantic properties of the entities stored in statistical databases. We call such entities "statistical objects" (SOs) and propose a new "statistical object representation model" based on a graph representation. We identify a number of SO representational problems in current models and propose a methodology for their solution. 11 refs.
Quantum chaos and statistical nuclear physics
Not Available
1986-01-01
This book contains 33 selections. Some of the titles are: Chaotic motion and statistical nuclear theory; Test of spectrum and strength fluctuations with proton resonances; Nuclear level densities and level spacing distributions; Spectral statistics of scale invariant systems; and Antiunitary symmetries and energy level statistics.
Statistics and Discoveries at the LHC (3/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
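The p-value and discovery-significance machinery these lectures cover can be illustrated with a textbook Poisson counting experiment. The background rate and observed count below are invented; scipy supplies the tail probabilities:

```python
from scipy import stats

b = 100.0   # expected background events (illustrative)
n = 130     # observed events (illustrative)

# p-value under the background-only hypothesis: P(N >= n) for N ~ Poisson(b),
# converted to a one-sided Gaussian significance Z.
p = stats.poisson.sf(n - 1, b)   # survival function: P(N > n-1) = P(N >= n)
z = stats.norm.isf(p)            # equivalent number of sigmas
discovery = z > 5.0              # conventional 5-sigma discovery threshold
```

A 30-event excess over an expectation of 100 lands near 3 sigma, interesting but short of the discovery threshold, illustrating why the convention is so demanding.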
Statistics and Discoveries at the LHC (1/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai, Alex; Banta, Larry; Tucker, D.A.; Gemmen, R.S.
2008-06-01
This paper presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation comprises a physical simulation of a 300 kW fuel cell coupled to a 120 kW auxiliary power unit single spool gas turbine. The facility provides for the testing and simulation of different fuel cell models that in turn help identify the key issues encountered in the transient operation of such systems. An empirical model of the facility, consisting of a simulated fuel cell cathode volume and balance of plant components, is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in transfer function format. A multivariate system is then built from the individual transfer functions, creating a matrix that serves as the nominal plant in an H-infinity robust control algorithm. The controller's main objective is to track and maintain hybrid operational constraints on the fuel cell's cathode airflow and on the turbomachinery states of temperature and speed under transient disturbances. This algorithm is then tested on a Simulink/MATLAB platform for various perturbations of load and fuel cell heat effluence.
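The frequency-response-to-transfer-function step can be illustrated with scipy.signal for a single made-up SISO channel; the gain and time constant below are not the plant's identified values:

```python
import numpy as np
from scipy import signal

# One illustrative SISO channel of the kind identified from frequency-response
# data: a first-order lag G(s) = K / (tau*s + 1). K and tau are assumptions.
K, tau = 2.0, 5.0
G = signal.TransferFunction([K], [tau, 1.0])

# Bode data of the kind used to assemble a MIMO nominal-plant matrix.
w = np.logspace(-3, 1, 200)                 # rad/s
w, mag_db, phase_deg = signal.bode(G, w)

dc_gain_db = mag_db[0]                      # ~ 20*log10(K) at low frequency
```

In an identification workflow this runs in reverse: measured magnitude and phase curves are fitted to low-order transfer functions, which are then stacked into the nominal plant matrix for the robust-control synthesis.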
Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik; Sun, Xin; Lin, Guang
2014-05-16
Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. The computational cost of the simulations at high resolution is often prohibitive, making parametric studies at different input values impractical. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP), as an extension of the Bayesian treed Gaussian process (BTGP), in order to model and evaluate a multivariate process. A suitable choice of covariance function and prior distributions facilitates the different Markov chain Monte Carlo (MCMC) moves. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and the BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
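The sequential "most informative input" idea can be sketched with a plain-numpy Gaussian process: evaluate the predictive variance over a candidate grid and pick its maximizer. The kernel, length scale, and variance criterion are deliberate simplifications of the BTMGP machinery, not the paper's method:

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel (an assumed, illustrative choice)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def next_design_point(x_train, candidates, noise=1e-6):
    """Pick the candidate with the largest GP predictive variance,
    a simple active-learning stand-in for the paper's criterion."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(candidates, x_train)
    # Predictive variance: k(x*,x*) - k_s K^{-1} k_s^T (prior variance = 1).
    var = 1.0 - np.einsum('ij,ij->i', Ks @ np.linalg.inv(K), Ks)
    return candidates[np.argmax(var)]

x_train = np.array([0.0, 0.5, 1.0])          # already-run simulations
candidates = np.linspace(0.0, 1.0, 101)      # possible next inputs
x_next = next_design_point(x_train, candidates)
```

With training points at 0, 0.5, and 1, the variance peaks near the gaps, so the selected point lands close to 0.25 or 0.75.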
On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization
Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang
2015-02-01
The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of the BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to linear model of coregionalization (LMC) cross-covariances. Different strategies have been developed to improve the MCMC mixing and to invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with the existing multiple BTGP and BTMGP in test cases and in a multiphase flow computer experiment in a full-scale regenerator of a carbon capture unit. The BTMGP with the LMC cross-covariance predicted the computer experiments better than the existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with the LMC cross-covariance function.
Statistical assessment of Monte Carlo distributional tallies
Kiedrowski, Brian C; Solomon, Clell J
2010-12-09
Four tests are developed to assess the statistical reliability of distributional or mesh tallies. To this end, the relative variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality.
Statistics for characterizing data on the periphery
Theiler, James P; Hush, Donald R
2010-01-01
We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
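The ellipsoidal models discussed above score a sample by its Mahalanobis distance from the background mean and covariance. A minimal anomaly-scoring sketch, using the ordinary sample covariance rather than the peripheral alternatives the abstract recommends, on synthetic data:

```python
import numpy as np

def mahalanobis_sq(X, mean, cov):
    """Squared Mahalanobis distance of each row of X from an ellipsoidal model."""
    diff = X - mean
    sol = np.linalg.solve(cov, diff.T)          # cov^{-1} (x - mean)
    return np.einsum('ij,ji->i', diff, sol)     # (x - mean)^T cov^{-1} (x - mean)

rng = np.random.default_rng(0)
background = rng.normal(size=(500, 3))          # synthetic clutter
mean = background.mean(axis=0)
cov = np.cov(background, rowvar=False)

# A central point and a far-out candidate target
scores = mahalanobis_sq(np.array([[0.0, 0.0, 0.0],
                                  [5.0, 5.0, 5.0]]), mean, cov)
```

Samples whose score exceeds a chosen threshold are flagged as anomalous; the choice of covariance estimator controls how well the ellipsoid fits the periphery.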
Statistics for Industry Groups and Industries, 2003
2009-01-18
Statistics for the U.S. Department of Commerce including types of manufacturing, employees, and products as outlined in the Annual Survey of Manufacturers (ASM).
Statistical methods for nuclear material management
Bowen, W.M.; Bennett, C.A.
1988-12-01
This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.
Beyth, M.; Broxton, D.; McInteer, C.; Averett, W.R.; Stablein, N.K.
1980-06-01
Multivariate statistical analysis to support the National Uranium Resource Evaluation and to evaluate strategic and other commercially important mineral resources was carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Montrose quadrangle, Colorado. The analysis suggests that: (1) the southern Colorado Mineral Belt is an area favorable for uranium mineral occurrences; (2) carnotite-type occurrences are likely in the nose of the Gunnison Uplift; (3) uranium mineral occurrences may be present along the western and northern margins of the West Elk crater; (4) a base-metal mineralized area is associated with the Uncompahgre Uplift; and (5) uranium and base metals are associated in some areas, and both are often controlled by faults trending west-northwest and north.
Federal offshore statistics: leasing - exploration - production - revenue
Essertier, E.P.
1984-01-01
Federal Offshore Statistics is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the Federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Statistics are presented under the following topics: (1) highlights, (2) leasing, (3) exploration and development, (4) production and revenue, (5) federal offshore production by ranking operator, 1983, (6) reserves and undiscovered recoverable resources, and (7) oil pollution in the world's oceans.
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai, Alex; Banta, Larry; Tucker, David; Gemmen, Randall
2010-08-01
This work presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation, built by the National Energy Technology Laboratory, comprises a physical simulation of a 300 kW fuel cell coupled to a 120 kW auxiliary-power-unit single-spool gas turbine. The public facility provides for the testing and simulation of different fuel cell models, which in turn help identify the key difficulties encountered in the transient operation of such systems. An empirical model of the facility, comprising a simulated fuel cell cathode volume and balance-of-plant components, is derived from frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in transfer function form. A multivariate system is then built from the individual transfer functions, creating a matrix that serves as the nominal plant in an H-infinity robust control algorithm. The controller's main objective is to track and maintain hybrid operational constraints on the fuel cell's cathode airflow and on the turbomachinery states of temperature and speed under transient disturbances. The algorithm is then tested on a Simulink/MATLAB platform for various perturbations of load and fuel cell heat effluence. As a complementary tool to the aforementioned empirical plant, a nonlinear analytical model faithful to the existing process and instrumentation arrangement is designed and evaluated in the Simulink environment. This parallel task is intended to serve as a building block for scalable hybrid configurations that might require a more detailed nonlinear representation for a wide variety of controller schemes and hardware implementations.
U.S. Energy Information Administration (EIA) Indexed Site
Statistical Methodology of Estimating Petroleum Exports Using Data from U.S. Customs and Border Protection. August 31, 2016. Independent Statistics & Analysis, www.eia.gov, U.S. Department of Energy, Washington, DC 20585. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy.
Martin, Madhavi Z; Allman, Steve L; Brice, Deanne Jane; Martin, Rodger Carl; Andre, Nicolas O
2012-01-01
Laser-induced breakdown spectroscopy (LIBS) has been used to determine the limits of detection of strontium (Sr) and cesium (Cs), common nuclear fission products. Additionally, detection limits were determined for cerium (Ce), often used as a surrogate for radioactive plutonium in laboratory studies. Results were obtained using a laboratory instrument with a Nd:YAG laser at a fundamental wavelength of 1064 nm, frequency doubled to 532 nm, with an energy of 50 mJ/pulse. The data were compared for different concentrations of Sr and Ce dispersed in a CaCO3 (white) and a carbon (black) matrix. We have addressed the sampling errors, limits of detection, reproducibility, and accuracy of measurements as they relate to multivariate analysis in pellets doped with the different elements at various concentrations. These results demonstrate that the LIBS technique is inherently well suited for in situ analysis of nuclear materials in hot cells. Three key advantages are evident: (1) small samples (mg) can be evaluated; (2) nuclear materials can be analyzed with minimal sample preparation; and (3) samples can be analyzed remotely and very rapidly (milliseconds to seconds). Our studies also show that the methods can be made quantitative. Very robust multivariate models, derived from our previous research on wood and soil samples, have been used to provide quantitative measurement and statistical evaluation of complex materials.
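A common way to state a limit of detection for a calibrated emission line, sketched here as the generic 3-sigma rule rather than the authors' exact procedure, divides three times the blank standard deviation by the slope of the calibration curve. The concentrations, intensities, and blank noise below are invented illustration values:

```python
import numpy as np

# Hypothetical calibration: line intensity vs. Sr concentration (ppm)
conc = np.array([0.0, 10.0, 20.0, 50.0, 100.0])
intensity = np.array([12.0, 118.0, 230.0, 561.0, 1109.0])  # arbitrary units

# Slope and intercept of the linear calibration curve
slope, intercept = np.polyfit(conc, intensity, 1)

sigma_blank = 4.0                  # std. dev. of repeated blank shots (assumed)
lod = 3.0 * sigma_blank / slope    # 3-sigma limit of detection, ppm
```

The same arithmetic applies per analyte and per matrix; in practice the blank noise and slope both depend on the matrix (CaCO3 vs. carbon), which is why the abstract compares the two.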
Security of statistical data bases: invasion of privacy through attribute correlational modeling
Palley, M.A.
1985-01-01
This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical data base. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical data base represents real world statistical phenomena. As such, ACM assumes correlational behavior among the data base attributes. ACM proceeds to compromise confidential information through creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical data base may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic data base, created through legitimate queries of the actual data base, and through proportional random variation of responses to these queries. The synthetic data base is constructed to resemble the actual data base as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic data base, and utilizes the derived model to estimate confidential information in the actual data base.
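The compromise mechanism described above can be sketched in a few lines: fit a regression on released (or synthetically reconstructed) attributes and use it to estimate a target's confidential value. The attributes, coefficients, and noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Public (non-confidential) attributes obtainable via legitimate queries
age = rng.uniform(25, 65, n)
years_service = age - 22 + rng.normal(0, 2, n)
# Confidential attribute, correlated with the public ones (synthetic ground truth)
salary = 20000 + 900 * age + 400 * years_service + rng.normal(0, 5000, n)

# The attacker fits a regression model on the (synthetic) data base ...
X = np.column_stack([np.ones(n), age, years_service])
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)

# ... and uses it to estimate a target individual's confidential salary
target = np.array([1.0, 40.0, 18.0])   # intercept, age, years of service
estimate = target @ beta
```

The stronger the correlational structure among attributes, the tighter the attacker's estimate, which is exactly the privacy risk ACM quantifies.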
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-06-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
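The derived statistics named above follow directly from a contingency table of joint counts. A serial (non-parallel) sketch computing mutual information and the chi-squared independence statistic from two categorical sequences:

```python
import numpy as np
from collections import Counter

def contingency_stats(x, y):
    """Mutual information (nats) and chi-squared statistic from joint counts."""
    n = len(x)
    joint = Counter(zip(x, y))       # the contingency table
    px, py = Counter(x), Counter(y)  # marginal counts
    mi, chi2 = 0.0, 0.0
    for a in px:
        for b in py:
            c = joint.get((a, b), 0)
            p_a, p_b = px[a] / n, py[b] / n
            expected = n * p_a * p_b            # count expected under independence
            chi2 += (c - expected) ** 2 / expected
            if c:
                mi += (c / n) * np.log((c / n) / (p_a * p_b))
    return mi, chi2

x = ['a', 'a', 'b', 'b'] * 50
y = ['u', 'u', 'v', 'v'] * 50   # y is perfectly determined by x
mi, chi2 = contingency_stats(x, y)
```

In the parallel setting the paper addresses, the cost lies in merging these tables across processors, since the table (unlike a fixed set of moments) grows with the number of distinct categories.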
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2010-03-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
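Criteria of the kind examined above can be as simple as summary statistics compared between observed and simulated series. A sketch with three such statistics; the choice of statistics and the synthetic "day" below are illustrative assumptions, not the paper's actual criteria:

```python
import numpy as np

def series_stats(irr):
    """Three simple summary statistics for an irradiance time series."""
    ramps = np.diff(irr)
    return {
        'mean': irr.mean(),                  # average irradiance level
        'ramp_std': ramps.std(ddof=1),       # variability of step-to-step changes
        'autocorr1': np.corrcoef(irr[:-1], irr[1:])[0, 1],  # lag-1 persistence
    }

t = np.linspace(0, np.pi, 289)              # one synthetic "day" at 5-min steps
rng = np.random.default_rng(5)
observed = 1000 * np.sin(t) + rng.normal(0, 20, t.size)   # W/m^2, made up
simulated = 1000 * np.sin(t) + rng.normal(0, 20, t.size)
```

A simulation would then be judged acceptable when its statistics fall within agreed tolerances of the observed values, with tolerances chosen from the downstream PV analysis.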
FY 2015 Statistical Table by Appropriation
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands, OMB scoring): FY 2013 current, FY 2014 enacted, and FY 2015 congressional request by appropriation; e.g., Energy Efficiency and Renewable Energy: 1,691,757 (FY 2013); 1,900,641 (FY 2014).
FY 2015 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Organization (dollars in thousands, OMB scoring): FY 2013 current, FY 2014 enacted, and FY 2015 congressional request by organization; e.g., National Nuclear Security Administration Weapons Activities: 6,966,855 (FY 2013); 7,781,000 (FY 2014); 8,314,902 (FY 2015).
Statistical Fault Detection & Diagnosis Expert System
Energy Science and Technology Software Center (OSTI)
1996-12-18
STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that the statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
Statistical Analysis of Variation in the Human Plasma Proteome
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Corzett, Todd H.; Fodor, Imola K.; Choi, Megan W.; Walsworth, Vicki L.; Turteltaub, Kenneth W.; McCutchen-Maloney, Sandra L.; Chromy, Brett A.
2010-01-01
Quantifying the variation in the human plasma proteome is an essential prerequisite for disease-specific biomarker detection. We report here on the longitudinal and individual variation in human plasma characterized by two-dimensional difference gel electrophoresis (2-D DIGE), using plasma samples from eleven healthy subjects collected three times over a two-week period. Fixed-effects modeling was used to remove dye and gel variability. Mixed-effects modeling was then used to quantitate the sources of proteomic variation. The subject-to-subject variation represented the largest variance component, while the time-within-subject variation was comparable to the experimental variation found in a previous technical variability study in which one human plasma sample was processed eight times in parallel and each was then analyzed by 2-D DIGE in triplicate. Here, 21 protein spots had larger than 50% CV, suggesting that these proteins may not be appropriate as biomarkers and should be carefully scrutinized in future studies. Seventy-eight protein spots showing differential protein levels between different individuals or individual collections were identified by mass spectrometry and further characterized using hierarchical clustering. The results present a first step toward understanding the complexity of longitudinal and individual variation in the human plasma proteome, and provide a baseline for improved biomarker discovery.
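The 50% CV screen mentioned above is a simple computation: the coefficient of variation of each protein spot across all gels. A sketch on synthetic spot intensities; the layout of 11 subjects by 3 collections and the per-spot noise levels are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic spot intensities: 11 subjects x 3 collections = 33 gels, 5 spots.
# Per-spot standard deviations are chosen to span stable to highly variable.
spots = rng.normal(loc=100.0,
                   scale=[2.0, 5.0, 10.0, 30.0, 100.0],
                   size=(33, 5))

cv = 100.0 * spots.std(axis=0, ddof=1) / spots.mean(axis=0)  # CV in percent
stable = cv < 50.0   # spots passing the 50% CV screen
```

Spots failing the screen would be flagged as unreliable biomarker candidates, as in the study; a mixed-effects model would further split each spot's variance into subject, time, and experimental components.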
Statistics of particle time-temperature histories.
Hewson, John C.; Lignell, David O.; Sun, Guangyuan
2014-10-01
Particles in non-isothermal turbulent flow are subject to a stochastic environment that produces a distribution of particle time-temperature histories. This distribution is a function of the dispersion of the non-isothermal (continuous) gas phase and the distribution of particles relative to that gas phase. In this work we extend the one-dimensional turbulence (ODT) model to predict the joint dispersion of a dispersed particle phase and a continuous phase. The ODT model predicts the turbulent evolution of continuous scalar fields with a model for the cascade of fluctuations to smaller scales (the 'triplet map') at a rate that is a function of the fully resolved one-dimensional velocity field. Stochastic triplet maps also drive Lagrangian particle dispersion at finite Stokes numbers, with inertial and eddy trajectory-crossing effects included. Two distinct approaches to this coupling between triplet maps and particle dispersion are developed and implemented, along with a hybrid approach. An 'instantaneous' particle displacement model matches the tracer-particle limit and provides an accurate description of particle dispersion. A 'continuous' particle displacement model translates triplet maps into a continuous velocity field to which particles respond. Particles can alter the turbulence, and modifications to the stochastic rate expression are developed for two-way coupling between particles and the continuous phase. Each aspect of model development is evaluated in canonical flows (homogeneous turbulence, free-shear flows, and wall-bounded flows) for which quality measurements are available. ODT simulations of non-isothermal flows provide statistics for particle heating. These simulations show the significance of accurately predicting the joint statistics of particle and fluid dispersion. Inhomogeneous turbulence, coupled with the influence of the mean flow fields on particles of varying properties, alters particle dispersion.
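The 'triplet map' at the heart of ODT has a simple discrete form: a chosen segment of the 1-D field is compressed threefold into three images, with the middle image reversed, so the mapped field is a permutation of the original cells. A sketch of one common discretization of that map (not necessarily the exact variant used in this work):

```python
import numpy as np

def triplet_map(f, start, size):
    """Apply a discrete triplet map to f[start:start+size] (size divisible by 3).

    The segment is compressed threefold into three images; the middle image
    is reversed. The result is a permutation of the original cells, so all
    moments of the field within the segment are conserved.
    """
    assert size % 3 == 0
    seg = f[start:start + size]
    mapped = np.concatenate([seg[0::3],          # first image
                             seg[1::3][::-1],    # middle image, reversed
                             seg[2::3]])         # third image
    out = f.copy()
    out[start:start + size] = mapped
    return out

f = np.linspace(0.0, 1.0, 12)   # a toy 1-D scalar field
g = triplet_map(f, 3, 6)        # one stochastic "eddy" on cells 3..8
```

Because the map only rearranges cells, conserved quantities are preserved exactly while gradients within the eddy sharpen, which is what mimics the turbulent cascade.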
MAVTgsa: An R Package for Gene Set (Enrichment) Analysis
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; Chen, James J.
2014-01-01
Gene set analysis methods aim to determine whether an a priori defined set of genes shows statistically significant difference in expression on either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.
Complex statistics and diffusion in nonlinear disordered particle chains
Antonopoulos, Ch. G.; Bountis, T.; Skokos, Ch.; Drossos, L.
2014-06-15
We investigate dynamically and statistically diffusive motion in a Klein-Gordon particle chain in the presence of disorder. In particular, we examine a low energy (subdiffusive) and a higher energy (self-trapping) case and verify that subdiffusive spreading is always observed. We then carry out a statistical analysis of the motion, in both cases, in the sense of the Central Limit Theorem and present evidence of different chaos behaviors for various groups of particles. Integrating the equations of motion for times as long as 10^9, our probability distribution functions always tend to Gaussians and show that the dynamics does not relax onto a quasi-periodic Kolmogorov-Arnold-Moser torus and that diffusion continues to spread chaotically for arbitrarily long times.
Statistical physics on the light-front
Raufeisen, J.
2005-06-14
The formulation of statistical physics using light-front quantization, instead of conventional equal-time boundary conditions, has important advantages for describing relativistic statistical systems, such as heavy ion collisions. We develop light-front field theory at finite temperature and density, with special attention to Quantum Chromodynamics. We construct the most general form of the statistical operator allowed by the Poincare algebra and introduce the chemical potential in a covariant way. In light-front quantization, the Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and do not lead to doublers in the transverse directions. A seminal property of light-front Green's functions is that they are related to parton densities in coordinate space. Namely, the diagonal and off-diagonal parton distributions measured in hard scattering experiments can be interpreted as light-front density matrices.
n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator
Energy Science and Technology Software Center (OSTI)
2012-09-12
nSIGHTS (n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator) is a comprehensive well test analysis software package. It provides a user interface, a well test analysis model, and many tools to analyze both field and simulated data. The well test analysis model simulates a single-phase, one-dimensional, radial/non-radial flow regime, with a borehole at the center of the modeled flow system. nSIGHTS solves the radially symmetric n-dimensional forward flow problem using a solver based on a graph-theoretic approach. The results of the forward simulation are pressure and flow rate, given all the input parameters. The parameter estimation portion of nSIGHTS uses a perturbation-based approach to interpret the best-fit well and reservoir parameters, given an observed dataset of pressure and flow rate.
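Parameter estimation against an observed pressure record, as nSIGHTS does with its perturbation-based approach, reduces in the simplest well-test case to fitting a drawdown model. A sketch using the Cooper-Jacob straight-line approximation, which is linear in ln(t); the coefficients and noise level are invented, and this is not nSIGHTS's actual solver:

```python
import numpy as np

# Synthetic observed drawdown following the Cooper-Jacob approximation:
# s(t) = a + b * ln(t), where b = Q / (4*pi*T) encodes transmissivity T.
rng = np.random.default_rng(3)
t = np.logspace(0, 3, 30)            # observation times, s
true_a, true_b = 0.8, 0.35           # "true" intercept and log-slope
s_obs = true_a + true_b * np.log(t) + rng.normal(0, 0.01, t.size)

# Best-fit parameters by linear least squares on the observed record
A = np.column_stack([np.ones(t.size), np.log(t)])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, s_obs, rcond=None)
```

A nonlinear forward model (as in nSIGHTS) replaces the closed-form line with a simulator run, and the least-squares fit with iterative perturbation of the parameters, but the objective, matching simulated to observed pressure, is the same.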
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1983-01-01
The statistics in this update of the Outer Continental Shelf Statistics publication document what has happened since federal leasing began on the Outer Continental Shelf (OCS) in 1954. Highlights note that of the 29.8 million acres actually leased from 175.6 million acres offered for leasing, 20.1% were in frontier areas. Total revenues for the 1954-1982 period were $58.9 billion with about 13% received in 1982. The book is divided into six parts covering highlights, leasing, exploration and development, production and revenue, reserves and undiscovered recoverable resources, and pollution problems from well and tanker accidents. 5 figures, 59 tables.
Statistical data of the uranium industry
1983-01-01
This report is a compendium of information relating to US uranium reserves and potential resources and to exploration, mining, milling, and other activities of the uranium industry through 1982. The statistics are based primarily on data provided voluntarily by the uranium exploration, mining and milling companies. The compendium has been published annually since 1968 and reflects the basic programs of the Grand Junction Area Office of the US Department of Energy. Statistical data obtained from surveys conducted by the Energy Information Administration are included in Section IX. The production, reserves, and drilling data are reported in a manner which avoids disclosure of proprietary information.
FRAMES Software System: Linking to the Statistical Package R
Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.
2006-12-11
This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.
Picard, R.R.
1987-01-01
Many aspects of the MUF-D statistic, used for verification of accountability data, have been examined in the safeguards literature. In this paper, basic MUF-D results are extended to more general environments than are usually considered. These environments include arbitrary measurement error structures, various sampling regimes that could be imposed by the inspectorate, and the attributes/variables framework.
Statistics of dislocation pinning at localized obstacles
Dutta, A.; Bhattacharya, M.; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
Multifragmentation: New dynamics or old statistics?
Moretto, L.G.; Delis, D.N.; Wozniak, G.J.
1993-10-01
The understanding of the fission process as it has developed over the last fifty years has been applied to multifragmentation. Two salient aspects have been discovered: (1) a strong decoupling of the entrance and exit channels, with the formation of well-characterized sources; (2) a statistical competition between two-, three-, four-, five-, ..., n-body decays.
Baseballs and Barrels: World Statistics Day
Broader source: Energy.gov [DOE]
Statistics don’t just help us answer trivia questions – they also help us make intelligent decisions. For example, if I heat my home with natural gas, I’m probably interested in what natural gas prices are likely to be this winter.
Summary Statistics for Homemade 'Play Dough' -- Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Martz, A; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough{trademark}-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU{sub D} at 100kVp to a low of about 1200 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 10. LLNL prepared about 50mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image
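The first-order statistics quoted above (mean, standard deviation, and entropy of the voxel LAC values, via a Gaussian KDE) can be sketched as follows; the data here are synthetic stand-ins, since the measured voxel values are not part of the abstract.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic stand-in for measured voxel LAC values (mean ~2700, sd ~12% of mean)
rng = np.random.default_rng(0)
lac = rng.normal(2700.0, 0.12 * 2700.0, size=5000)

mean, std = lac.mean(), lac.std()

# Gaussian KDE of the LAC distribution, then entropy of the smoothed density
kde = gaussian_kde(lac)
grid = np.linspace(lac.min(), lac.max(), 512)
p = kde(grid)
p /= p.sum()                                  # discretized probability mass
mask = p > 0
entropy_bits = -np.sum(p[mask] * np.log2(p[mask]))
```

The entropy here is that of the KDE-smoothed density discretized on a grid; the report's LMHU entropy values would depend on its own binning convention.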
Summary Statistics for Fun Dough Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a Play Dough{trademark}-like product, Fun Dough{trademark}, designated as PD. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2100 LMHU{sub D} at 100kVp to a low of about 1100 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 1% of the mean. The entropy covers the range from 3.9 to 4.6. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 8.5. LLNL prepared about 50mL of the Fun Dough{trademark} in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. Still, layers can plainly be seen in the reconstructed images, indicating that the bulk density of the material in the container is affected by voids and bubbles. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b
Masked Areas in Shear Peak Statistics: A Forward Modeling Approach...
Office of Scientific and Technical Information (OSTI)
Statistics Show Bearing Problems Cause the Majority of Wind Turbine Gearbox Failures
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
RITA-Bureau of Transportation Statistics | Open Energy Information
Fact #602: December 21, 2009 Freight Statistics by Mode, 2007...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Results from the 2007 Commodity ...
FY 2014 Budget Request Statistical Table | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Stats Table FY2014.pdf
Statistical and Domain Analytics Applied to PV Module Lifetime and Degradation Science
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
ARM Climate Modeling Best Estimate Lamont, OK Statistical Summary (ARMBE-CLDRAD SGPC1)
Office of Scientific and Technical Information (OSTI)
Physics-based statistical learning approach to mesoscopic model selection
Office of Scientific and Technical Information (OSTI)
Statistical approach to nuclear level density
Sen'kov, R. A.; Horoi, M.; Zelevinsky, V. G.
2014-10-15
We discuss the level density in a finite many-body system with strong interaction between the constituents. Our primary object of application is the atomic nucleus, but the same techniques can be applied to other mesoscopic systems. We calculate and compare nuclear level densities for given quantum numbers obtained by different methods: the nuclear shell model (the most successful microscopic approach), our main instrument, the moments method (a statistical approach), and the Fermi-gas model. The calculation with the moments method can use any shell-model Hamiltonian while excluding the spurious states of center-of-mass motion. Our goal is to investigate the statistical properties of nuclear level density, define its phenomenological parameters, and offer an affordable and reliable way of calculation.
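For context, the Fermi-gas baseline the authors compare against is commonly taken in the Bethe form ρ(E) = (√π/12) exp(2√(aE)) / (a^(1/4) E^(5/4)), with a the level-density parameter. A small sketch of this textbook formula (not the authors' moments-method code):

```python
import math

def fermi_gas_level_density(E, a):
    """Bethe Fermi-gas level density (per MeV) at excitation energy E (MeV),
    with level-density parameter a (per MeV). Textbook baseline only."""
    return (math.sqrt(math.pi) / 12.0) * math.exp(2.0 * math.sqrt(a * E)) / (a**0.25 * E**1.25)
```

The characteristic feature is the exp(2√(aE)) growth, which makes the level density rise extremely steeply with excitation energy.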
Robust statistical reconstruction for charged particle tomography
Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W
2013-10-08
Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The methods determine the probability distribution of charged particle scattering using a statistical multiple scattering model and compute a substantially maximum likelihood estimate of object volume scattering density using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented using a computer program which is executable on a computer.
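The ML/EM step can be illustrated with the generic expectation-maximization update for a linear Poisson model; this is a sketch of the standard MLEM iteration, not the patented reconstruction itself, and the system matrix A and data y below are hypothetical.

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Standard ML/EM update for y ~ Poisson(A @ x), x >= 0:
    x_j <- x_j * [sum_i A_ij * y_i / (A @ x)_i] / sum_i A_ij."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # per-voxel sensitivity
    for _ in range(n_iter):
        proj = A @ x                           # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y, dtype=float), where=proj > 0)
        x *= (A.T @ ratio) / np.where(sens > 0, sens, 1.0)
    return x

# Hypothetical 3-measurement, 2-voxel system with noiseless data
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
x_hat = mlem(A, A @ x_true)
```

For consistent noiseless data the iteration converges to the true densities; with Poisson-noisy data it converges to a maximum likelihood estimate, as the abstract describes.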
New paradigms for the statistics profession
Iman, R.L.
1995-02-01
This paper is a presentation made in support of the statistics profession. The field has had a major impact on most major fields of study presently undertaken, yet it is not perceived as an important or critical field of study. Nor is it a growth field, as witnessed by the almost level number of faculty and new PhDs produced over the past twenty years. The author argues that the profession must do a better job of selling itself to the students it educates: awaken them to the impact of statistics in their lives and their business worlds, so that they see beyond the formulae to the application of these principles.
Statistical data of the uranium industry
1980-01-01
This document is a compilation of historical facts and figures through 1979. These statistics are based primarily on information provided voluntarily by the uranium exploration, mining, and milling companies. The production, reserves, drilling, and production capability information has been reported in a manner which avoids disclosure of proprietary information. Only the totals for the $1.5 reserves are reported. Because of increased interest in higher cost resources for long range planning purposes, a section covering the distribution of $100 per pound reserves statistics has been newly included. A table of mill recovery ranges for the January 1, 1980 reserves has also been added to this year's edition. The section on domestic uranium production capability has been deleted this year but will be included next year. The January 1, 1980 potential resource estimates are unchanged from the January 1, 1979 estimates.
Workforce Statistics | National Nuclear Security Administration | (NNSA)
National Nuclear Security Administration (NNSA)
Workforce Statistics The Semi-Annual Report provides age, disability, diversity, education, pay plan, retirement eligibility, years of service, and veteran's status data. The Year-End reports include the semi-annual information, as well as information on hires, separations, promotions, and historical data for those organizations where it is available, for five years. National Nuclear Security Administration Kansas City Field Office NA 1 - Immediate Office of the Administrator Livermore Field
Statistical data of the uranium industry
1981-01-01
Data are presented on US uranium reserves, potential resources, exploration, mining, drilling, milling, and other activities of the uranium industry through 1980. The compendium reflects the basic programs of the Grand Junction Office. Statistics are based primarily on information provided by the uranium exploration, mining, and milling companies. Data on commercial U/sub 3/O/sub 8/ sales and purchases are included. Data on non-US uranium production and resources are presented in the appendix. (DMC)
FY 2017 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Organization (dollars in thousands, OMB scoring) -- FY 2017 Congressional Budget Justification
Discretionary Summary By Organization -- National Nuclear Security Administration:
Weapons Activities: FY 2015 Enacted Approp. 8,180,359; FY 2015 Current Approp. 8,180,609; FY 2016 Enacted Approp. 8,846,948; FY 2017 Congressional Request 9,243,147 (+396,199, +4.5%)
Defense Nuclear
Thermal battery statistics and plotting programs
Scharrer, G.L.
1990-04-01
Thermal battery functional test data are stored in an HP3000 minicomputer operated by the Power Sources Department. A program was written to read data from a battery data base, compute simple statistics (mean, minimum, maximum, standard deviation, and K-factor), print out the results, and store the data in a file for subsequent plotting. A separate program was written to plot the data. The programs were written in the Pascal programming language. 1 tab.
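The statistics the program computes are straightforward; a minimal modern equivalent is sketched below (the abstract does not define its K-factor, so that quantity is omitted rather than guessed at).

```python
import statistics

def battery_summary(values):
    """Mean, minimum, maximum, and standard deviation of functional test
    data, mirroring the simple statistics the Pascal program reported.
    (The K-factor is omitted: its definition is not given in the abstract.)"""
    return {
        "mean": statistics.fmean(values),
        "min": min(values),
        "max": max(values),
        "stdev": statistics.stdev(values),
    }

summary = battery_summary([10.2, 10.5, 9.9, 10.1, 10.3])
```

The summary dictionary can then be written to a file for a separate plotting step, matching the two-program split described above.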
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1984-09-01
This publication is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Some of the highlights are: of the 329.5 million acres offered for leasing, 37.1 million acres were actually leased; total revenues for the 1954 to 1983 period were $68,173,112,563 and for 1983 $9,161,435,540; a total of 22,095 wells were drilled in federal waters and 10,145 wells were drilled in state waters; from 1954 through 1983, federal offshore areas produced 6.4 billion barrels of oil and condensate, and 62.1 trillion cubic feet of natural gas; in 1983 alone production was 340.7 million barrels of oil and condensate, and 3.9 trillion cubic feet of gas; and for the second straight year, no oil was lost in 1983 as a result of blowouts in federal waters. 8 figures, 66 tables.
Weatherization Assistance Program - Background Data and Statistics
Eisenberg, Joel Fred
2010-03-01
This technical memorandum is intended to provide readers with information that may be useful in understanding the purposes, performance, and outcomes of the Department of Energy's (DOE's) Weatherization Assistance Program (Weatherization). Weatherization has been in operation for over thirty years and is the nation's largest single residential energy efficiency program. Its primary purpose, established by law, is 'to increase the energy efficiency of dwellings owned or occupied by low-income persons, reduce their total residential energy expenditures, and improve their health and safety, especially low-income persons who are particularly vulnerable such as the elderly, the handicapped, and children.' The American Recovery and Reinvestment Act, PL 111-5 (ARRA), passed and signed into law in February 2009, committed $5 billion over two years to an expanded Weatherization Assistance Program. This has created substantial interest in the program, the population it serves, the energy and cost savings it produces, and its cost-effectiveness. This memorandum is intended to address the need for this kind of information. Statistically valid answers to many of the questions surrounding Weatherization and its performance require comprehensive evaluation of the program. DOE is undertaking precisely this kind of independent evaluation in order to ascertain program effectiveness and to improve its performance. Results of this evaluation effort will begin to emerge in late 2010 and 2011, but they require substantial time and effort. In the meantime, the data and statistics in this memorandum can provide reasonable and transparent estimates of key program characteristics. The memorandum is laid out in three sections. The first deals with some key characteristics describing low-income energy consumption and expenditures. The second section provides estimates of energy savings and energy bill reductions that the program can reasonably be presumed to be producing. The third section
Lightweight and Statistical Techniques for Petascale Debugging
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis
Kelly named Fellow of the American Statistical Association
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The American Statistical Association (ASA) has honored Elizabeth Kelly of the Lab's Statistical Sciences group with the title of Fellow (August 2, 2016). Kelly is a member of the American Statistical Association and the American Society for Quality.
Statistical simulation of the magnetorotational dynamo
Squire, J.; Bhattacharjee, A.
2014-08-01
We analyze turbulence and dynamo induced by the magnetorotational instability (MRI) using quasi-linear statistical simulation methods. We find that homogeneous turbulence is unstable to a large-scale dynamo instability, which saturates to an inhomogeneous equilibrium with a very strong dependence on the magnetic Prandtl number (Pm). Despite its enormously reduced nonlinearity, the quasi-linear model exhibits the same qualitative scaling of angular momentum transport with Pm as fully nonlinear turbulence. This demonstrates the relationship of recent convergence problems to the large-scale dynamo and suggests possible methods for studying astrophysically relevant regimes at very low or high Pm.
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles
Prohaska, Robert; Duran, Adam; Ragatz, Adam; Kelly, Kenneth
2015-05-03
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy's (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed. Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to
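The kind of duty-cycle statistics described (kinematic summaries of in-use speed traces) can be sketched as below; the trace, thresholds, and metric names are illustrative, not NREL's actual analysis.

```python
import numpy as np

def duty_cycle_stats(speed_mps, dt=1.0):
    """Illustrative duty-cycle statistics from a fixed-rate speed trace (m/s):
    distance, mean speed, peak acceleration, and number of stops."""
    accel = np.diff(speed_mps) / dt
    # trapezoidal distance integral, converted to km
    distance_km = float(np.sum((speed_mps[:-1] + speed_mps[1:]) * dt / 2.0)) / 1000.0
    moving = speed_mps > 0.1                   # 0.1 m/s: arbitrary stop threshold
    stops = int(np.sum(np.diff(moving.astype(int)) == -1))
    return {
        "distance_km": distance_km,
        "mean_speed_kph": float(speed_mps.mean() * 3.6),
        "max_accel_mps2": float(accel.max()),
        "stops": stops,
    }

stats = duty_cycle_stats(np.array([0.0, 5.0, 10.0, 10.0, 5.0, 0.0, 0.0]))
```

Aggregating such per-day summaries across a fleet yields the distributions used to select representative dynamometer test cycles.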
Distributed Design and Analysis of Computer Experiments
Energy Science and Technology Software Center (OSTI)
2002-11-11
DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis of the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation
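One common sampling scheme such a library provides is Latin hypercube sampling; the sketch below (an illustrative implementation, not DDACE's C++ API) generates stratified input points over user-specified variable ranges, after which input/output correlations can be computed with np.corrcoef.

```python
import numpy as np

def latin_hypercube(n_samples, ranges, seed=None):
    """Latin hypercube sample: each variable's range is split into n_samples
    equal strata and each stratum is sampled exactly once, with the strata
    randomly paired across variables. (Illustrative, not DDACE's API.)"""
    rng = np.random.default_rng(seed)
    d = len(ranges)
    strata = np.tile(np.arange(n_samples), (d, 1))
    strata = rng.permuted(strata, axis=1).T            # shape (n_samples, d)
    u = (strata + rng.random((n_samples, d))) / n_samples
    lo = np.array([r[0] for r in ranges])
    hi = np.array([r[1] for r in ranges])
    return lo + u * (hi - lo)

# e.g. 10 runs varying a temperature and a material parameter
X = latin_hypercube(10, [(300.0, 400.0), (0.1, 0.9)], seed=0)
```

Each run's inputs would then be fed to the application code, and the resulting outputs correlated against the columns of X.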
International petroleum statistics report, July 1999
1999-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 44 tabs.
International petroleum statistics report, March 1998
1998-03-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
Statistics and geometry of cosmic voids
Gaite, José
2009-11-01
We introduce new statistical methods for the study of cosmic voids, focusing on the statistics of largest size voids. We distinguish three different types of distributions of voids, namely, Poisson-like, lognormal-like and Pareto-like distributions. The last two distributions are connected with two types of fractal geometry of the matter distribution. Scaling voids with Pareto distribution appear in fractal distributions with box-counting dimension smaller than three (its maximum value), whereas the lognormal void distribution corresponds to multifractals with box-counting dimension equal to three. Moreover, voids of the former type persist in the continuum limit, namely, as the number density of observable objects grows, giving rise to lacunar fractals, whereas voids of the latter type disappear in the continuum limit, giving rise to non-lacunar (multi)fractals. We propose both lacunar and non-lacunar multifractal models of the cosmic web structure of the Universe. A non-lacunar multifractal model is supported by current galaxy surveys as well as cosmological N-body simulations. This model suggests, in particular, that small dark matter halos and, arguably, faint galaxies are present in cosmic voids.
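The practical distinction between the distribution types can be seen in the statistics of the largest void: a sketch with synthetic samples (illustrative parameters, not the paper's data) showing that a Pareto-like size distribution is dominated by its largest element far more than a lognormal one.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Synthetic void sizes: lognormal-like vs Pareto-like (tail index 1.5)
lognormal_sizes = rng.lognormal(mean=0.0, sigma=1.0, size=n)
pareto_sizes = 1.0 + rng.pareto(a=1.5, size=n)

# Largest-void dominance: the maximum relative to the population median
dominance_lognormal = lognormal_sizes.max() / np.median(lognormal_sizes)
dominance_pareto = pareto_sizes.max() / np.median(pareto_sizes)
```

The heavy Pareto tail is the statistical signature of scaling voids that persist in the continuum limit, as described above.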
Lectures on probability and statistics. Revision
Yost, G.P.
1985-06-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.
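The dice example from the notes can be made concrete: with complete a priori knowledge that every face is equally likely, the probability of any specified outcome follows by direct enumeration.

```python
from fractions import Fraction
from itertools import product

def prob_sum(total, n_dice=2, sides=6):
    """A priori probability that n fair dice sum to `total`, by counting
    equally likely outcomes (the forward, probability-theory direction)."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    hits = sum(1 for o in outcomes if sum(o) == total)
    return Fraction(hits, len(outcomes))
```

The inverse problem the notes end with runs the other way: given observed rolls, infer whether the dice are in fact fair.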
International energy indicators. [Statistical tables and graphs
Bauer, E.K.
1980-05-01
International statistical tables and graphs are given for the following: (1) Iran - Crude Oil Capacity, Production and Shut-in, June 1974-April 1980; (2) Saudi Arabia - Crude Oil Capacity, Production, and Shut-in, March 1974-Apr 1980; (3) OPEC (Ex-Iran and Saudi Arabia) - Capacity, Production and Shut-in, June 1974-March 1980; (4) Non-OPEC Free World and US Production of Crude Oil, January 1973-February 1980; (5) Oil Stocks - Free World, US, Japan, and Europe (Landed, 1973-1st Quarter, 1980); (6) Petroleum Consumption by Industrial Countries, January 1973-December 1979; (7) USSR Crude Oil Production and Exports, January 1974-April 1980; and (8) Free World and US Nuclear Generation Capacity, January 1973-March 1980. Similar statistical tables and graphs included for the United States include: (1) Imports of Crude Oil and Products, January 1973-April 1980; (2) Landed Cost of Saudi Oil in Current and 1974 Dollars, April 1974-January 1980; (3) US Trade in Coal, January 1973-March 1980; (4) Summary of US Merchandise Trade, 1976-March 1980; and (5) US Energy/GNP Ratio, 1947 to 1979.
Issues in International Energy Consumption Analysis: Electricity Usage in India's Housing Sector
U.S. Energy Information Administration (EIA) Indexed Site
November 2014. Independent Statistics & Analysis, www.eia.gov, U.S. Department of Energy, Washington, DC.
Correlating sampling and intensity statistics in nanoparticle diffraction experiments
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Öztürk, Hande; Yan, Hanfei; Hill, John P.; Noyan, I. Cevdet
2015-07-28
It is shown in a previous article [Öztürk, Yan, Hill & Noyan (2014). J. Appl. Cryst. 47, 1016–1025] that the sampling statistics of diffracting particle populations within a polycrystalline ensemble depended on the size of the constituent crystallites: broad X-ray peak breadths enabled some nano-sized particles to contribute more than one diffraction spot to Debye–Scherrer rings. Here it is shown that the equations proposed by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] (AKK) to link diffracting particle and diffracted intensity statistics are not applicable if the constituent crystallites of the powder are below 10 nm. In this size range, (i) the one-to-one correspondence between diffracting particles and Laue spots assumed in the AKK analysis is not satisfied, and (ii) the crystallographic correlation between Laue spots originating from the same grain invalidates the assumption that all diffracting plane normals are randomly oriented and uncorrelated. Such correlation produces unexpected results in the selection of diffracting grains. For example, three or more Laue spots from a given grain for a particular reflection can only be observed at certain wavelengths. In addition, correcting the diffracted intensity values by the traditional Lorentz term, 1/cos θ, to compensate for the variation of particles sampled within a reflection band does not maintain fidelity to the number of poles contributing to the diffracted signal. A new term, cos θB/cos θ, corrects this problem.
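The proposed correction is a simple modification of the usual sampling factor; a sketch comparing the two terms (angle values below are illustrative):

```python
import math

def traditional_sampling_factor(theta):
    """Traditional Lorentz-type sampling correction, 1 / cos(theta)."""
    return 1.0 / math.cos(theta)

def corrected_sampling_factor(theta, theta_b):
    """Correction proposed in the abstract, cos(theta_B) / cos(theta),
    where theta_B is the Bragg angle of the reflection (radians)."""
    return math.cos(theta_b) / math.cos(theta)
```

At the exact Bragg angle the corrected factor equals 1, and elsewhere the two terms differ only by the constant cos(theta_B) for a given reflection.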
Freeman, John; /Fermilab
2004-12-01
The authors measure the mass of the top quark using 162 pb⁻¹ of data collected by the CDF experiment at FNAL in Run II. The decay chain t t̄ → b q q̄ b̄ ℓν is studied using a novel technique called the Multivariate Template Method (MTM). Using this technique they obtain a result of M_top = 179.6 +6.4/−6.3 ± 6.8 GeV/c² for the top quark.
Wang, Hongmei [Department of Radiation Oncology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong Province, P.R. of China (China); Liao, Zhongxing, E-mail: zliao@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhuang, Yan; Xu, Ting; Nguyen, Quynh-Nhu; Levy, Lawrence B.; O'Reilly, Michael [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Gold, Kathryn A. [Department of Thoracic Medical Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Gomez, Daniel R. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)
2013-12-01
Purpose: Preclinical studies have suggested that angiotensin-converting enzyme inhibitors (ACEIs) can mitigate radiation-induced lung injury. We sought here to investigate possible associations between ACEI use and the risk of symptomatic radiation pneumonitis (RP) among patients undergoing radiation therapy (RT) for non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively identified patients who received definitive radiation therapy for stages I to III NSCLC between 2004 and 2010 at a single tertiary cancer center. Patients must have received a radiation dose of at least 60 Gy for a single primary lung tumor and have had imaging and dosimetric data available for analysis. RP was quantified according to Common Terminology Criteria for Adverse Events, version 3.0. A Cox proportional hazards model was used to assess potential associations between ACEI use and risk of symptomatic RP. Results: Of 413 patients analyzed, 65 were using ACEIs during RT. In univariate analysis, the rate of RP grade ≥2 seemed lower in ACEI users than in nonusers (34% vs 46%), but this apparent difference was not statistically significant (P=.06). In multivariate analysis of all patients, ACEI use was not associated with the risk of symptomatic RP (hazard ratio [HR] = 0.66; P=.07) after adjustment for sex, smoking status, mean lung dose (MLD), and concurrent carboplatin and paclitaxel chemotherapy. Subgroup analysis showed that ACEI use did have a protective effect against RP grade ≥2 among patients who received a low (≤20-Gy) MLD (P<.01) or were male (P=.04). Conclusions: A trend toward reduction in symptomatic RP among patients taking ACEIs during RT for NSCLC was not statistically significant on univariate or multivariate analyses, although certain subgroups may benefit from use (ie, male patients and those receiving low MLD). The evidence at this point is insufficient to establish whether the use of ACEIs does or does not reduce the risk of RP.
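The univariate 34% vs 46% comparison above can be illustrated with a two-sided Fisher exact test on a 2x2 table. This is only a sketch: the counts are back-calculated assumptions from the reported percentages (22/65 users vs 160/348 nonusers), and the study itself used a Cox proportional hazards model, not this test.

```python
import math

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables as or less
    likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):
        return (math.comb(col1, x) * math.comb(n - col1, row1 - x)
                / math.comb(n, row1))
    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Assumed counts implied by the abstract: 65 ACEI users at ~34%
# grade >=2 RP, 348 nonusers at ~46%.
users_rp, users_no = 22, 43
non_rp, non_no = 160, 188
p = fisher_exact_two_sided(users_rp, users_no, non_rp, non_no)
print(round(p, 3))
```

As in the abstract, the resulting p-value sits near, but above, the conventional 0.05 threshold.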
Statistical Methods and Tools for Hanford Staged Feed Tank Sampling
Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.
2013-10-01
This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).
International petroleum statistics report, October 1997
1997-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline representative of the time a known software application takes to run on a device of known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time representative of the time the known software application takes to run on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
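A minimal sketch of the timing-baseline idea described above, assuming a simple mean/standard-deviation baseline and a fixed sigma threshold; the patent's actual statistical comparison is not specified in the abstract, and all timings below are hypothetical.

```python
import statistics

def build_baseline(clean_times):
    """Summarize execution times measured on a clean, known-pedigree device."""
    return statistics.mean(clean_times), statistics.stdev(clean_times)

def looks_tampered(baseline, observed_time, n_sigma=3.0):
    """Flag a run whose time deviates from the clean baseline by more than
    n_sigma standard deviations -- a crude stand-in for the patent's
    statistical comparison."""
    mean, sd = baseline
    return abs(observed_time - mean) > n_sigma * sd

# Hypothetical timings (seconds) of the same instrumented function.
clean = [1.02, 0.99, 1.01, 1.00, 0.98, 1.03]
baseline = build_baseline(clean)
print(looks_tampered(baseline, 1.01))  # False: within the clean distribution
print(looks_tampered(baseline, 1.40))  # True: suspiciously slow run
```

The threshold choice trades false alarms against missed detections; a real deployment would need per-function baselines and many repetitions.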
International Petroleum Statistics Report, January 1994
Not Available
1994-01-31
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, November 1993
Not Available
1993-11-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
Statistical correlations in the Moshinsky atom
Laguna, H. G.; Sagar, R. P.
2011-07-15
We study the influence of the interparticle and confining potentials on statistical correlation via the correlation coefficient and mutual information in ground and some excited states of the Moshinsky atom in position and momentum space. The magnitude of the correlation between positions and between momenta is equal in the ground state. In excited states, the correlation between the momenta of the particles is greater than between their positions when they interact through an attractive potential whereas for repulsive interparticle potentials the opposite is true. Shannon entropies, and their sums (entropic formulations of the uncertainty principle), are also analyzed, showing that the one-particle entropy sum is dependent on the interparticle potential and thus able to detect the correlation between particles.
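Because the Moshinsky ground state is a correlated Gaussian, the statistical measures discussed above have closed forms in the Gaussian case: for a bivariate normal with correlation coefficient ρ, the mutual information is I = −½ ln(1 − ρ²). The sketch below checks a sampled correlation coefficient against a target ρ and evaluates that relation; it is a generic Gaussian illustration, not the paper's calculation.

```python
import math, random

def gaussian_mutual_information(rho):
    """Mutual information (in nats) between two jointly Gaussian
    variables with correlation coefficient rho."""
    return -0.5 * math.log(1.0 - rho * rho)

def sample_correlation(rho, n=50_000, seed=1):
    """Empirical correlation coefficient of Gaussian pairs drawn
    with target correlation rho."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        u, v = rng.gauss(0, 1), rng.gauss(0, 1)
        xs.append(u)
        ys.append(rho * u + math.sqrt(1 - rho * rho) * v)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / math.sqrt(vx * vy)

rho = 0.6
print(round(gaussian_mutual_information(rho), 4))  # 0.2231
print(sample_correlation(rho))                     # ~0.6
```

Note that I grows without bound as |ρ| → 1 and vanishes at ρ = 0, matching the intuition that mutual information captures the same dependence the correlation coefficient measures, on an entropic scale.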
International Petroleum Statistics Report, July 1994
Not Available
1994-07-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993. Data for the United States are developed by the Energy Information Administration's (EIA) Office of Oil and Gas. Data for other countries are derived largely from published sources, including International Energy Agency publications, the EIA International Energy Annual, and the trade press. (See sources after each section.) All data are reviewed by the International Statistics Branch of EIA. All data have been converted to units of measurement familiar to the American public. Definitions of oil production and consumption are consistent with other EIA publications.
Chapter 11. Community analysis-based methods
Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.
2010-05-01
Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
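Cross-comparisons of community profiles like those described above are often made with a simple dissimilarity measure. The sketch below uses Bray-Curtis dissimilarity on hypothetical relative-abundance profiles to match an environmental sample to its closest source; the profiles and taxa are invented for illustration, and the chapter's actual multivariate methods (TRFLP, PhyloChip analyses) are more involved.

```python
def bray_curtis(p, q):
    """Bray-Curtis dissimilarity between two abundance profiles
    (0 = identical profiles, 1 = no shared taxa)."""
    num = sum(abs(a - b) for a, b in zip(p, q))
    den = sum(a + b for a, b in zip(p, q))
    return num / den

# Hypothetical host-source relative-abundance profiles over five taxa.
cow   = [0.40, 0.30, 0.20, 0.10, 0.00]
human = [0.05, 0.10, 0.15, 0.30, 0.40]
gull  = [0.30, 0.25, 0.25, 0.12, 0.08]

water = [0.38, 0.29, 0.21, 0.09, 0.03]  # unknown environmental sample
dists = {name: bray_curtis(water, prof)
         for name, prof in [("cow", cow), ("human", human), ("gull", gull)]}
print(min(dists, key=dists.get))  # cow
```

In practice, source attribution would rest on many samples per host and an ordination or classification step over the full dissimilarity matrix, not a single nearest-profile match.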
U.S. Department of Commerce Economics and Statistics Administration
Office of Environmental Management (EM)
... Data from the Bureau of Labor Statistics' Occupational Employment Statistics program show ... degrees (e.g., MBA's, law, and medicine) work in STEM jobs because such U.S. ...
Detailed Monthly and Annual LNG Import Statistics (2004-2012...
Broader source: Energy.gov (indexed) [DOE]
Detailed Monthly and Annual LNG Import Statistics (2004-2012) (1.07 MB)
EU Pocketbook - European Vehicle Market Statistics | Open Energy...
European Vehicle Market Statistics. Agency/Company/Organization: International Council on Clean Transportation. Website: eupocketbook.theicct.org
User Statistics | U.S. DOE Office of Science (SC)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
User Statistics. Office of Science, U.S. Department of Energy, 1000 Independence Ave., SW, Washington, DC 20585. P: (202) 586-5430.
2011 Annual Merit Review Results Report - Project and Program Statistics Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Merit review of DOE Vehicle Technologies research activities. 2011_amr_11.pdf (119.47 KB)
2012 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Merit review of DOE Vehicle Technologies research activities. 2012_amr_11.pdf (202.53 KB)
2013 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Merit review of DOE Vehicle Technologies research activities. 2013_amr_12.pdf (202.9 KB)
2014 Annual Merit Review Results Report - Project and Program Statistical Calculations Overview
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Merit review of DOE Vehicle Technologies research activities. 2014_amr_12.pdf (427.96 KB)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.; Chilton, Lawrence
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
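The paper's two predictive statistics are not given in the abstract, so the sketch below substitutes a generic stand-in: rank candidate backgrounds by their mean per-band signal-to-clutter ratio, which (like the paper's statistics) is computable from background spectra alone, before data collection. All names and numbers are illustrative.

```python
import statistics

def detectability_score(chem_signature, background_spectra):
    """Crude per-background detectability proxy (an assumption, not the
    paper's actual statistics): mean chemical signal strength divided by
    the background's per-band spectral clutter (standard deviation)."""
    n_bands = len(chem_signature)
    clutter = []
    for band in range(n_bands):
        vals = [spec[band] for spec in background_spectra]
        clutter.append(statistics.stdev(vals))
    snr_per_band = [abs(s) / c for s, c in zip(chem_signature, clutter)]
    return sum(snr_per_band) / n_bands

def rank_backgrounds(chem_signature, backgrounds):
    """Return background names ordered best-first by the score above."""
    scores = {name: detectability_score(chem_signature, spectra)
              for name, spectra in backgrounds.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy 3-band example: 'grass' pixels vary more than 'asphalt' pixels.
chem = [0.5, 1.0, 0.2]
backgrounds = {
    "asphalt": [[1.0, 1.1, 0.9], [1.05, 1.08, 0.92], [0.98, 1.12, 0.91]],
    "grass":   [[1.0, 2.0, 0.5], [1.8, 0.9, 1.4], [0.4, 1.6, 0.2]],
}
print(rank_backgrounds(chem, backgrounds))  # ['asphalt', 'grass']
```

The low-clutter background ranks first, mirroring the paper's finding that background ranks predicted from pre-collection statistics can agree with empirical detection performance.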
International petroleum statistics report, February 1999
1999-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997.
International petroleum statistics report, April 1999
1999-05-04
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, February 1997
1997-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, January 1999
1999-01-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, August 1995
1995-08-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, August 1998
1998-08-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, June 1998
1998-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, August 1994
Not Available
1994-08-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, March 1995
1995-03-30
The International Petroleum Statistics Report presents data for March 1995 on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, November 1994
Not Available
1994-11-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, April 1998
1998-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, December 1997
1997-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. The balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, November 1998
1998-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, October 1998
1998-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, May 1998
1998-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. It presents data on international production, demand, imports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, September 1995
1995-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994. 4 figs., 45 tabs.
International petroleum statistics report, September 1994
Not Available
1994-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, October 1993
Not Available
1993-10-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1982 through 1992.
International petroleum statistics report, December 1993
Not Available
1993-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992. 41 tabs.
International petroleum statistics report, April 1994
Not Available
1994-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1982 through 1992. 41 tables.
International petroleum statistics report, February 1994
Not Available
1994-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, September 1996
1996-09-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report, September 1998
1998-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, June 1997
1997-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 46 tabs.
International petroleum statistics report, April 1997
1997-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, May 1999
1999-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 48 tabs.
International petroleum statistics report, May 1995
1995-05-30
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1983 through 1993.
International petroleum statistics report, June 1999
1999-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 46 tabs.
International petroleum statistics report, February 1996
1996-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, July 1998
1998-07-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, December 1998
1998-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, March 1999
1999-03-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, March 1994
1994-03-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, February 1998
1998-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
Random paths and current fluctuations in nonequilibrium statistical mechanics
Gaspard, Pierre
2014-07-15
An overview is given of recent advances in nonequilibrium statistical mechanics concerning the statistics of random paths and current fluctuations. Whereas equilibrium statistical mechanics carries out statistics in space, nonequilibrium systems require statistics in time or spacetime. In this approach, relationships have been established between nonequilibrium properties, such as the transport coefficients, the thermodynamic entropy production, or the affinities, and quantities characterizing the microscopic Hamiltonian dynamics and the chaos or fluctuations it may generate. This overview presents results for classical systems in the escape-rate formalism, stochastic processes, and open quantum systems.
Microsoft PowerPoint - Belianinov_2015_StaffScienceHighlight...
channel capacity limit. The use of multivariate statistical methods such as Principal Component Analysis, allows one to guide a data sampling strategy and attain insight into...
Theodore M. Flynn | Argonne National Laboratory
amplicon and whole-genome shotgun sequence data, metagenomics, genome assembly, and multivariate statistical analysis of large microbial community datasets). He also has expertise...
Correlating sampling and intensity statistics in nanoparticle diffraction experiments
Öztürk, Hande; Yan, Hanfei; Hill, John P.; Noyan, I. Cevdet
2015-07-28
It is shown in a previous article [Öztürk, Yan, Hill & Noyan (2014)].
Spatial Statistical Procedures to Validate Input Data in Energy Models
Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.
2006-01-01
Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.
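The abstract's first technique, aggregation and disaggregation of spatial data, can be illustrated with a minimal sketch. This is a generic area-weighted disaggregation, a standard way to push a regional total down to finer zones; the function name and the proportional-to-area assumption are illustrative, not taken from the paper.

```python
# Minimal sketch: area-weighted disaggregation of a regional total onto
# finer zones, one common way to reconcile mismatched spatial scales.
def disaggregate(regional_total, zone_areas):
    """Split a regional quantity across zones in proportion to area."""
    total_area = sum(zone_areas)
    return [regional_total * a / total_area for a in zone_areas]

# A regional energy total of 1000 units split over three zones:
zones = disaggregate(1000.0, [2.0, 3.0, 5.0])
# zones == [200.0, 300.0, 500.0]; aggregation is simply the reverse sum.
```

Real applications would replace the uniform-density assumption with ancillary covariates (population, land use), which is where the paper's spatial statistical machinery comes in.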
Bonne, François; Bonnay, Patrick; Bradu, Benjamin
2014-01-29
In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy yields high stability and fast rejection of disturbances such as those induced by a turbine or compressor stop, a key aspect in large-scale cryogenic refrigeration. The proposed control scheme can be used to control every pressure precisely in normal operation, or to stabilize and control the cryoplant under large variations of thermal load (such as the pulsed heat loads expected in the cryogenic cooling systems of future fusion reactors like the International Thermonuclear Experimental Reactor ITER or the Japan Torus-60 Super Advanced fusion experiment JT-60SA). The paper details how to set up the WCS model to synthesize the Linear Quadratic optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller was implemented on a Schneider PLC and fully tested, first on CERN's real-time simulator and then experimentally on a real CERN cryoplant. The efficiency of the solution is assessed experimentally using a reasonable operating scenario of starting and stopping compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
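The Linear Quadratic optimal feedback gain mentioned in the abstract is conventionally obtained from a Riccati equation. The sketch below shows the standard discrete-time synthesis; the plant matrices A, B and the weights Q, R are illustrative placeholders, not the actual WCS model from the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Placeholder discrete-time plant (NOT the WCS model): two states, one input.
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)            # state weighting
R = np.array([[1.0]])    # input weighting

# Solve the discrete algebraic Riccati equation for P, then form the
# optimal state-feedback gain K = (R + B'PB)^{-1} B'PA, so u = -K x.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# LQR theory guarantees the closed-loop poles lie inside the unit circle.
poles = np.linalg.eigvals(A - B @ K)
```

In the paper's setting, the same gain would multiply the deviation of the measured pressures from their setpoints, replacing the individual PID loops.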
Statistical theory of turbulent incompressible multimaterial flow
Kashiwa, B.
1987-10-01
Interpenetrating motion of incompressible materials is considered. ''Turbulence'' is defined as any deviation from the mean motion. Accordingly a nominally stationary fluid will exhibit turbulent fluctuations due to a single, slowly moving sphere. Mean conservation equations for interpenetrating materials in arbitrary proportions are derived using an ensemble averaging procedure, beginning with the exact equations of motion. The result is a set of conservation equations for the mean mass, momentum and fluctuational kinetic energy of each material. The equation system is at first unclosed due to integral terms involving unknown one-point and two-point probability distribution functions. In the mean momentum equation, the unclosed terms are clearly identified as representing two physical processes. One is transport of momentum by multimaterial Reynolds stresses, and the other is momentum exchange due to pressure fluctuations and viscous stress at material interfaces. Closure is approached by combining careful examination of multipoint statistical correlations with the traditional physical technique of kappa-epsilon modeling for single-material turbulence. This involves representing the multimaterial Reynolds stress for each material as a turbulent viscosity times the rate of strain based on the mean velocity of that material. The multimaterial turbulent viscosity is related to the fluctuational kinetic energy kappa, and the rate of fluctuational energy dissipation epsilon, for each material. Hence a set of kappa and epsilon equations must be solved, together with mean mass and momentum conservation equations, for each material. Both kappa and the turbulent viscosities enter into the momentum exchange force. The theory is applied to (a) calculation of the drag force on a sphere fixed in a uniform flow, (b) calculation of the settling rate in a suspension and (c) calculation of velocity profiles in the pneumatic transport of solid particles in a pipe.
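The closure step described above, relating each material's turbulent viscosity to its fluctuational kinetic energy and dissipation rate, can be sketched with the standard kappa-epsilon dimensional relation; the per-material form and the constant shown are the usual single-material conventions, offered as an illustrative reading of the abstract rather than the paper's exact closure:

\[
\nu_t^{(m)} \;=\; C_\mu \,\frac{\kappa_m^{\,2}}{\epsilon_m},
\qquad C_\mu \approx 0.09 \ \text{(standard single-material value)},
\]

where \(\kappa_m\) and \(\epsilon_m\) are obtained from the transport equations solved for each material \(m\), and \(\nu_t^{(m)}\) multiplies the mean rate of strain of that material to model its multimaterial Reynolds stress.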
A Statistical Perspective on Highly Accelerated Testing.
Thomas, Edward V.
2015-02-01
Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the
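The quantitative "demonstration" claim the abstract questions typically rests on the classical zero-failure (success-run) calculation. The sketch below shows that calculation only to make the abstract's point concrete; the function name is illustrative, and the formula assumes independent, identically stressed units with a known stress-to-use relationship, exactly the lack-of-knowledge the report challenges.

```python
# Success-run formula: with n units tested and zero failures observed,
# the lower confidence bound on per-unit reliability at confidence level C
# is R_L = (1 - C) ** (1 / n), from (R_L)**n = 1 - C.
def zero_failure_reliability_bound(n_units, confidence):
    return (1.0 - confidence) ** (1.0 / n_units)

# Even 22 failure-free units demonstrate only about R >= 0.90 at 90%
# confidence, before accounting for any uncertainty in how the high test
# stress maps back to normal-use conditions.
r = zero_failure_reliability_bound(22, 0.90)
```

The report's statistical critique is that this bound applies at the test stress; translating it to normal use requires an acceleration model whose own uncertainty is usually unquantified.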
A Statistical Framework for Microbial Source Attribution
Velsko, S P; Allen, J E; Cunningham, C T
2009-04-28
This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them possess certain compelling general properties in the context of microbial forensics. These include the ability to quantify the significance of a sequence 'match' or 'mismatch' between two isolates
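The second distribution described above, the probability that two randomly sampled nodes of an outbreak transmission network are separated by M transmission events, can be estimated by simulation. A minimal sketch under an assumed random-attachment outbreak model (the model choice and all names are illustrative, not the report's):

```python
import random
from collections import Counter

def simulate_outbreak(n_cases, rng):
    """Transmission tree: each new case is infected by a uniformly chosen
    earlier case (a deliberately simple stand-in for a real outbreak model)."""
    parent = [None]
    for i in range(1, n_cases):
        parent.append(rng.randrange(i))
    return parent

def transmission_distance(parent, a, b):
    """Number of transmission events (edges) on the path between cases a and b."""
    ancestors = {}
    d, v = 0, a
    while v is not None:          # record a's ancestors with their depths
        ancestors[v] = d
        v, d = parent[v], d + 1
    d = 0
    while b not in ancestors:     # climb from b until the paths meet
        b, d = parent[b], d + 1
    return ancestors[b] + d

def m_distribution(n_cases, n_pairs, seed=1):
    """Empirical distribution of M over randomly sampled node pairs."""
    rng = random.Random(seed)
    parent = simulate_outbreak(n_cases, rng)
    counts = Counter()
    for _ in range(n_pairs):
        a, b = rng.sample(range(n_cases), 2)
        counts[transmission_distance(parent, a, b)] += 1
    return {m: c / n_pairs for m, c in sorted(counts.items())}
```

In practice the report obtains this distribution from contact-tracing data rather than an assumed attachment rule; the sketch only shows where the distribution comes from structurally.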
Nuclear Forensic Inferences Using Iterative Multidimensional Statistics
Robel, M; Kristo, M J; Heller, M A
2009-06-09
Nuclear forensics involves the analysis of interdicted nuclear material for specific material characteristics (referred to as 'signatures') that imply specific geographical locations, production processes, culprit intentions, etc. Predictive signatures rely on expert knowledge of physics, chemistry, and engineering to develop inferences from these material characteristics. Comparative signatures, on the other hand, rely on comparison of the material characteristics of the interdicted sample (the 'questioned sample' in FBI parlance) with those of a set of known samples. In the ideal case, the set of known samples would be a comprehensive nuclear forensics database, a database which does not currently exist. In fact, our ability to analyze interdicted samples and produce an extensive list of precise materials characteristics far exceeds our ability to interpret the results. Therefore, as we seek to develop the extensive databases necessary for nuclear forensics, we must also develop the methods necessary to produce the necessary inferences from comparison of our analytical results with these large, multidimensional sets of data. In the work reported here, we used a large, multidimensional dataset of results from quality control analyses of uranium ore concentrate (UOC, sometimes called 'yellowcake'). We have found that traditional multidimensional techniques, such as principal components analysis (PCA), are especially useful for understanding such datasets and drawing relevant conclusions. In particular, we have developed an iterative partial least squares-discriminant analysis (PLS-DA) procedure that has proven especially adept at identifying the production location of unknown UOC samples. By removing classes which fell far outside the initial decision boundary, and then rebuilding the PLS-DA model, we have consistently produced better and more definitive attributions than with a single pass classification approach. Performance of the iterative PLS-DA method
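PLS-DA itself requires multivariate regression machinery, but the iterative class-pruning idea described above is independent of the particular classifier. A sketch of that loop using a deliberately simpler stand-in (PCA scores plus nearest class centroid; all names are mine, and this is not the authors' implementation):

```python
import numpy as np

def pca_axes(X, k=2):
    """Principal axes of X via SVD of the centered data matrix."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k]

def iterative_classify(X, y, x_new, k=2, keep=2, passes=2):
    """Classify x_new; after each pass, drop the classes whose centroids lie
    far from it and rebuild the model on the remaining classes only."""
    y = np.asarray(y)
    classes = sorted(set(y.tolist()))
    for _ in range(passes):
        mask = np.isin(y, classes)
        mu, axes = pca_axes(X[mask], k)
        scores = (X[mask] - mu) @ axes.T
        z = (x_new - mu) @ axes.T
        centroids = {c: scores[y[mask] == c].mean(axis=0) for c in classes}
        ranked = sorted(classes, key=lambda c: np.linalg.norm(z - centroids[c]))
        classes = ranked[:keep]   # prune distant classes, then refit
    return classes[0]
```

The point of the second pass is the one made in the abstract: once implausible classes are removed, the model is rebuilt on the remaining classes only, so the decision boundary sharpens.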
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
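For the descriptive statistics the report discusses, the mean of interval data is straightforward (it is itself an interval), while the variance is subtler: its maximum over the data box is attained at an endpoint combination, but its minimum can lie in the interior (it is zero whenever all intervals share a common point). A small sketch (brute force for the maximum, so small samples only; function names are mine):

```python
from itertools import product

def _var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def interval_mean(intervals):
    """The set of possible means of interval data is itself an interval."""
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

def interval_variance(intervals):
    """Bounds on the population variance of interval data."""
    n = len(intervals)
    # Max: variance is convex in (x_1..x_n), so the maximum over the box
    # is attained at a vertex (endpoint combination); O(2^n), small n only.
    vmax = max(_var(xs) for xs in product(*intervals))
    # Min: the minimizing configuration clips every x_i toward a common
    # value m; scan each segment between consecutive sorted endpoints.
    pts = sorted({e for iv in intervals for e in iv})
    vmin = float("inf")
    for a, b in zip(pts, pts[1:]):
        fixed = [hi if hi <= a else lo for lo, hi in intervals
                 if not (lo <= a and b <= hi)]
        k = n - len(fixed)
        if k == n:                 # some m lies inside every interval
            return 0.0, vmax
        m = min(max(sum(fixed) / (n - k), a), b)  # clamped quadratic minimum
        vmin = min(vmin, _var([m] * k + fixed))
    return vmin, vmax
```

The report's algorithms achieve far better complexity; this sketch only makes the interval-statistics notion concrete.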
A structural analysis of natural gas consumption by income class from 1987 to 1993
Poyer, D.A.
1996-12-01
This study had two major objectives: (1) assess and compare changes in natural gas consumption between 1987 and 1993 by income group and (2) assess the potential influence of energy policy on observed changes in natural gas consumption over time and across income groups. This analysis used U.S. Department of Energy (DOE) data files and involved both the generation of simple descriptive statistics and the use of multivariate regression analysis. The consumption of natural gas by the groups was studied over a six-year period. The results showed that: (1) natural gas use was substantially higher for the highest income group than for the two lower income groups and (2) natural gas consumption declined for the lowest and middle income quintiles and increased for the highest income quintile between 1987 and 1990; between 1990 and 1993, consumption increased for the lowest and middle income quintiles but remained relatively constant for the highest income quintile. The relative importance of the structural and variable factors in explaining consumption changes between survey periods varies by income group. The analysis provides two major energy policy implications: (1) natural gas intensity has been highest for the lowest income group, indicating that this group is more vulnerable to sudden changes in demand-indicator variables, in particular weather-related variables, that increase natural gas consumption, and (2) the fall in natural gas intensity between 1987 and 1993 may indicate that energy policy has had some impact on reducing natural gas consumption. 11 refs., 4 figs., 16 tabs.
Nonlinearity sensing via photon-statistics excitation spectroscopy
Assmann, Marc; Bayer, Manfred
2011-11-15
We propose photon-statistics excitation spectroscopy as an adequate tool to characterize the optical response of a nonlinear system. To this end, we suggest using optical excitation with varying photon statistics as another spectroscopic degree of freedom for gathering information about the system in question. The responses of several simple model systems to excitation beams with different photon statistics are discussed. Possible spectroscopic applications, such as identifying lasing operation, are pointed out.
Energy-Efficient and Comfortable Buildings through Multivariate Integrated Control (ECoMIC)
Birru, Dagnachew; Wen, Yao-Jung; Rubinstein, Francis M.; Clear, Robert D.
2013-10-28
instantiated. Analysis of the economic benefit yielded a payback time of about six years for a medium-sized building, including the installation of all hardware and software, such as motorized blinds and LED luminaires. The payback time can be significantly reduced in retrofit projects where part of the hardware is already in place. Note that, because the payback analysis was partly based on the testbed performance results, it is constrained by the caveats associated with the testbed implementations. The main uncertainty lies in the contribution from space-conditioning energy savings, as it was non-trivial to realistically configure a room-sized HVAC system for directly extrapolating whole-building HVAC energy savings. It is recommended to further evaluate the developed technology at a larger scale, where lighting and HVAC energy consumption can be realistically measured at the building level, to more rigorously quantify the performance potential.
Testing Statistical Cloud Scheme Ideas in the GFDL Climate Model
Klein, Stephen (Lawrence Livermore National Laboratory); Pincus, Robert (NOAA-CIRES Climate Diagnostics Center)
Robust Algorithm for Computing Statistical Stark Broadening of...
Office of Scientific and Technical Information (OSTI)
Language: English; Subject: 70 PLASMA PHYSICS AND FUSION; ACCURACY; ALGORITHMS; ...
Doppler Lidar Vertical Velocity Statistics Value-Added Product...
... Facility operates coherent Doppler lidar systems at several sites around the globe. ...
UN-Glossary for Transportation Statistics | Open Energy Information
Publications. Website: www.internationaltransportforum.orgPubpdfGloStat3e.pdf; Cost: Free.
Overview of North American Energy Trade Statistics: Methodologies...
U.S. Energy Information Administration (EIA)
... 34 Imports that enter bonded storage are reported ... trade statistics in terms of the Harmonized System; ... service providers and Web search portals, and Internet ...
An overview of component qualification using Bayesian statistics...
Example problems with solutions have been supplied as a learning aid. Bold letters are ... COMPUTING, AND INFORMATION SCIENCE; LEARNING; STATISTICS; MATHEMATICS
Autocorrelation Function Statistics and Implication to Decay Ratio Estimation
March-Leuba, Jose A.
2016-01-01
This document summarizes the results of a series of computer simulations to attempt to identify the statistics of the autocorrelation function, and implications for decay ratio estimation.
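For a stationary noise record, the decay ratio is commonly read off the autocorrelation function as the ratio of its first two local maxima. A minimal sketch of that estimator (the document's simulations would wrap something like this in many noise realizations; names are mine):

```python
import numpy as np

def autocorr(x):
    """Normalized sample autocorrelation function, lags 0..N-1."""
    x = np.asarray(x, float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

def decay_ratio(x):
    """Decay ratio = ratio of the first two successive local maxima of the
    ACF, the usual stability figure of merit for reactor oscillations."""
    acf = autocorr(x)
    peaks = [i for i in range(1, len(acf) - 1)
             if acf[i - 1] < acf[i] >= acf[i + 1]]
    if len(peaks) < 2:
        return None      # no oscillatory structure detected
    return acf[peaks[1]] / acf[peaks[0]]
```

Applied to a damped oscillation, the estimator recovers a ratio below one; the statistical spread of this estimate across noise realizations is exactly what the simulations summarized above characterize.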
Experimental and Statistical Comparison of Engine Response as...
Experimental and Statistical Comparison of Engine Response as a Function of Fuel Chemistry and Properties in CI and HCCI Engines
TITLE V-CONFIDENTIAL INFORMATION PROTECTION AND STATISTICAL EFFI...
Gasoline and Diesel Fuel Update (EIA)
or maintain the systems for handling or storage of data received under this title; and ... data anomalies, produce statistical samples that are consistently adjusted for the ...
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS...
El-Azab, Anter. Subject: 36 MATERIALS SCIENCE; dislocation dynamics; mesoscale deformation of metals; crystal mechanics...
A statistical perspective of validation and UQ (Conference) ...
Higdon, David M. (Los Alamos National Laboratory)
UNECE-Annual Bulletin of Transport Statistics for Europe and...
"Data covers Europe, Canada and the United States. This is a trilingual publication in English, French and Russian. This annual publication presents statistics and brief studies..."
WHO Statistical Information System (WHOSIS) | Open Energy Information
Classification of Diseases (ICD-10), International Classification of Impairments, Disabilities and Handicaps (ICIDH) Links to other sources of health-related statistical...
MISR-Derived Statistics of Cumulus Geometry at TWP Site
... We found that the satellite-derived basic statistics are similar to those from ... New directions in earth observing: Scientific applications of multi-angle remote sensing. ...
BP Statistical Review of World Energy | Open Energy Information
The BP Statistical Review of World Energy is an Excel spreadsheet which contains consumption and production data for Coal, Natural Gas, Nuclear, Oil, and Hydroelectric...
International Monetary Fund-Data and Statistics | Open Energy...
"The IMF publishes a range of time series data on IMF lending, exchange rates and other economic and financial indicators. Manuals, guides, and other material on statistical...
IRF-World Road Statistics | Open Energy Information
Agency/Company Organization: International Road Statistics; Focus Area: Transportation, Economic Development; Resource Type: Dataset; Website: www.irfnet.orgstatistics.php; Cost:...
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
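The core of the event-data version of the algorithm is a dynamic program over possible start points of the last block. A compact sketch using the standard event fitness and the paper's calibration of the change-point prior (assumes distinct event times; parameter names are mine, and this omits the binned-data and measurement modes the paper also covers):

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Optimal piecewise-constant segmentation of event times t
    (dynamic program of Scargle et al. 2013, event-data fitness)."""
    t = np.sort(np.asarray(t, float))
    n = len(t)
    # Candidate block edges: data range endpoints and midpoints between events.
    edges = np.concatenate([t[:1], 0.5 * (t[1:] + t[:-1]), t[-1:]])
    block_length = t[-1] - edges
    # Prior penalty per change point, calibrated to false-alarm rate p0.
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for k in range(n):
        width = block_length[:k + 1] - block_length[k + 1]
        count = np.arange(k + 1, 0, -1)          # events in block i..k
        fit = count * (np.log(count) - np.log(width)) - ncp_prior
        fit[1:] += best[:k]                      # add best partition of prefix
        last[k] = np.argmax(fit)
        best[k] = fit[last[k]]
    # Backtrack the optimal change points.
    cps, ind = [], n
    while ind > 0:
        cps.append(ind)
        ind = last[ind - 1]
    cps.append(0)
    return edges[np.array(cps[::-1])]
```

On a homogeneous event stream the program returns a single block, while a sharp rate change produces an edge near the true change point; the retrospective mode described above is exactly this whole-interval optimization.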
Development of a statistically based access delay timeline methodology.
Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt
2013-02-01
The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
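The shift from single worst-case times to distributions can be illustrated with a small Monte Carlo: treat each task time as lognormal around its median, with a dispersion factor expressing uncertainty, and examine percentiles of the path total. This is only a sketch of the idea, not the report's Bayesian methodology; the task values and names below are invented:

```python
import random
import statistics

def delay_percentiles(tasks, n_draws=20000, seed=42):
    """tasks: (median_seconds, dispersion) pairs; dispersion d > 1 makes the
    task time lognormal with geometric std d (median * d**z, z ~ N(0,1))."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_draws):
        totals.append(sum(m * d ** rng.gauss(0.0, 1.0) for m, d in tasks))
    totals.sort()
    return {
        "mean": statistics.fmean(totals),
        "p10": totals[int(0.10 * n_draws)],
        "p50": totals[n_draws // 2],
        "p90": totals[int(0.90 * n_draws)],
    }
```

Because the task distributions are right-skewed, the mean total exceeds the sum of medians, and the p10/p90 spread gives the analyst a risk-weighted timeline rather than a single discounted worst-case number.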
Statistical study of reconnection exhausts in the solar wind
Enžl, J.; Přech, L.; Šafránková, J.; Němeček, Z.
2014-11-20
Magnetic reconnection is a fundamental process that changes the magnetic field configuration and converts magnetic energy into flow energy and plasma heating. This paper presents a survey of the plasma and magnetic field parameters inside 418 reconnection exhausts identified in the WIND data from 1995-2012. The statistical analysis focuses on the redistribution of the magnetic energy released by reconnection between plasma acceleration and plasma heating. The results show that both the portion of the energy deposited into heat and the energy spent on the acceleration of the exhaust plasma rise with the magnetic shear angle, in accord with the increase of the magnetic flux available for reconnection. The decrease of the normalized exhaust speed with increasing magnetic shear suggests a decreasing efficiency of the acceleration and/or an increasing efficiency of heating in high-shear events. However, we have found that the previously suggested relation between the exhaust speed and temperature enhancement should rather be considered as an upper limit on the plasma heating during reconnection, regardless of the shear angle.
Synchrotron IR microspectroscopy for protein structure analysis: Potential and questions
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Yu, Peiqiang
2006-01-01
Synchrotron radiation-based Fourier transform infrared microspectroscopy (S-FTIR) has been developed as a rapid, direct, non-destructive bioanalytical technique. This technique takes advantage of synchrotron light brightness and small effective source size and is capable of exploring the molecular chemical make-up within microstructures of a biological tissue without destruction of inherent structures, at ultra-spatial resolutions within cellular dimension. To date there has been very little application of this advanced technique to the study of pure protein inherent structure at a cellular level in biological tissues. In this review, a novel approach is introduced to show the potential of the newly developed, advanced synchrotron-based analytical technology, which can be used to localize relatively "pure" protein in plant tissues and reveal protein inherent structure and protein molecular chemical make-up within intact tissue at cellular and subcellular levels. Several complex protein IR spectral data analytical techniques (Gaussian and Lorentzian multi-component peak modeling, univariate and multivariate analysis, principal component analysis (PCA), and hierarchical cluster analysis (CLA)) are employed to reveal features of protein inherent structure and distinguish protein inherent structure differences between varieties/species and treatments in plant tissues. By using a multi-peak modeling procedure, RELATIVE estimates (but not EXACT determinations) of protein secondary structure can be made for comparison purposes. The issues of pro- and anti-multi-peak modeling/fitting procedures for relative estimation of protein structure are discussed. By using the PCA and CLA analyses, plant molecular structures can be qualitatively separated into groups, statistically, even though the spectral assignments are not known. The synchrotron-based technology provides a new approach for protein structure research in
Statistical techniques for characterizing residual waste in single-shell and double-shell tanks
Jensen, L., Fluor Daniel Hanford
1997-02-13
A primary objective of the Hanford Tank Initiative (HTI) project is to develop methods to estimate the inventory of residual waste in single-shell and double-shell tanks. A second objective is to develop methods to determine the boundaries of the waste plume in the vadose zone. This document presents statistical sampling plans that can be used to estimate the inventory of analytes within the residual waste in a tank. Sampling plans for estimating the inventory of analytes within the waste plume in the vadose zone are also presented. Inventory estimates can be used to classify the residual waste with respect to chemical and radiological hazards. Based on these estimates, it will be possible to make decisions regarding the final disposition of the residual waste. Four sampling plans for the residual waste in a tank are presented. The first plan assumes that, based on some physical characteristic, the residual waste can be divided into disjoint strata, with waste samples obtained from randomly selected locations within each stratum. In the second plan, waste samples are obtained from randomly selected locations within the waste as a whole. The third and fourth plans are similar to the first two, except that composite samples are formed from multiple samples. Common to all four plans is that, in the laboratory, replicate analytical measurements are obtained from homogenized waste samples. The statistical sampling plans for the residual waste are similar to those developed for the tank waste characterization program. In that program, the statistical sampling plans required multiple core samples of waste, and replicate analytical measurements from homogenized core segments. A statistical analysis of the analytical data, obtained from use of the statistical sampling plans developed for the characterization program or for the HTI project, provides estimates of mean analyte concentrations and confidence intervals
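Under the first (stratified) plan, the inventory estimate combines per-stratum sample means weighted by each stratum's share of the waste. A minimal sketch of the point estimate and a normal-theory confidence interval (the weights and data below are invented for illustration):

```python
import math
import statistics

def stratified_mean_ci(strata, z=1.96):
    """strata: list of (weight, samples) pairs, weights summing to 1
    (e.g. each stratum's fraction of the residual-waste volume).
    Returns the weighted mean concentration and a normal-theory CI."""
    mean = sum(w * statistics.fmean(xs) for w, xs in strata)
    # Variance of the stratified estimator: weighted per-stratum sampling
    # variances of the stratum means (sample variance / n per stratum).
    var = sum(w ** 2 * statistics.variance(xs) / len(xs) for w, xs in strata)
    half = z * math.sqrt(var)
    return mean, (mean - half, mean + half)
```

Multiplying the mean concentration by the total waste volume then yields the inventory estimate; the composite-sample plans change how the per-stratum samples are formed, not this arithmetic.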
Robustness analysis of an air heating plant and control law by using polynomial chaos
Colón, Diego; Ferreira, Murillo A. S.; Bueno, Átila M.; Balthazar, José M.; Rosa, Suélia S. R. F. de
2014-12-10
This paper presents a robustness analysis of an air heating plant with a multivariable closed-loop control law by using the polynomial chaos methodology (MPC). The plant consists of a PVC tube with a fan at the air input (which forces the air through the tube) and a mass flux sensor at the output. A heating resistance warms the air as it flows inside the tube, and a thermocouple sensor measures the air temperature. The plant thus has two inputs (the fan's rotation intensity and the heat generated by the resistance, both measured in percent of the maximum value) and two outputs (air temperature and air mass flux, also in percent of the maximum value). The mathematical model is obtained by system identification techniques. The mass flux sensor, which is nonlinear, is linearized, and the delays in the transfer functions are properly approximated by non-minimum-phase transfer functions. The resulting model is transformed to a state-space model, which is used for control design purposes. The multivariable robust control design technique used is LQG/LTR, and the controllers are validated in simulation software and on the real plant. Finally, the MPC is applied by considering some of the system's parameters as random variables (one at a time), and the system's stochastic differential equations are solved by expanding the solution (a stochastic process) in an orthogonal basis of polynomial functions of the basic random variables. This method transforms the stochastic equations into a set of deterministic differential equations, which can be solved by traditional numerical methods (this is the MPC). Statistical data for the system (such as expected values and variances) are then calculated. The effects of randomness in the parameters are evaluated on the open-loop and closed-loop pole positions.
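For a single Gaussian parameter, the statistics that a polynomial chaos expansion delivers can be evaluated non-intrusively with Gauss-Hermite quadrature, since the expansion coefficients are themselves Hermite-weighted integrals. A sketch on a toy response function (the function and names are illustrative, not the paper's plant model):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def gauss_stats(g, mu, sigma, order=12):
    """Mean and variance of g(k) for k ~ N(mu, sigma), via Gauss-Hermite
    quadrature in the probabilists' convention (weight exp(-x**2/2))."""
    x, w = hermegauss(order)          # nodes and weights
    w = w / np.sqrt(2.0 * np.pi)      # normalize to the standard normal pdf
    vals = g(mu + sigma * x)          # evaluate the response at the nodes
    mean = np.dot(w, vals)
    var = np.dot(w, (vals - mean) ** 2)
    return mean, var
```

For g(k) = exp(-k), the quadrature reproduces the closed-form lognormal moments essentially to machine precision; in the paper's setting, g would be a solver evaluating the plant response at each quadrature node.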
Infinite statistics condensate as a model of dark matter
Ebadi, Zahra; Mirza, Behrouz; Mohammadzadeh, Hosein E-mail: b.mirza@cc.iut.ac.ir
2013-11-01
In some models, dark matter is considered as a condensate bosonic system. In this paper, we prove that condensation is also possible for particles that obey infinite statistics and derive the critical condensation temperature. We argue that a condensed state of a gas of very weakly interacting particles obeying infinite statistics could be considered as a consistent model of dark matter.
Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields
John A. Krommes
2001-02-16
A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations
Zhang, Yue-Biao; Furukawa, Hiroyasu; Ko, Nakeun; Nie, Weixuan; Park, Hye Jeong; Okajima, Satoshi; Cordova, Kyle E.; Deng, Hexiang; Kim, Jaheon; Yaghi, Omar M.
2015-02-25
Metal–organic framework-177 (MOF-177) is one of the most porous materials whose structure is composed of octahedral Zn_{4}O(-COO)_{6} and triangular 1,3,5-benzenetribenzoate (BTB) units to make a three-dimensional extended network based on the qom topology. This topology violates a long-standing thesis where highly symmetric building units are expected to yield highly symmetric networks. In the case of octahedron and triangle combinations, MOFs based on pyrite (pyr) and rutile (rtl) nets were expected instead of qom. In this study, we have made 24 MOF-177 structures with different functional groups on the triangular BTB linker, having one or more functionalities. We find that the position of the functional groups on the BTB unit allows the selection for a specific net (qom, pyr, and rtl), and that mixing of functionalities (-H, -NH_{2}, and -C_{4}H_{4}) is an important strategy for the incorporation of a specific functionality (-NO_{2}) into MOF-177 where otherwise incorporation of such functionality would be difficult. Such mixing of functionalities to make multivariate MOF-177 structures leads to enhancement of hydrogen uptake by 25%.
Sub-Poissonian statistics in order-to-chaos transition
Kryuchkyan, Gagik Yu. [Yerevan State University, Manookyan 1, Yerevan 375049, (Armenia); Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia); Manvelyan, Suren B. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia)
2003-07-01
We study the phenomena at the overlap of quantum chaos and nonclassical statistics for a time-dependent model of a nonlinear oscillator. It is shown in the framework of the Mandel Q parameter and the Wigner function that the statistics of oscillatory excitation numbers is drastically changed in the order-to-chaos transition. An essential improvement of sub-Poissonian statistics, in comparison with the analogous one for the standard model of a driven anharmonic oscillator, is observed for the regular operational regime. It is shown that in the chaotic regime, the system exhibits ranges of sub-Poissonian and super-Poissonian statistics which alternate with one another depending on the time interval. An unusual dependence of the variance of the oscillatory number on the external noise level is observed for the chaotic dynamics. The scaling invariance of the quantum statistics is demonstrated, and its relation to dissipation and decoherence is studied.
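The Mandel Q parameter referred to above compares the photon-number variance with that of a Poisson distribution: Q = 0 for Poissonian light, Q < 0 for sub-Poissonian, Q > 0 for super-Poissonian. A one-function sketch for estimating it from a record of photocounts:

```python
import statistics

def mandel_q(counts):
    """Mandel Q = (<(dn)^2> - <n>) / <n> from a sample of photocounts:
    0 for Poissonian, negative for sub-Poissonian, positive otherwise."""
    mean = statistics.fmean(counts)
    var = statistics.pvariance(counts)
    return (var - mean) / mean
```

A perfectly number-squeezed record (identical counts) gives Q = -1, the sub-Poissonian extreme, while thermal light gives Q > 0.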
Building Performance Database Analysis Tools | Department of...
Office of Environmental Management (EM)
Building Performance Database Analysis Tools The BPD statistically analyzes the energy performance and physical and operational characteristics of real commercial and residential ...
Statistical mechanics based on fractional classical and quantum mechanics
Korichi, Z.; Meftah, M. T.
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamical properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation, study the Bose-Einstein statistics with the related problem of condensation, and study the Fermi-Dirac statistics.
Statistical anisotropies in gravitational waves in solid inflation
Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi E-mail: emami@ipm.ir E-mail: yw366@cam.ac.uk
2014-09-01
Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can be either positive or negative, and it impacts the statistical anisotropies of the TT and TB spectra in the CMB map more significantly than the tensor self-correlation does. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, which is different from the scalar power spectrum, the cross-correlation, or the scalar bispectrum, where the quadrupole-type statistical anisotropy dominates over the octopole.
Techniques in teaching statistics : linking research production and research use.
Martinez-Moyano, I .; Smith, A.
2012-01-01
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Spectroscopic Test of Bose-Einstein Statistics for Photons
English, D.; Yashchuk, V. V.; Budker, D.
2010-06-25
Using Bose-Einstein-statistics-forbidden two-photon excitation in atomic barium, we have limited the rate of statistics-violating transitions, as a fraction ν of an equivalent statistics-allowed transition rate, to ν < 4.0×10⁻¹¹ at the 90% confidence level. This is an improvement of more than 3 orders of magnitude over the best previous result. Additionally, hyperfine-interaction enabling of the forbidden transition has been observed, to our knowledge, for the first time.
Impact of high-order moments on the statistical modeling of transition arrays
Gilleron, Franck; Pain, Jean-Christophe; Bauche, Jacques; Bauche-Arnoult, Claire
2008-02-15
The impact of high-order moments on the statistical modeling of transition arrays in complex spectra is studied. It is shown that a departure from the Gaussian, which is usually employed in such an approach, may be observed even in the shape of unresolved spectra due to the large value of the kurtosis coefficient. The use of a Gaussian shape may also overestimate the width of the spectra in some cases. Therefore, it is proposed to simulate the statistical shape of the transition arrays by the more flexible generalized Gaussian distribution which introduces an additional parameter--the power of the argument in the exponential--that can be constrained by the kurtosis value. The relevance of the statistical line distribution is checked by comparisons with smoothed spectra obtained from detailed line-by-line calculations. The departure from the Gaussian is also confirmed through the analysis of 2p-3d transitions of recent absorption measurements. A numerical fit is proposed for an easy implementation of the statistical profile in atomic-structure codes.
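The kurtosis constraint described above can be made concrete. As a minimal numerical sketch (not the authors' implementation), the kurtosis of a generalized Gaussian proportional to exp(-|x/alpha|**beta) is Γ(5/β)Γ(1/β)/Γ(3/β)², which decreases monotonically with β, so a measured kurtosis determines the exponent by bisection:

```python
import math

def gg_kurtosis(beta):
    """Kurtosis of the generalized Gaussian proportional to exp(-|x/alpha|**beta)."""
    return math.gamma(5.0 / beta) * math.gamma(1.0 / beta) / math.gamma(3.0 / beta) ** 2

def beta_from_kurtosis(kurt, lo=0.2, hi=20.0):
    """Recover the exponent beta by bisection; gg_kurtosis decreases with beta."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if gg_kurtosis(mid) > kurt:
            lo = mid  # kurtosis still too high, need a larger exponent
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

As a sanity check, a kurtosis of 3 recovers β = 2 (the Gaussian) and a kurtosis of 6 recovers β = 1 (the Laplace distribution).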
UN-Energy Statistics Database | Open Energy Information
PV, Wind. Resource Type: Dataset. Website: data.un.orgExplorer.aspx?dEDATA. Cost: Free. Language: English. References: UN Data, "The United...
The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release
Office of Scientific and Technical Information (OSTI)
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Doppler Lidar Vertical Velocity Statistics Value-Added Product. R.K. Newsom, C. Sivaraman, T.R. Shippert, L.D. Riihimaki. July 2015.
[pic] EERE Web Site Statistics - Publication and Product Library
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
... Other relevant reports and analyses: Separate SPV Statistics for New and Returning ...
Self-assessed performance improves statistical fusion of image labels
Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.
2014-03-15
Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
An overview of component qualification using Bayesian statistics and energy methods (Technical Report)
Office of Scientific and Technical Information (OSTI)
The overview is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation; a basic understanding of some of the mathematical tools to evaluate the quality of an estimation; an
Statistical thermodynamics model and empirical correlations for predicting mixed hydrate phase equilibria (Journal Article)
Office of Scientific and Technical Information (OSTI)
Garapati, Nagasree; Anderson, Brian J.
2014-07-01
OSTI Identifier: 1165558. Report Number(s): A-UNIV-PUB-100. Journal ID: ISSN 0378-3812.
Evaluation of cirrus statistics produced by general circulation models using ARM data
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Hartsock, Daniel (University of Utah); Mace, Gerald (University of Utah); Benson, Sally (University of Utah)
Category: Modeling. Our goal is to evaluate the skill of various general circulation models for producing climatological cloud statistics by comparing them to the cirrus climatology compiled over the Southern Great Plains (SGP) ARM site. This evaluation includes quantifying similar cloud properties and
Statistical and Domain Analytics Applied to PV Module Lifetime and Degradation Science
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Presented at the PV Module Reliability Workshop, February 26-27, 2013, Golden, Colorado. pvmrw13_ps2_casewestern_bruckman.pdf (6.77 MB)
Final Report on Statistical Debugging for Petascale Environments (Technical Report)
Office of Scientific and Technical Information (OSTI)
Liblit, B.
2013-01-18
OSTI Identifier: 1062211. Report Number(s): LLNL-SR-612077. DOE Contract Number: W-7405-ENG-48. Research Org: Lawrence Livermore National Laboratory (LLNL), Livermore, CA. Sponsoring Org: USDOE.
A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties
Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; Spence, H. E.; Lin, R. -L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.
2015-07-23
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters is required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
A.G. Crook Company
1993-04-01
This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.
Mishra, Srikanta; Schuetter, Jared
2014-11-01
We compare two approaches for building a statistical proxy model (metamodel) for CO₂ geologic sequestration from the results of full-physics compositional simulations. The first approach involves a classical Box-Behnken or Augmented Pairs experimental design with a quadratic polynomial response surface. The second approach uses a space-filling maximin Latin Hypercube sampling or maximum entropy design with five different metamodeling techniques: quadratic polynomial, kriging with constant and quadratic trend terms, multivariate adaptive regression splines (MARS), and additivity and variance stabilization (AVAS). Simulation results for CO₂ injection into a reservoir-caprock system with 9 design variables (and 97 samples) were used to generate the data for developing the proxy models. The fitted models were validated using an independent data set and a cross-validation approach for three different performance metrics: total storage efficiency, CO₂ plume radius, and average reservoir pressure. The Box-Behnken quadratic polynomial metamodel performed best, followed closely by the maximin LHS kriging metamodel.
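The quadratic polynomial response surface used in the first approach amounts to an ordinary least-squares fit over intercept, linear, and all second-order terms. A generic sketch with synthetic data (the 9-variable reservoir design and simulator outputs are not reproduced here):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Basis columns: intercept, linear terms x_i, and all products x_i*x_j (i <= j)."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def fit_quadratic_surface(X, y):
    """Ordinary least-squares fit of a full quadratic polynomial response surface."""
    A = quadratic_design_matrix(X)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_quadratic(coef, X):
    """Evaluate the fitted surface at new design points."""
    return quadratic_design_matrix(X) @ coef
```

When the underlying response really is quadratic, the fit reproduces it exactly; in a metamodeling study the fitted surface would instead be validated against held-out simulator runs, as the abstract describes.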
Non-gaussian mode coupling and the statistical cosmological principle
LoVerde, Marilena; Nelson, Elliot; Shandera, Sarah E-mail: eln121@psu.edu
2013-06-01
Local-type primordial non-Gaussianity couples statistics of the curvature perturbation ζ on vastly different physical scales. Because of this coupling, statistics (i.e., the polyspectra) of ζ in our Hubble volume may not be representative of those in the larger universe; that is, they may be biased. The bias depends on the local background value of ζ, which includes contributions from all modes with wavelength k
Quality control and statistical process control for nuclear analytical measurements
Seymour, R.; Sergent, F.; Clark, W.H.C.; Gleason, G.
1993-12-31
The same driving forces that are making businesses examine quality control of manufacturing processes are making laboratories reevaluate their quality control programs. Increased regulation (accountability), global competitiveness (profitability), and potential for litigation (defensibility) are the principal driving forces behind the development and implementation of QA/QC programs in the nuclear analytical laboratory. Both manufacturing and scientific quality control can use identical statistical methods, albeit with some differences in the treatment of the measured data. Today, the approaches to QC programs are quite different for most analytical laboratories as compared with manufacturing sciences. This is unfortunate because the statistical process control methods are directly applicable to measurement processes. It is shown that statistical process control methods can provide many benefits for laboratory QC data treatment.
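Statistical process control for measurement data is typically implemented with Shewhart-style control charts: limits derived from historical QC measurements flag later results that fall outside the expected spread. A minimal illustrative sketch, not any particular laboratory's procedure:

```python
import statistics

def shewhart_limits(history, k=3.0):
    """Center line and +/- k*sigma control limits from historical QC measurements."""
    center = statistics.fmean(history)
    sigma = statistics.stdev(history)  # sample standard deviation
    return center - k * sigma, center, center + k * sigma

def out_of_control(value, limits):
    """Flag a new measurement that falls outside the control limits."""
    lcl, _, ucl = limits
    return value < lcl or value > ucl
```

In practice a laboratory would supplement the simple limit test with run rules (trends, repeated one-sided excursions) before declaring a measurement process out of control.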
Statistics of particle transport in a two-dimensional dusty plasma cluster
Ratynskaia, S.; Knapek, C.; Rypdal, K.; Khrapak, S.; Morfill, G.
2005-02-01
Statistical analysis is performed on long time series of dust particle trajectories in a two-dimensional dusty plasma cluster. Particle transport is found to be superdiffusive on all time scales until the range of particle displacements approaches the size of the cluster. Analysis of probability distribution functions and rescaled range analysis of the position increments show that the signal is non-Gaussian self-similar with Hurst exponent H=0.6, indicating that the superdiffusion is caused by long-range dependencies in the system. Investigation of temporal and spatial characteristics of persistent particle slips demonstrates that they are associated with collective events present on all time scales and responsible for the non-Gaussianity and long-memory effects.
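Rescaled range analysis of the kind cited above estimates the Hurst exponent H from the slope of log(R/S) against log(window size). The window sizes and averaging scheme below are illustrative choices, not the authors' exact procedure:

```python
import numpy as np

def rescaled_range(x):
    """R/S of one window: range of cumulative mean-deviations over the std."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())
    s = x.std()
    return (z.max() - z.min()) / s if s > 0 else 0.0

def hurst_rs(x, window_sizes):
    """Hurst exponent as the slope of log <R/S> versus log n
    over non-overlapping windows of each size n."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        vals = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        vals = [v for v in vals if v > 0]
        if vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope
```

Uncorrelated noise yields H near 0.5 (with a known upward bias at small windows), while a persistent, long-memory signal such as the particle slips in the abstract yields H above 0.5.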
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-04
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-25
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2010-07-13
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Office of Oil, Gas, and Coal Supply Statistics (www.eia.gov). Natural Gas Monthly, August 2016. U.S. Department of Energy, Washington, DC 20585. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States
Statistical comparison of ICRF and NBI heating performance in JET-ILW L-mode plasmas
Lerche, E.; Van Eester, D.; Jacquet, Ph.; Mayoral, M.-L.; Graham, M.; Matthews, G.; Monakhov, I.; Rimini, F.; Colas, L.; Czarnecka, A.; Vries, P. de; Collaboration: JET-EFDA Contributors
2014-02-12
After the changeover from the C-wall to the ITER-like Be/W wall (ILW) in JET, the radiation losses during ICRF heating have increased and are now substantially larger than those observed with NBI at the same power levels, in spite of the similar global plasma energies reached with the two heating systems. A comparison of the NBI and ICRF performance in the JET-ILW experiments, based on a statistical analysis of ~3000 L-mode discharges, will be presented.
Financial statistics of major publicly owned electric utilities, 1991
Not Available
1993-03-31
The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.
Statistical scaling of geometric characteristics in stochastically generated pore microstructures
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee
2015-05-21
In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter-scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
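The order-q sample structure functions described above, and the power-law fit that yields the scaling exponent, can be sketched for a one-dimensional series; the lag set and fitting range are illustrative:

```python
import numpy as np

def structure_function(phi, lag, q):
    """Order-q sample structure function: mean |phi(x+lag) - phi(x)|**q."""
    inc = np.abs(phi[lag:] - phi[:-lag])
    return np.mean(inc ** q)

def scaling_exponent(phi, lags, q):
    """Power-law exponent xi(q) from a log-log fit of S_q against the lag."""
    s = [structure_function(phi, lag, q) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return slope
```

For a self-similar signal with Hurst coefficient H, the fitted exponent for q = 2 should come out near 2H; extended self-similarity can then be checked by plotting log S_q against log S_2 for successive orders.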
Starkov, V. N.; Semenov, A. A.; Gomonay, H. V.
2009-07-15
We demonstrate a practical possibility of loss compensation in measured photocounting statistics in the presence of dark counts and background radiation noise. It is shown that satisfactory results are obtained even in the case of low detection efficiency and large experimental errors.
User Statistics Collection Practices | U.S. DOE Office of Science...
Office of Science (SC) Website
Office of Science User Facilities User Statistics Collection Practices, FY 2015 .pdf file (296KB) Prior year statements of user statistics collection practices In November, 2014, ...
Ensemble Data Analysis ENvironment (EDEN)
Steed, Chad Allen
2012-08-01
The EDEN toolkit facilitates exploratory data analysis and visualization of global climate model simulation datasets. EDEN provides an interactive graphical user interface (GUI) that helps the user visually construct dynamic queries of the characteristically large climate datasets using temporal ranges, variable selections, and geographic areas of interest. EDEN reads the selected data into a multivariate visualization panel which features an extended implementation of parallel coordinates plots as well as interactive scatterplots. The user can query data in the visualization panel using mouse gestures to analyze different ranges of data. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots.
Mexico City air quality research initiative: An overview and some statistical aspects
Waller, R.A.; Streit, G.E.; Guzman, F.
1991-01-01
The Mexican Petroleum Institute (Instituto Mexicano del Petroleo, IMP) and Los Alamos National Laboratory (LANL) are in the first year of a three-year jointly funded project to examine the air quality in Mexico City and to provide techniques to evaluate the impact of proposed mitigation options. The technical tasks include modeling and simulation; monitoring and characterization; and strategic evaluation. Extensive measurements of the atmosphere, climate, and meteorology are being made as part of the study. This presentation provides an overview of the total project plan, reports on the current status of the technical tasks, describes the data collection methods, presents examples of the data analysis and graphics, and suggests roles for statistical analysis in this and similar environmental studies. 8 figs., 4 tabs.
Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
Duran, A.; Walkowicz, K.
2013-10-01
In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Newsom, R. K.; Sivaraman, C.; Shippert, T. R.; Riihimaki, L. D.
2015-07-01
Accurate height-resolved measurements of higher-order statistical moments of vertical velocity fluctuations are crucial for improved understanding of turbulent mixing and diffusion, convective initiation, and cloud life cycles. The Atmospheric Radiation Measurement (ARM) Climate Research Facility operates coherent Doppler lidar systems at several sites around the globe. These instruments provide measurements of clear-air vertical velocity profiles in the lower troposphere with a nominal temporal resolution of 1 sec and height resolution of 30 m. The purpose of the Doppler lidar vertical velocity statistics (DLWSTATS) value-added product (VAP) is to produce height- and time-resolved estimates of vertical velocity variance, skewness, and kurtosis from these raw measurements. The VAP also produces estimates of cloud properties, including cloud-base height (CBH), cloud frequency, cloud-base vertical velocity, and cloud-base updraft fraction.
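The higher-order moment estimates such a VAP produces can be sketched in a few lines. This is an illustrative computation only, not the DLWSTATS code; the sample size and velocity values are assumptions.

```python
import numpy as np

def velocity_moments(w):
    """Variance, skewness, and kurtosis of a vertical-velocity series,
    computed from central moments of the fluctuations about the mean."""
    w = np.asarray(w, dtype=float)
    dw = w - w.mean()                 # velocity fluctuations w'
    var = np.mean(dw**2)              # second central moment
    skew = np.mean(dw**3) / var**1.5  # asymmetry of updrafts vs. downdrafts
    kurt = np.mean(dw**4) / var**2    # 3.0 for Gaussian fluctuations
    return var, skew, kurt

# Synthetic stand-in for one averaging interval of 1-s lidar samples.
rng = np.random.default_rng(0)
var, skew, kurt = velocity_moments(rng.normal(0.0, 0.5, 100_000))
```

For Gaussian noise the skewness should be near 0 and the kurtosis near 3; convective boundary layers typically show positive skewness in w.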
Financial statistics of selected investor-owned electric utilities, 1989
Not Available
1991-01-01
The Financial Statistics of Selected Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.
Spatial statistics for predicting flow through a rock fracture
Coakley, K.J.
1989-03-01
Fluid flow through a single rock fracture depends on the shape of the space between the upper and lower pieces of rock which define the fracture. In this thesis, the normalized flow through a fracture, i.e. the equivalent permeability of a fracture, is predicted in terms of spatial statistics computed from the arrangement of voids, i.e. open spaces, and contact areas within the fracture. Patterns of voids and contact areas, with complexity typical of experimental data, are simulated by clipping a correlated Gaussian process defined on a N by N pixel square region. The voids have constant aperture; the distance between the upper and lower surfaces which define the fracture is either zero or a constant. Local flow is assumed to be proportional to local aperture cubed times local pressure gradient. The flow through a pattern of voids and contact areas is solved using a finite-difference method. After solving for the flow through simulated 10 by 10 by 30 pixel patterns of voids and contact areas, a model to predict equivalent permeability is developed. The first model is for patterns with 80% voids where all voids have the same aperture. The equivalent permeability of a pattern is predicted in terms of spatial statistics computed from the arrangement of voids and contact areas within the pattern. Four spatial statistics are examined. The change point statistic measures how often adjacent pixels alternate from void to contact area (or vice versa) in the rows of the patterns which are parallel to the overall flow direction. 37 refs., 66 figs., 41 tabs.
U.S. Department of Commerce Economics and Statistics Administration
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Commerce Economics and Statistics Administration. [Chart: women hold 48% of all jobs but only 24% of STEM jobs; men hold 52% and 76%, respectively.] By David Beede, Tiffany Julian, David Langdon, George McKittrick, Beethika Khan, and Mark Doms, Office of the Chief Economist. Women in STEM: A Gender Gap to Innovation, August 2011. Executive Summary, ESA Issue Brief #04-11. Our science, technology, engineering and math (STEM) workforce is crucial to America's innovative capacity and global competitiveness. Yet women are vastly
Sensitivity and Uncertainty Analysis Shell
Energy Science and Technology Software Center (OSTI)
1999-04-20
SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some process for which input is uncertain, and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which inputs to the process model are to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyzes the sample. The SUNS post-processor displays statistical results from any existing file that contains sampled input and output values.
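The loose-coupling workflow described above can be sketched as follows. Here `process_model` is a hypothetical stand-in for the user-supplied code, since SUNS itself is model-agnostic; the input distributions and sample size are assumptions.

```python
import numpy as np

def process_model(k, h):
    """Hypothetical user-supplied process model: any function mapping
    uncertain inputs to an output can be coupled in this way."""
    return k * h**2  # illustrative response

rng = np.random.default_rng(42)
n = 10_000

# Step 1: generate the statistical sample for the uncertain inputs.
k_sample = rng.normal(loc=2.0, scale=0.1, size=n)
h_sample = rng.uniform(low=0.9, high=1.1, size=n)

# Step 2: run the process model over the sample.
output = process_model(k_sample, h_sample)

# Step 3: post-process statistics of the output.
mean, std = output.mean(), output.std()
lo, hi = np.percentile(output, [2.5, 97.5])
```

The shell's role is confined to steps 1 and 3; the model in step 2 is treated as a black box, which is what makes the coupling "loose."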
Electron transfer statistics and thermal fluctuations in molecular junctions
Goswami, Himangshu Prabal; Harbola, Upendra
2015-02-28
We derive analytical expressions for the probability distribution function (PDF) for electron transport in a simple model of a quantum junction in the presence of thermal fluctuations. Our approach is based on large deviation theory combined with the generating function method. For a large number of electrons transferred, the PDF is found to decay exponentially in the tails, with different rates due to the applied bias. This asymmetry in the PDF is related to the fluctuation theorem. Statistics of fluctuations are analyzed in terms of the Fano factor. Thermal fluctuations play a quantitative role in determining the statistics of electron transfer; they tend to suppress the average current while enhancing the fluctuations in particle transfer. This gives rise to both bunching and antibunching phenomena as determined by the Fano factor. The thermal fluctuations and shot noise compete with each other and determine the net (effective) statistics of particle transfer. An exact analytical expression is obtained for the delay time distribution. The optimal values of the delay time between successive electron transfers can be lowered below the corresponding shot noise values by tuning the thermal effects.
Spectral statistics in noninteracting many-particle systems
Munoz, L.; Relano, A.; Retamosa, J. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Faleiro, E. [Departamento de Fisica Aplicada, E.U.I.T. Industrial, Universidad Politecnica de Madrid, E-28012 Madrid (Spain); Molina, R.A. [Max-Planck-Institut fuer Physik Komplexer Systeme, Noethnitzer Strasse 38, D-01187 Dresden (Germany)
2006-03-15
It is widely accepted that the statistical properties of energy level spectra provide an essential characterization of quantum chaos. Indeed, the spectral fluctuations of many different systems like quantum billiards, atoms, or atomic nuclei have been studied. However, noninteracting many-body systems have received little attention, since it is assumed that they must exhibit Poisson-like fluctuations. Apart from a heuristic argument of Bloch, there are neither systematic numerical calculations nor a rigorous derivation of this fact. Here we present a rigorous study of the spectral fluctuations of noninteracting identical particles moving freely in a mean field, emphasizing the evolution with the number of particles N as well as with the energy. Our results are conclusive. For N ≥ 2 the spectra of these systems exhibit Poisson fluctuations provided that we consider sufficiently high excitation energies. Nevertheless, when the mean field is chaotic there exists a critical energy scale L_c; beyond this scale, the fluctuations deviate from the Poisson statistics as a reminiscence of the statistical properties of the mean field.
Li, Guangjun; Wu, Kui; Peng, Guang; Zhang, Yingjie; Bai, Sen
2014-01-01
Volumetric-modulated arc therapy (VMAT) is now widely used clinically, as it is capable of delivering a highly conformal dose distribution in a short time interval. We retrospectively analyzed patient-specific quality assurance (QA) of VMAT and examined the relationships between the planning parameters and the QA results. A total of 118 clinical VMAT cases underwent pretreatment QA. All plans had 3-dimensional diode array measurements, and 69 also had ion chamber measurements. Dose distribution and isocenter point dose were evaluated by comparing the measurements and the treatment planning system (TPS) calculations. In addition, the relationship between QA results and several planning parameters, such as dose level, control points (CPs), monitor units (MUs), average field width, and average leaf travel, were also analyzed. For delivered dose distribution, a gamma analysis passing rate greater than 90% was obtained for all plans and greater than 95% for 100 of 118 plans with the 3%/3-mm criteria. The difference (mean ± standard deviation) between the point doses measured by the ion chamber and those calculated by TPS was 0.9% ± 2.0% for all plans. For all cancer sites, nasopharyngeal carcinoma and gastric cancer have the lowest and highest average passing rates, respectively. From multivariate linear regression analysis, the dose level (p = 0.001) and the average leaf travel (p < 0.001) showed negative correlations with the passing rate, and the average field width (p = 0.003) showed a positive correlation with the passing rate, all indicating a correlation between the passing rate and the plan complexity. No statistically significant correlation was found between MU or CP and the passing rate. Analysis of the results of dosimetric pretreatment measurements as a function of VMAT plan parameters can provide important information to guide the plan parameter setting and optimization in TPS.
Quality quandaries: A gentle introduction to Bayesian statistics
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Weaver, Brian Phillip; Hamada, Michael Scott
2016-07-11
Here, we provide a gentle introduction to this powerful analysis method that can handle complex data and modeling situations.
Edwards, Lloyd A.; Paresol, Bernard
2014-09-01
This report of the geostatistical analysis results for the fire-fuels response variables, custom reaction intensity and total dead fuels, is but one part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report “SRS 2010 Vegetation Inventory GeoStatistical Mapping Report” (Edwards & Parresol 2013).
Farjam, R; Pramanik, P; Srinivasan, A; Chapman, C; Tsien, C; Lawrence, T; Cao, Y
2014-06-15
Purpose: Vascular injury could be a cause of hippocampal dysfunction leading to late neurocognitive decline in patients receiving brain radiotherapy (RT). Hence, our aim was to develop a multivariate interaction model for characterization of hippocampal vascular dose-response and early prediction of radiation-induced late neurocognitive impairments. Methods: 27 patients (17 males and 10 females, age 31–80 years) were enrolled in an IRB-approved prospective longitudinal study. All patients were diagnosed with a low-grade glioma or benign tumor and treated by 3-D conformal or intensity-modulated RT with a median dose of 54 Gy (50.4–59.4 Gy in 1.8-Gy fractions). Six DCE-MRI scans were performed from pre-RT to 18 months post-RT. DCE data were fitted to the modified Tofts model to obtain the transfer constant of gadolinium influx from the intravascular space into the extravascular extracellular space, Ktrans, and the fraction of blood plasma volume, Vp. The hippocampus vascular property alterations after starting RT were characterized by changes in the hippocampal mean values μh(Ktrans)τ and μh(Vp)τ. The dose-response, Δμh(Ktrans/Vp)pre->τ, was modeled using a multivariate linear regression considering interactions of dose with age, sex, hippocampal laterality, and presence of tumor/edema near a hippocampus. Finally, the early vascular dose-response in hippocampus was correlated with neurocognitive decline 6 and 18 months post-RT. Results: The μh(Ktrans) increased significantly from pre-RT to 1 month post-RT (p<0.0004). The multivariate model showed that the dose effect on Δμh(Ktrans)pre->1M post-RT interacted with sex (p<0.0007) and age (p<0.00004), with the dose-response more pronounced in older females. Also, the vascular dose-response in the left hippocampus of females was significantly correlated with memory function decline at 6 (r = −0.95, p<0.0006) and 18 (r = −0.88, p<0.02) months post-RT. Conclusion: The hippocampal vascular
Davis, A; Suhr, N H; Spackman, W; Painter, P C; Walker, P L; Given, P H
1981-04-01
The basic objectives of this program are, first, to understand the systematic relationships between the properties of coals, and, second, to determine the nature of the lateral and vertical variability in the properties of a single seam. Multivariate statistical analyses applied to the Coal Data Base confirm a number of known trends for coal properties. In addition, nitrogen and some components of the ash analysis bear interesting relationships to rank. The macroscopic petrography of column samples of the Lower Kittanning seam reveals a significant difference between the sample from a marine-influenced environment and those from toward the margins of the basin where conditions were non-marine. The various methods of determining the amount and mineralogy of the inorganic fraction of coals are reviewed. General trends in seam thickness, ash, sulfur, volatile matter yield, and vitrinite reflectance of the Lower Kittanning seam of western Pennsylvania are presented. Controls of sedimentation are discussed in relation to the areal variability which has been observed. Differential subsidence and paleotopography appear to have played a major role during the deposition of the coal. The same controls may have maintained some influence upon the coalification process after deposition, especially along the eastern margin of the Lower Kittanning basin.
EIA - Advice from Meetings of the ASA Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Advice from Meetings of the ASA Committee on Energy Statistics Transcripts and Summaries from the American Statistical Association Committee on Energy Statistics The U.S. Energy Information Administration seeks technical advice semi-annually from the American Statistical Association Committee on Energy Statistics. The meetings are held in the spring and fall in Washington, D.C., and are announced in the Federal Register. These meetings are open to the public and are typically held on Thursday
User Statistics Collection Practices Archives | U.S. DOE Office of Science
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
Johannesson, G
2010-03-17
configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we will explore how to leverage an ensemble of GCM projections to quantify the impact of the GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
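The empirical-relationship step can be sketched minimally as below, in Python rather than R. The predictor values, regression form, and ensemble members are all invented for illustration; a real SDS study would use observed station data and archived GCM output.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" local temperature tied linearly to a synthetic
# GCM-scale predictor over a historical period, plus observation noise.
gcm_hist = rng.normal(15.0, 2.0, 300)            # historical GCM predictor
local_obs = 0.8 * gcm_hist + 3.0 + rng.normal(0, 0.5, 300)

# Fit the empirical (here linear) transfer function on the historical period.
slope, intercept = np.polyfit(gcm_hist, local_obs, 1)

# Apply it to an ensemble of future GCM projections; the spread across
# ensemble members carries the GCM uncertainty into the local estimate.
future_ensemble = np.array([17.1, 17.9, 16.5, 18.4])  # assumed values
local_future = slope * future_ensemble + intercept
```

The key point for this effort is the last two lines: running the fitted relationship over every ensemble member, rather than a single projection, is what allows the GCM uncertainty to be quantified downstream.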
The Fall Meeting of the Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
The Fall Meeting of the Committee on Energy Statistics commenced at 8:30 a.m. on Friday, November 5, 1999, at the Department of Energy, 1000 Independence Avenue, S.W., Room 8E089, Washington, D.C., Daniel Relles presiding. PRESENT: DANIEL RELLES, Chairman; JAY BREIDT; LYNDA CARLSON; THOMAS COWING; CAROL GOTWAY CRAWFORD; JAY HAKES; JAMES HAMMITT; PHILIP HANSER; CALVIN KENT; W. DAVID MONTGOMERY; LARRY PETTIS; SEYMOUR SUDMAN; BILL WEINIG; ROY WHITMORE
Advances on statistical/thermodynamical models for unpolarized structure functions
Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro
2013-03-25
During the eighties and nineties many statistical/thermodynamical models were proposed to describe the nucleon structure functions and the distribution of the quarks in hadrons. Most of these models describe the constituent quarks and gluons inside the nucleon as a Fermi gas and a Bose gas, respectively, confined in an MIT bag with continuous energy levels. Other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, like the sea asymmetries d̄/ū and d̄ − ū.
Quantum Statistical Testing of a Quantum Random Number Generator
Humble, Travis S
2014-01-01
The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based systems engineering, we present methods for verifying the operation of a prototypical quantum random number generator (QRNG). We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.
Statistical anisotropy of the curvature perturbation from vector field perturbations
Dimopoulos, Konstantinos; Karciauskas, Mindaugas; Lyth, David H.; Rodriguez, Yeinzon E-mail: m.karciauskas@lancaster.ac.uk E-mail: yeinzon.rodriguez@uan.edu.co
2009-05-15
The δN formula for the primordial curvature perturbation ζ is extended to include vector as well as scalar fields. Formulas for the tree-level contributions to the spectrum and bispectrum of ζ are given, exhibiting statistical anisotropy. The one-loop contribution to the spectrum of ζ is also worked out. We then consider the generation of vector field perturbations from the vacuum, including the longitudinal component that will be present if there is no gauge invariance. Finally, the δN formula is applied to the vector curvaton and vector inflation models, with the tensor perturbation also evaluated in the latter case.
Financial statistics of major US publicly owned electric utilities 1993
Not Available
1995-02-01
The 1993 edition of the Financial Statistics of Major U.S. Publicly Owned Electric Utilities publication presents five years (1989 to 1993) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. The primary source of publicly owned financial data is the Form EIA-412, the Annual Report of Public Electric Utilities, filed on a fiscal basis.
Anomalous diffusion and Tsallis statistics in an optical lattice
Lutz, Eric
2003-05-01
We point out a connection between anomalous transport in an optical lattice and Tsallis' generalized statistics. Specifically, we show that the momentum equation for the semiclassical Wigner function which describes atomic motion in the optical potential, belongs to a class of transport equations recently studied by Borland [Phys. Lett. A 245, 67 (1998)]. The important property of these ordinary linear Fokker-Planck equations is that their stationary solutions are exactly given by Tsallis distributions. An analytical expression of the Tsallis index q in terms of the microscopic parameters of the quantum-optical problem is given and the spatial coherence of the atomic wave packets is discussed.
Multiple-comparison computer program using the bonferroni t statistic
Johnson, E. E.
1980-11-13
To ascertain the agreement among laboratories, samples from a single batch of material are analyzed by the different laboratories and results are then compared. A graphical format was designed for presenting the results and for showing which laboratories have significantly different results. The appropriate statistic for simultaneously testing the significance of the differences between several means is Bonferroni t. A computer program was written to make the tests between means based on Bonferroni t and also to make multiple comparisons of the standard deviations associated with the means. The program plots the results and indicates means and standard deviations which are significantly different.
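The pairwise comparisons the program performs can be sketched as follows. The laboratory names and data are invented, and `scipy.stats.ttest_ind` stands in for whatever t routine the original program used; the essential point is dividing the significance level by the number of simultaneous comparisons.

```python
import numpy as np
from itertools import combinations
from scipy import stats

def bonferroni_pairwise(groups, alpha=0.05):
    """All pairwise two-sample t-tests between laboratory means, with the
    Bonferroni adjustment: each test is run at level alpha/m, where m is
    the number of comparisons, keeping the familywise error rate <= alpha."""
    pairs = list(combinations(groups, 2))
    m = len(pairs)  # number of simultaneous comparisons
    results = {}
    for a, b in pairs:
        t, p = stats.ttest_ind(groups[a], groups[b])
        results[(a, b)] = (p, p < alpha / m)  # significant after adjustment?
    return results

rng = np.random.default_rng(7)
labs = {
    "lab1": rng.normal(10.0, 0.2, 20),
    "lab2": rng.normal(10.0, 0.2, 20),
    "lab3": rng.normal(11.0, 0.2, 20),  # deliberately biased laboratory
}
flags = bonferroni_pairwise(labs)
```

With three laboratories there are m = 3 comparisons, so each pairwise test is judged at 0.05/3 ≈ 0.0167; the biased lab3 should be flagged against both others.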
Statistical thermodynamics of strain hardening in polycrystalline solids
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Langer, James S.
2015-01-01
This paper starts with a systematic rederivation of the statistical thermodynamic equations of motion for dislocation-mediated plasticity proposed in 2010 by Langer, Bouchbinder, and Lookman. The paper then uses that theory to explain the anomalous rate-hardening behavior reported in 1988 by Follansbee and Kocks and to explore the relation between hardening rate and grain size reported in 1995 by Meyers et al. A central theme is the need for physics-based, nonequilibrium analyses in developing predictive theories of the strength of polycrystalline materials.
Statistics at work in heavy-ion reactions
Moretto, L.G.
1982-07-01
In the first part special aspects of the compound nucleus decay are considered. The evaporation of particles intermediate between nucleons and fission fragments is explored both theoretically and experimentally. The limitations of the fission decay width expression obtained with the transition state method are discussed, and a more general approach is proposed. In the second part the process of angular momentum transfer in deep inelastic reactions is considered. The limit of statistical equilibrium is studied and specifically applied to the estimation of the degree of alignment of the fragment spins. The magnitude and alignment of the transferred angular momentum is experimentally determined from sequentially emitted alpha, gamma, and fission fragments.
Statistical Inference for Big Data Problems in Molecular Biophysics
Ramanathan, Arvind; Savol, Andrej; Burger, Virginia; Quinn, Shannon; Agarwal, Pratul K; Chennubhotla, Chakra
2012-01-01
We highlight the role of statistical inference techniques in providing biological insights from analyzing long time-scale molecular simulation data. Technological and algorithmic improvements in computation have brought molecular simulations to the forefront of techniques applied to investigating the basis of living systems. While these longer simulations, increasingly complex and presently reaching petabyte scales, promise a detailed view into microscopic behavior, teasing out the important information has now become a true challenge on its own. Mining this data for important patterns is critical to automating therapeutic intervention discovery, improving protein design, and fundamentally understanding the mechanistic basis of cellular homeostasis.
Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials
J. J. Einerson
2005-05-01
Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
About the statistical description of gas-liquid flows
Sanz, D.; Guido-Lavalle, G.; Carrica, P.
1995-09-01
Elements of probabilistic geometry are used to derive the bubble coalescence term of the statistical description of gas-liquid flows. It is shown that Boltzmann's hypothesis, which leads to the kinetic theory of dilute gases, is not appropriate for this kind of flow. The resulting integro-differential transport equation is numerically integrated to study the flow development in slender bubble columns. The solution remarkably predicts the transition from bubbly to slug flow pattern. Moreover, a bimodal bubble size distribution is predicted, which has already been observed experimentally.
Poincaré recurrence statistics as an indicator of chaos synchronization
Boev, Yaroslav I.; Vadivasova, Tatiana E.; Anishchenko, Vadim S.
2014-06-15
The dynamics of the autonomous and non-autonomous Rössler system is studied using the Poincaré recurrence time statistics. It is shown that the probability distribution density of Poincaré recurrences represents a set of equidistant peaks with the distance that is equal to the oscillation period, and the envelope obeys an exponential distribution. The dimension of the spatially uniform Rössler attractor is estimated using Poincaré recurrence times. The mean Poincaré recurrence time in the non-autonomous Rössler system is locked by the external frequency, and this enables us to detect the effect of phase-frequency synchronization.
Statistical methods of spin assignment in compound nuclear reactions
Mach, H.; Johns, M.W.
1985-01-15
Spin assignment to nuclear levels can be obtained from standard in-beam gamma-ray spectroscopy techniques and in the case of compound nuclear reactions can be complemented by statistical methods. These are based on a correlation pattern between level spin and gamma-ray intensities feeding low-lying levels. Three types of intensity and level spin correlations are found suitable for spin assignment: shapes of the excitation functions, ratio of intensity at two beam energies or populated in two different reactions, and feeding distributions. Various empirical attempts are examined and the range of applicability of these methods as well as the limitations associated with them are given.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
3 U.S. Department of Energy Washington, DC 20585 2013 U.S. Energy Information Administration | Natural Gas Monthly ii This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
4 U.S. Department of Energy Washington, DC 20585 2014 U.S. Energy Information Administration | Natural Gas Monthly ii This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles: Preprint
Prohaska, R.; Duran, A.; Ragatz, A.; Kelly, K.
2015-05-01
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy’s (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed. Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to
Hybrid Statistical Testing for Nuclear Material Accounting Data and/or Process Monitoring Data
Ticknor, Lawrence O.; Hamada, Michael Scott; Sprinkle, James K.; Burr, Thomas Lee
2015-04-14
The two tests employed in the hybrid testing scheme are Page’s cumulative sums for all streams within a Balance Period (maximum of the maximums and average of the maximums) and Crosier’s multivariate cumulative sum applied to incremental cumulative sums across Balance Periods. The role of residuals for both kinds of data is discussed.
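Page's cumulative sum for a single stream can be sketched as below. The reference value k and threshold h are typical textbook choices in standard-deviation units, not the values used in the scheme above, and the shifted data are synthetic.

```python
import numpy as np

def page_cusum(x, k=0.5, h=5.0):
    """One-sided Page cumulative-sum test on standardized residuals x.
    k is the reference (slack) value and h the decision threshold; returns
    the CUSUM path and the first index exceeding h (None if no alarm)."""
    s, path, alarm = 0.0, [], None
    for i, xi in enumerate(x):
        s = max(0.0, s + xi - k)   # accumulate only positive drift
        path.append(s)
        if alarm is None and s > h:
            alarm = i
    return np.array(path), alarm

rng = np.random.default_rng(3)
# In-control residuals followed by a persistent 1-sigma shift at t = 50.
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.0, 1, 50)])
path, alarm = page_cusum(x)
```

The resetting `max(0, ...)` step is what distinguishes Page's statistic from a plain running sum: small in-control fluctuations are forgotten, while a sustained shift accumulates until the threshold is crossed. Crosier's multivariate CUSUM generalizes the same shrink-toward-zero idea to a vector of streams.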
View discovery in OLAP databases through statistical combinatorial optimization
Hengartner, Nick W; Burke, John; Critchlow, Terence; Joslyn, Cliff; Hogan, Emilie
2009-01-01
OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information-theoretic measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
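As one concrete (assumed) instance of such an information-theoretic view measure, the mutual information between the two dimensions of a candidate view can score how much structure that view exposes; the paper's actual measures may differ.

```python
# Illustrative sketch, not the paper's exact measure: score a candidate
# two-dimensional view by the mutual information between its dimensions,
# computed from the joint counts of (a, b) cell memberships.

from collections import Counter
from math import log2

def mutual_information(pairs):
    """Mutual information (bits) of two discrete dimensions given (a, b) pairs."""
    n = len(pairs)
    joint = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), c in joint.items():
        pxy = c / n
        mi += pxy * log2(pxy / ((pa[a] / n) * (pb[b] / n)))
    return mi

# A perfectly coupled pair of binary dimensions carries 1 bit; an
# independent pair carries ~0 bits.
coupled = [(0, 0), (1, 1)] * 50
independent = [(a, b) for a in (0, 1) for b in (0, 1)] * 25
```

A view whose dimensions are strongly coupled scores high while a structureless one scores near zero, which is the kind of signal a hop-chaining search over the view lattice can follow.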
Statistical mentoring at early training and career stages
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Anderson-Cook, Christine M.; Hamada, Michael S.; Moore, Leslie M.; Wendelberger, Joanne R.
2016-06-27
At Los Alamos National Laboratory (LANL), statistical scientists develop solutions for a variety of national security challenges through scientific excellence, typically as members of interdisciplinary teams. At LANL, mentoring is actively encouraged and practiced to develop statistical skills and positive career-building behaviors. Mentoring activities targeted at different career phases from student to junior staff are an important catalyst for both short and long term career development. This article discusses mentoring strategies for undergraduate and graduate students through internships as well as for postdoctoral research associates and junior staff. Topics addressed include project selection, progress, and outcome; intellectual and social activities that complement the student internship experience; key skills/knowledge not typically obtained in academic training; and the impact of such internships on students' careers. Experiences and strategies from a number of successful mentorships are presented. Feedback from former mentees obtained via a questionnaire is incorporated. These responses address some of the benefits the respondents received from mentoring, helpful contributions and advice from their mentors, key skills learned, and how mentoring impacted their later careers.
Statistics of anisotropies in inflation with spectator vector fields
Thorsrud, Mikjel; Mota, David F.; Urban, Federico R. E-mail: furban@ulb.ac.be
2014-04-01
We study the statistics of the primordial power spectrum in models where massless gauge vectors are coupled to the inflaton, paying special attention to observational implications of having fundamental or effective horizons embedded in a bath of infrared fluctuations. As quantum infrared modes cross the horizon, they classicalize and build a background vector field. We find that the vector experiences a statistical precession phenomenon. Implications for primordial correlators and the interpretation thereof are considered. Firstly, we show how in general two, not only one, additional observables, a quadrupole amplitude and an intrinsic shape parameter, are necessary to fully describe the correction to the curvature power spectrum, and develop a unique parametrization for them. Secondly, we show that the observed anisotropic amplitude and the associated preferred direction depend on the volume of the patch being probed. We calculate non-zero priors for the expected deviations between detections based on microwave background data (which probes the entire Hubble patch) and large scale structure (which only probes a fraction of it)
Statistical optimisation techniques in fatigue signal editing problem
Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.
2015-02-03
Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses joint application of the Running Damage Extraction (RDE) technique and single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
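The constraint side of this formulation can be sketched as follows; this is a toy feasibility check over retained signal properties (RMS and kurtosis here, with cumulative damage handled analogously in the paper), not the authors' GA implementation, and the deviation tolerance is an assumed value.

```python
# Toy sketch of the constrained selection step: an edited signal assembled
# from chosen segments is feasible only if its RMS and kurtosis stay within
# a deviation tolerance of the original signal's values. The 10% tolerance
# is an assumption for illustration.

from math import sqrt

def rms(x):
    return sqrt(sum(v * v for v in x) / len(x))

def kurtosis(x):
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 4 for v in x) / (n * var ** 2)

def feasible(original, segments, chosen, tol=0.1):
    """Check RMS/kurtosis retention of the signal built from chosen segments."""
    edited = [v for i in chosen for v in segments[i]]
    ok_rms = abs(rms(edited) - rms(original)) <= tol * rms(original)
    ok_kur = abs(kurtosis(edited) - kurtosis(original)) <= tol * kurtosis(original)
    return ok_rms and ok_kur
```

The GA's fitness would then reward shorter selections among the feasible ones, which is where the constrained optimisation framing does its work.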
Image segmentation by hierarchical agglomeration of polygons using ecological statistics
Prasad, Lakshman; Swaminarayan, Sriram
2013-04-23
A method for rapid hierarchical image segmentation based on perceptually driven contour completion and scene statistics is disclosed. The method begins with an initial fine-scale segmentation of an image, such as obtained by perceptual completion of partial contours into polygonal regions using region-contour correspondences established by Delaunay triangulation of edge pixels as implemented in VISTA. The resulting polygons are analyzed with respect to their size and color/intensity distributions and the structural properties of their boundaries. Statistical estimates of granularity of size, similarity of color, texture, and saliency of intervening boundaries are computed and formulated into logical (Boolean) predicates. The combined satisfiability of these Boolean predicates by a pair of adjacent polygons at a given segmentation level qualifies them for merging into a larger polygon representing a coarser, larger-scale feature of the pixel image and collectively obtains the next level of polygonal segments in a hierarchy of fine-to-coarse segmentations. The iterative application of this process precipitates textured regions as polygons with highly convolved boundaries and helps distinguish them from objects which typically have more regular boundaries. The method yields a multiscale decomposition of an image into constituent features that enjoy a hierarchical relationship with features at finer and coarser scales. This provides a traversable graph structure from which feature content and context in terms of other features can be derived, aiding in automated image understanding tasks. The method disclosed is highly efficient and can be used to decompose and analyze large images.
Statistical analysis of fatigue strain-life data for carbon and...
Office of Scientific and Technical Information (OSTI)
The existing fatigue strain vs life (S-N) data, foreign and domestic, for carbon and low-alloy steels used in the construction of nuclear power plant components have been compiled ...
Statistical analysis of radiochemical measurements of TRU radionuclides in REDC waste
Beauchamp, J.; Downing, D.; Chapman, J.; Fedorov, V.; Nguyen, L.; Parks, C.; Schultz, F.; Yong, L.
1996-10-01
This report summarizes results of the study on the isotopic ratios of transuranium elements in waste from the Radiochemical Engineering Development Center actinide-processing streams. The knowledge of the isotopic ratios, when combined with results of nondestructive assays, in particular with results of Active-Passive Neutron Examination Assay and Gamma Active Segmented Passive Assay, may lead to a significant increase in precision of the determination of TRU elements contained in ORNL generated waste streams.
Statistical Analysis of the Factors Influencing Consumer Use of E85
Bromiley, P.; Gerlach, T.; Marczak, K.; Taylor, M.; Dobrovolny, L.
2008-07-01
Evaluating the sales patterns of E85 retail outlets can provide important information about consumer behavior regarding E85, locating future E85 fueling infrastructure, and developing future alternative fuel policies and programs.
Davis, Adam Christopher
2015-08-25
The Worker Safety and Security Team (WSST) at Los Alamos National Laboratory holds an annual festival, WSST-fest, to engage workers and inform them about safety- and security-related matters. As part of the 2015 WSST-fest, workers were given the opportunity to participate in a survey assessing their engagement in their organizations and work environments. A total of 789 workers participated in the 23-question survey where they were also invited, optionally, to identify themselves, their organization, and to give open-ended feedback. The survey consisted of 23 positive statements (i.e. “My organization is a good place to work.”) with which the respondent could express a level of agreement. The text of these statements is provided in Table 1. The level of agreement corresponds to a 5-level Likert scale ranging from “Strongly Disagree” to “Strongly Agree.” In addition to assessing the overall positivity or negativity of the scores, the results were partitioned into several cohorts based on the response meta-data (self-identification, comments, etc.) to explore trends. Survey respondents were presented with the options to identify themselves, their organizations and to provide comments. These options suggested the following questions about the data set.
Askari, Sima [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of); Halladj, Rouein, E-mail: halladj@aut.ac.ir [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of); Nazari, Mahdi [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of)
2013-05-15
Highlights: • Sonochemical synthesis of SAPO-34 nanocrystals. • Using Taguchi experimental design (L9) for optimizing the experimental procedure. • The significant effects of all the ultrasonic parameters on the response. - Abstract: SAPO-34 nanocrystals with high crystallinity were synthesized by means of a sonochemical method. An L9 orthogonal array of the Taguchi method was implemented to investigate the effects of sonication conditions on the preparation of SAPO-34 with respect to crystallinity of the final product phase. The experimental data establish that phase crystallinity is improved by increasing the ultrasonic power and the sonication temperature. In the case of ultrasonic irradiation time, however, an initial increase in crystallinity from 5 min to 15 min is followed by a decrease in crystallinity for longer sonication times.
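The L9(3^4) orthogonal array behind such a design can be written out directly; the assignment of sonication factors to columns below is an assumption for illustration, not taken from the paper.

```python
# The standard L9(3^4) Taguchi orthogonal array: nine runs covering up to
# four three-level factors. Here the first three columns could carry the
# sonication factors (power, temperature, irradiation time) -- an assumed
# assignment, not the paper's.

from itertools import combinations

L9 = [
    (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
    (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
    (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
]

def is_orthogonal(array):
    """Every pair of columns contains each of the 9 level pairs exactly once."""
    cols = list(zip(*array))
    return all(len(set(zip(a, b))) == 9 for a, b in combinations(cols, 2))
```

Nine balanced runs stand in for the 81 of a full 3^4 factorial, which is what makes the Taguchi screening of sonication conditions economical.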
www.eia.gov U.S. Energy Information Administration Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
Effects of Low Oil Prices. Presented by Lynn Westfall, U.S. Energy Information Administration, at the 2015 EIA Energy Conference, June 15, 2015, Washington, DC. [Slide chart: historical and projected crude oil prices per barrel (real 2015 dollars), 1970-2015, comparing the imported refiner acquisition cost of crude oil with the Brent crude oil price; prices shown are quarterly averages, dashed lines are EIA projections.]
EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.
WIELOPOLSKI, L.
2006-04-01
The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and (4) to compare the costs of the new methods with traditional chemical ones.
Analysis of sampling plan options for tank 16H from the perspective of statistical uncertainty
Shine, E. P.
2013-02-28
This report develops a concentration variability model for Tank 16H in order to compare candidate sampling plans for assessing the concentrations of analytes in the residual material in the annulus and on the floor of the primary vessel. A concentration variability model is used to compare candidate sampling plans based on the expected upper 95% confidence limit (UCL95) for the mean. The result is expressed as a rank order of candidate sampling plans from lowest to highest expected UCL95, with the lowest being the most desirable from an uncertainty perspective.
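The UCL95 ranking criterion can be sketched with the usual t-based upper confidence limit for a mean; the concentration values and candidate plan sizes below are made-up numbers, not Tank 16H data.

```python
# Sketch of the UCL95 comparison: for a plan of n samples, the upper 95%
# confidence limit on the mean is xbar + t(0.95, n-1) * s / sqrt(n).
# Larger plans (or lower variability) give a lower expected UCL95.
# All sample values below are illustrative, not Tank 16H measurements.

from statistics import mean, stdev
from math import sqrt

# One-sided 95th-percentile t quantiles for a few small degrees of freedom.
T95 = {3: 2.353, 5: 2.015, 7: 1.895, 9: 1.833}

def ucl95(samples):
    n = len(samples)
    return mean(samples) + T95[n - 1] * stdev(samples) / sqrt(n)

small_plan = ucl95([10, 12, 11, 13])                       # 4 samples
large_plan = ucl95([10, 12, 11, 13, 10, 12, 11, 13, 10, 12])  # 10 samples
```

Ranking candidate plans by this quantity, from lowest to highest expected UCL95, reproduces the comparison logic described in the report.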
Broader source: Energy.gov [DOE]
The Voluntary Protection Program (VPP) was originally developed by the Occupational Safety and Health Administration (OSHA) in 1982 to foster greater ownership of safety and health in the workplace. The Department of Energy (DOE) adopted VPP in 1992; currently 23 sites across the DOE complex participate in the program. As its name implies, participation is voluntary; it is not required by laws or regulations.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
published in the NGM, is developed by the National Weather Service Climate Analysis Center, Camp Springs, Maryland. The data are available weekly with monthly summaries...
Statistical Review of Data from DWPF's Process Samples for Batches 19 Through 30
Edwards, T.B.
1999-04-06
The measurements derived from samples taken during the processing of batches 19 through 30 at the Defense Waste Processing Facility (DWPF) afford an opportunity for review and comparisons. This report examines some of the statistics from these data. Only the data reported by the DWPF lab (that is, the data provided by the lab as representative of the samples taken) are available for this analysis. In some cases, the sample results reported may be a subset of the sample results generated by the analytical procedures. A thorough assessment of the DWPF lab's analytical procedures would require the complete set of data. Thus, the statistics reported here, specifically as they relate to analytical uncertainties, are limited to the reported data for these samples. A feel for the consistency of the incoming slurry is provided by the estimation of the components of variation for the Sludge Receipt and Adjustment Tank (SRAT) receipts. In general, for all of the vessels, the data from batches after 21 show smaller batch-to-batch variation than the data from all the batches. The relative contributions of batch-to-batch versus residual variation, which includes analytical variation, are presented in these analyses.
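The batch-to-batch versus residual split mentioned above is the classical one-way variance-components decomposition, sketched here on made-up measurements rather than DWPF data.

```python
# Method-of-moments variance components for a balanced one-way layout:
# total variation splits into a batch-to-batch component and a residual
# (within-batch, including analytical) component. Measurements are
# illustrative, not DWPF sample data.

def variance_components(batches):
    """batches: list of equal-length lists of measurements per batch.
    Returns (batch_to_batch_variance, residual_variance)."""
    k, n = len(batches), len(batches[0])
    grand = sum(sum(b) for b in batches) / (k * n)
    means = [sum(b) / n for b in batches]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)       # between
    msw = sum((x - m) ** 2
              for b, m in zip(batches, means) for x in b) / (k * (n - 1))  # within
    batch_var = max(0.0, (msb - msw) / n)   # truncate negative estimates at 0
    return batch_var, msw
```

Comparing the two returned components is exactly the "relative contributions of batch-to-batch versus residual" question posed in the report.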
Statistical modeling support for calibration of a multiphysics model of subcooled boiling flows
Bui, A. V.; Dinh, N. T.; Nourgaliev, R. R.; Williams, B. J.
2013-07-01
Nuclear reactor system analyses rely on multiple complex models which describe the physics of reactor neutronics, thermal hydraulics, structural mechanics, coolant physico-chemistry, etc. Such coupled multiphysics models require extensive calibration and validation before they can be used in practical system safety study and/or design/technology optimization. This paper presents an application of statistical modeling and Bayesian inference in calibrating an example multiphysics model of subcooled boiling flows which is widely used in reactor thermal hydraulic analysis. The presence of complex coupling of physics in such a model together with the large number of model inputs, parameters and multidimensional outputs poses significant challenge to the model calibration method. However, the method proposed in this work is shown to be able to overcome these difficulties while allowing data (observation) uncertainty and model inadequacy to be taken into consideration. (authors)
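A deliberately tiny stand-in for such a calibration (one parameter, one output, flat prior) shows the Bayesian inference machinery at work; the model, noise level, and Metropolis tuning constants below are all assumptions, far simpler than the subcooled-boiling model in the paper.

```python
# Toy Bayesian calibration: fit theta in the model y = theta * x to
# observations with assumed Gaussian noise, using random-walk Metropolis.
# Flat prior; sigma, step size, and iteration counts are illustrative.

import random
from math import exp

def log_likelihood(theta, xs, ys, sigma=0.1):
    return -sum((y - theta * x) ** 2 for x, y in zip(xs, ys)) / (2 * sigma ** 2)

def metropolis(xs, ys, start=1.0, step=0.05, iters=3000, seed=0):
    rng = random.Random(seed)
    theta, ll = start, log_likelihood(start, xs, ys)
    chain = []
    for _ in range(iters):
        prop = theta + rng.gauss(0.0, step)
        llp = log_likelihood(prop, xs, ys)
        if llp > ll or rng.random() < exp(llp - ll):   # accept/reject
            theta, ll = prop, llp
        chain.append(theta)
    return chain

xs, ys = [1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]    # data consistent with theta = 2
chain = metropolis(xs, ys)
posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The real calibration problem replaces the scalar theta with a high-dimensional parameter vector and adds model-inadequacy terms, but the accept/reject core is the same.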
Statistical tools for prognostics and health management of complex systems
Collins, David H; Huzurbazar, Aparna V; Anderson - Cook, Christine M
2010-01-01
Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
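One building block of such predictions, the conditional survival of an aged component under an assumed Weibull life model, can be sketched directly; the parameter values used below are illustrative only, not from any fielded system.

```python
# Sketch of a PHM prognostic primitive: given an assumed Weibull component
# life model, compute P(T > t | T > age) -- the probability an already-aged
# unit remains operational at a future time t.

from math import exp

def weibull_survival(t, shape, scale):
    return exp(-((t / scale) ** shape))

def conditional_survival(t, age, shape, scale):
    """P(T > t | T > age): the remaining-life quantity for a unit of given age."""
    return weibull_survival(t, shape, scale) / weibull_survival(age, shape, scale)
```

With shape = 1 the model is memoryless (age is irrelevant); with shape > 1 the hazard increases with age, so an older unit is less likely than a new one to survive the same additional interval, which is the behavior condition-monitoring data would be used to refine.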
Statistical Stability and Time-Reversal Imaging in Random Media
Berryman, J; Borcea, L; Papanicolaou, G; Tsogka, C
2002-02-05
Localization of targets imbedded in a heterogeneous background medium is a common problem in seismic, ultrasonic, and electromagnetic imaging problems. The best imaging techniques make direct use of the eigenfunctions and eigenvalues of the array response matrix, as recent work on time-reversal acoustics has shown. Of the various imaging functionals studied, one that is representative of a preferred class is a time-domain generalization of MUSIC (MUltiple Signal Classification), which is a well-known linear subspace method normally applied only in the frequency domain. Since statistical stability is not characteristic of the frequency domain, a transform back to the time domain after first diagonalizing the array data in the frequency domain takes optimum advantage of both the time-domain stability and the frequency-domain orthogonality of the relevant eigenfunctions.
Statistical measures of Planck scale signal correlations in interferometers
Hogan, Craig J.; Kwon, Ohkyung
2015-06-22
A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. Simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Role of quantum statistics in multi-particle decay dynamics
Marchewka, Avi; Granot, Er’el
2015-04-15
The role of quantum statistics in the decay dynamics of a multi-particle state, which is suddenly released from a confining potential, is investigated. For an initially confined double particle state, the exact dynamics is presented for both bosons and fermions. The time-evolution of the probability to measure two particles is evaluated and some counterintuitive features are discussed. For instance, it is shown that although there is a higher chance of finding the two bosons (as opposed to fermions, and even distinguishable particles) at the initial trap region, there is a higher chance (higher than fermions) of finding them on two opposite sides of the trap, as if the repulsion between bosons is higher than the repulsion between fermions. The results are demonstrated by numerical simulations and are calculated analytically in the short-time approximation. Furthermore, experimental validation is suggested.
Statistical properties of Charney-Hasegawa-Mima zonal flows
Anderson, Johan; Botha, G. J. J.
2015-05-15
A theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent plasma transport events in unforced zonal flows is provided within the Charney-Hasegawa-Mima (CHM) model. The governing equation is solved numerically with various prescribed density gradients that are designed to produce different configurations of parallel and anti-parallel streams. Long-lasting vortices form whose flow is governed by the zonal streams. It is found that the numerically generated PDFs can be matched with analytical predictions of PDFs based on the instanton method by removing the autocorrelations from the time series. In many instances, the statistics generated by the CHM dynamics relax to Gaussian distributions for both the electrostatic and vorticity perturbations, whereas in areas with strong nonlinear interactions it is found that the PDFs are exponentially distributed.
Predicting weak lensing statistics from halo mass reconstructions - Final Paper
Everett, Spencer
2015-08-20
As dark matter does not absorb or emit light, its distribution in the universe must be inferred through indirect effects such as the gravitational lensing of distant galaxies. While most sources are only weakly lensed, the systematic alignment of background galaxies around a foreground lens can constrain the mass of the lens which is largely in the form of dark matter. In this paper, I have implemented a framework to reconstruct all of the mass along lines of sight using a best-case dark matter halo model in which the halo mass is known. This framework is then used to make predictions of the weak lensing of 3,240 generated source galaxies through a 324 arcmin² field of the Millennium Simulation. The lensed source ellipticities are characterized by the ellipticity-ellipticity and galaxy-mass correlation functions and compared to the same statistic for the intrinsic and ray-traced ellipticities. In the ellipticity-ellipticity correlation function, I find that the framework systematically underpredicts the shear power by an average factor of 2.2 and fails to capture correlation from dark matter structure at scales larger than 1 arcminute. The model-predicted galaxy-mass correlation function is in agreement with the ray-traced statistic from scales 0.2 to 0.7 arcminutes, but systematically underpredicts shear power at scales larger than 0.7 arcminutes by an average factor of 1.2. Optimization of the framework code has reduced the mean CPU time per lensing prediction by 70% to 24 ± 5 ms. Physical and computational shortcomings of the framework are discussed, as well as potential improvements for upcoming work.
Hegerfeldt, G.C.; Henneberg, R. (Institut fuer Theoretische Physik, University of Goettingen, D-37073 Goettingen (Germany))
1994-05-01
The statistical analysis of energy levels, a powerful tool in the study of quantum systems, is applicable to discrete spectra. Here we propose an approach to carry level statistics over to continuous energy spectra, paradoxical as this may sound at first. The approach proceeds in three steps, first a discretization of the spectrum by cutoffs, then a statistical analysis of the resulting discrete spectra, and finally a determination of the limit distributions as the cutoffs are removed. In this way the notions of Wigner and Poisson distributions for nearest-neighbor spacing (NNS), usually associated with quantum chaos and regularity, can be carried over to systems with a purely continuous energy spectrum. The approach is demonstrated for the hydrogen atom in perpendicular electric and magnetic fields. This system has a purely continuous energy spectrum from −∞ to +∞. Depending on the field parameters, we find for the NNS a Poisson or a Wigner distribution, or a transitional behavior. We also outline how to determine physically relevant resonances in our approach by a stabilization method.
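The middle step of the three-step procedure can be illustrated on a synthetic discrete spectrum (standing in for a cutoff-discretized continuum): normalize the spacings to unit mean, then compare their distribution against the Poisson and Wigner forms.

```python
# Schematic NNS analysis on synthetic levels (not the hydrogen-atom
# spectrum): spacings are normalized to unit mean and compared against the
# Poisson (e^-s) and Wigner ((pi/2) s e^{-pi s^2 / 4}) reference densities.

from math import pi, exp

def normalized_spacings(levels):
    levels = sorted(levels)
    gaps = [b - a for a, b in zip(levels, levels[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return [g / mean_gap for g in gaps]

def poisson_pdf(s):
    return exp(-s)

def wigner_pdf(s):
    return (pi / 2) * s * exp(-pi * s * s / 4)

# A picket-fence (equally spaced) spectrum has every normalized spacing
# equal to 1; Poisson statistics would instead pile spacings near zero,
# while the Wigner form vanishes at zero (level repulsion).
spacings = normalized_spacings([0.0, 1.0, 2.0, 3.0, 4.0])
```

In practice the histogram of the normalized spacings is compared to both reference curves, and the limit as the cutoffs widen decides between Poisson, Wigner, or transitional behavior.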
Statistical line-by-line model for atomic spectra in intermediate...
Office of Scientific and Technical Information (OSTI)
2012_0112_BerylliumStatistics_Attachment7.pdf
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The 2012-2014 Offshore Wind Market and Economic Analysis Reports, authored by the Navigant Consortium, provide a comprehensive annual assessment of the U.S. offshore wind market, giving stakeholders a reliable and consistent data source addressing entry barriers and U.S. competitiveness in the offshore wind market. The 2012 edition contains significant policy and economic
Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations
Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.
2010-12-22
In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining internal structure of a 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of the bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and get insight into the nature of bonding. (2) Projecting to a lower dimensional subspace (≈4-5) using PCA or Kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to the existing descriptors of electronic structure of molecules. Also, these results can be immediately used to develop robust tools for analysis of noisy data obtained during QMC simulations (3) All dimensionality reduction and estimation techniques that we tried seem to
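The projection in point (2) is ordinary PCA; a minimal sketch on synthetic 6-D "trajectory" points (standing in for the walker coordinates) is below, with the variance pattern an assumption for illustration.

```python
# Minimal PCA sketch of the projection step: center the points, diagonalize
# their covariance, and keep the top-k directions. The synthetic 6-D points
# (variance concentrated in two coordinates) stand in for QMC walker data.

import numpy as np

def pca_project(points, k=2):
    """Project points (n x d) onto their top-k principal components."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]   # top-k principal directions
    return centered @ top, np.sort(vals)[::-1][:k]

rng = np.random.default_rng(0)
# coordinate std devs (3, 2, 0.1, 0.1, 0.1, 0.1): two dominant directions
points = rng.normal(size=(500, 6)) * np.array([3.0, 2.0, 0.1, 0.1, 0.1, 0.1])
proj, top_vals = pca_project(points, k=2)
```

The retained eigenvalues quantify how much trajectory variance the low-dimensional subspace captures, which is the diagnostic behind the ≈4-5 dimensional choice reported above.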
THE ABSOLUTE MAGNITUDE OF RRc VARIABLES FROM STATISTICAL PARALLAX
Kollmeier, Juna A.; Burns, Christopher R.; Thompson, Ian B.; Preston, George W.; Crane, Jeffrey D.; Madore, Barry F.; Morrell, Nidia; Prieto, José L.; Shectman, Stephen; Simon, Joshua D.; Villanueva, Edward; Szczygieł, Dorota M.; Gould, Andrew; Sneden, Christopher; Dong, Subo
2013-09-20
We present the first definitive measurement of the absolute magnitude of RR Lyrae c-type variable stars (RRc) determined purely from statistical parallax. We use a sample of 242 RRc variables selected from the All Sky Automated Survey for which high-quality light curves, photometry, and proper motions are available. We obtain high-resolution echelle spectra for these objects to determine radial velocities and abundances as part of the Carnegie RR Lyrae Survey. We find that M_V(RRc) = 0.59 ± 0.10 at a mean metallicity of [Fe/H] = -1.59. This is to be compared with previous estimates for RRab stars (M_V(RRab) = 0.76 ± 0.12) and the only direct measurement of an RRc absolute magnitude (RZ Cephei, M_V(RRc) = 0.27 ± 0.17). We find the bulk velocity of the halo relative to the Sun to be (W_π, W_θ, W_z) = (12.0, -209.9, 3.0) km s⁻¹ in the radial, rotational, and vertical directions, with dispersions (σ_Wπ, σ_Wθ, σ_Wz) = (150.4, 106.1, 96.0) km s⁻¹. For the disk, we find (W_π, W_θ, W_z) = (13.0, -42.0, -27.3) km s⁻¹ relative to the Sun, with dispersions (σ_Wπ, σ_Wθ, σ_Wz) = (67.7, 59.2, 54.9) km s⁻¹. Finally, as a byproduct of our statistical framework, we are able to demonstrate that UCAC2 proper-motion errors are significantly overestimated, as verified by UCAC4.
Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.; Hathaway, John E.; Guillen, Zoe C.; Dirks, James A.; Skorski, Daniel C.; Stephan, Eric G.; Gorrissen, Willy J.; Gorton, Ian; Liu, Yan
2014-01-28
Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we describe the ongoing development work to create an integrated data-intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is not only a question of ease of use; it supports fundamental functions in the correlated analysis of simulation input, execution details, and derived results for multi-variant, complex studies. To this end, the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data-intensive workflow system, and RHIPE, the R-for-Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data-intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern
Statistical Performance Evaluation Of Soft Seat Pressure Relief Valves
Harris, Stephen P.; Gross, Robert E.
2013-03-26
Risk-based inspection methods enable estimation of the probability of failure on demand for spring-operated pressure relief valves at the United States Department of Energy's Savannah River Site in Aiken, South Carolina. This paper presents a statistical performance evaluation of soft seat spring operated pressure relief valves. These pressure relief valves are typically smaller and of lower cost than hard seat (metal to metal) pressure relief valves and can provide substantial cost savings in fluid service applications (air, gas, liquid, and steam) providing that probability of failure on demand (the probability that the pressure relief valve fails to perform its intended safety function during a potentially dangerous over pressurization) is at least as good as that for hard seat valves. The research in this paper shows that the proportion of soft seat spring operated pressure relief valves failing is the same or less than that of hard seat valves, and that for failed valves, soft seat valves typically have failure ratios of proof test pressure to set pressure less than that of hard seat valves.
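The core comparison this abstract describes, whether the failing proportion of soft-seat valves exceeds that of hard-seat valves, can be illustrated with a pooled two-proportion z statistic. The counts below are hypothetical, not the Savannah River Site data.

```python
import math

def two_proportion_z(fail_a, n_a, fail_b, n_b):
    """Z statistic for comparing two failure proportions (pooled estimate).

    Negative z means group A's observed failure proportion is lower.
    """
    p_a, p_b = fail_a / n_a, fail_b / n_b
    p = (fail_a + fail_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 4 failures in 120 soft-seat proof tests
# versus 9 failures in 150 hard-seat proof tests.
z = two_proportion_z(4, 120, 9, 150)
print(f"z = {z:.2f}")
```

With these made-up counts the statistic is negative, i.e. the soft-seat sample proportion is lower, which is the direction of the paper's conclusion; a one-sided test at the 5% level would require z > 1.645 to conclude soft-seat valves fail more often.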
Financial statistics of major US publicly owned electric utilities, 1996
1998-03-01
The 1996 edition of The Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 5 years (1992 through 1996) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Five years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, and operating revenue, and electric energy account data. 2 figs., 32 tabs.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation, followed by stochastic modification of the adapted model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
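The sequential probability ratio tests cited for validation can be sketched as a Wald SPRT on model residuals. The Gaussian residual model, shift size, and error rates below are illustrative assumptions, not the patent's actual parameters.

```python
import math
import random

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald SPRT deciding between residual mean mu0 (model OK) and mu1
    (process change), with target error rates alpha and beta.

    Returns (decision, number of samples consumed).
    """
    upper = math.log((1 - beta) / alpha)   # cross upward -> accept "change"
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept "no change"
    llr = 0.0
    for i, x in enumerate(samples, 1):
        # Log-likelihood ratio increment for Gaussian observations.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
        if llr >= upper:
            return "change", i
        if llr <= lower:
            return "no change", i
    return "undecided", len(samples)

random.seed(1)
healthy = [random.gauss(0.0, 1.0) for _ in range(200)]   # residuals near mu0
shifted = [random.gauss(1.0, 1.0) for _ in range(200)]   # residuals near mu1
print(sprt(healthy))
print(sprt(shifted))
```

The appeal of the SPRT in on-line monitoring is that it consumes only as many samples as needed to reach a decision at the chosen error rates, rather than a fixed batch.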
A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE
Hillier, A.; Morton, R. J.; Erdélyi, R.
2013-12-20
The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to study the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s⁻¹. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.
International petroleum statistics report, January 1992
1992-01-01
The International Petroleum Statistics Report presents data on international oil production, consumption, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil consumption and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/consumption balance for the market economies (i.e., non-communist countries). This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, consumption, and trade in OECD countries. World oil production and OECD consumption data are for the years 1970 through 1990; OECD stocks from 1973 through 1990; and OECD trade from 1982 through 1990.
Maxima of two random walks: Universal statistics of lead changes
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Ben-Naim, E.; Krapivsky, P. L.; Randon-Furling, J.
2016-04-18
In this study, we investigate statistics of lead changes of the maxima of two discrete-time random walks in one dimension. We show that the average number of lead changes grows as $\pi^{-1}\ln t$ in the long-time limit. We present theoretical and numerical evidence that this asymptotic behavior is universal. Specifically, this behavior is independent of the jump distribution: the same asymptotic underlies standard Brownian motion and symmetric Lévy flights. We also show that the probability to have at most n lead changes behaves as $t^{-1/4}(\ln t)^{n}$ for Brownian motion and as $t^{-\beta(\mu)}(\ln t)^{n}$ for symmetric Lévy flights with index μ. The decay exponent $\beta \equiv \beta(\mu)$ varies continuously with the Lévy index when $0 < \mu < 2$, and remains constant at $\beta = 1/4$ for $\mu > 2$.
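The lead-change counting can be reproduced with a small simulation. The setup below, two independent ±1 walks and a modest number of trials, is an assumed illustration, not the authors' code, and at finite t the average only approaches the asymptotic $\pi^{-1}\ln t$ law.

```python
import math
import random

def lead_changes(t, rng):
    """Count lead changes between the running maxima of two ±1 random walks."""
    x = y = mx = my = 0
    leader = 0          # +1 if walk X's maximum leads, -1 if Y's, 0 if tied so far
    changes = 0
    for _ in range(t):
        x += rng.choice((-1, 1))
        y += rng.choice((-1, 1))
        mx, my = max(mx, x), max(my, y)
        if mx != my:
            new = 1 if mx > my else -1
            if leader and new != leader:
                changes += 1
            leader = new
    return changes

rng = random.Random(42)
trials, t = 200, 2000
avg = sum(lead_changes(t, rng) for _ in range(trials)) / trials
print(f"average lead changes over t={t} steps: {avg:.2f}")
print(f"asymptotic prediction ln(t)/pi: {math.log(t) / math.pi:.2f}")
```

Convergence to the logarithmic law is slow, so the simulated average at t = 2000 sits near, but not exactly at, the predicted value of about 2.4.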
Glueballs and statistical mechanics of the gluon plasma
Brau, Fabian; Buisseret, Fabien
2009-06-01
We study a pure gluon plasma in the context of quasiparticle models, where the plasma is considered as an ideal gas of massive bosons. In order to reproduce SU(3) gauge field lattice data within such a framework, we review briefly the necessity to use a temperature-dependent gluon mass which accounts for color interactions between the gluons near T_c and agrees with perturbative QCD at large temperatures. Consequently, we discuss the thermodynamics of systems with temperature-dependent Hamiltonians and clarify the situation about the possible solutions proposed in the literature to treat those systems consistently. We then focus our attention on two possible formulations which are thermodynamically consistent, and we extract the gluon mass from the equation of state obtained in SU(3) lattice QCD. We find that the thermal gluon mass is similar in both statistical formalisms. Finally, an interpretation of the gluon plasma as an ideal gas made of glueballs and gluons is also presented. The glueball mass is consistently computed within a relativistic formalism using a potential obtained from lattice QCD. We find that the gluon plasma might be a glueball-rich medium for T ≲ 1.13 T_c and suggest that glueballs could be detected in future experiments dedicated to the quark-gluon plasma.
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load monitoring. In this two-part paper, inspired by the proliferation of metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g., voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Statistical mechanics of self-driven Carnot cycles
Smith, E.
1999-10-01
The spontaneous generation and finite-amplitude saturation of sound, in a traveling-wave thermoacoustic engine, are derived as properties of a second-order phase transition. It has previously been argued that this dynamical phase transition, called "onset," has an equivalent equilibrium representation, but the saturation mechanism and scaling were not computed. In this work, the sound modes implementing the engine cycle are coarse-grained and statistically averaged, in a partition function derived from microscopic dynamics on criteria of scale invariance. Self-amplification performed by the engine cycle is introduced through higher-order modal interactions. Stationary points and fluctuations of the resulting phenomenological Lagrangian are analyzed and related to background dynamical currents. The scaling of the stable sound amplitude near the critical point is derived and shown to arise universally from the interaction of finite-temperature disorder, with the order induced by self-amplification. © 1999 The American Physical Society
Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes
Uhl, Jonathan T.; Pathak, Shivesh; Schorlemmer, Danijel; Liu, Xin; Swindeman, Ryan; Brinkman, Braden A. W.; LeBlanc, Michael; Tsekenis, Georgios; Friedman, Nir; Behringer, Robert; Denisov, Dmitry; Schall, Peter; Gu, Xiaojun; Wright, Wendelin J.; Hufnagel, Todd; Jennings, Andrew; Greer, Julia R.; Liaw, P. K.; Becker, Thorsten; Dresen, Georg; Dahmen, Karin A.
2015-11-17
Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied with the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tuneability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. In conclusion, the results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
A statistical approach to designing mitigation for induced ac voltages
Dabkowski, J. [Electro Sciences, Inc., Crystal Lake, IL (United States)]
1996-08-01
Induced voltage levels on buried pipelines collocated with overhead electric power transmission lines are usually mitigated by means of grounding the pipeline. Maximum effectiveness is obtained when grounds are placed at discrete locations along the pipeline where the peak induced voltages occur. The degree of mitigation achieved is dependent upon the local soil resistivity at these locations. On occasion it may be necessary to employ an extensive distributed grounding system, for example, a parallel buried wire connected to the pipe at periodic intervals. In this situation the a priori calculation of mitigated voltage levels is sometimes made assuming an average value for the soil resistivity. Over long distances, however, the soil resistivity generally varies as a log-normally distributed random variable. The effect of this variability upon the predicted mitigated voltage levels is examined. It is found that the predicted levels exhibit a statistical variability which precludes a precise determination of the mitigated voltage levels. Thus, post-commissioning testing of the emplaced mitigation system is advisable.
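The spread the abstract describes can be illustrated with a toy Monte Carlo. The linear voltage-resistivity relation and every parameter value below are assumptions for illustration only; the paper's actual mitigation model is not reproduced here.

```python
import math
import random

random.seed(7)

# Assumed log-normal soil resistivity (ln-space parameters, ohm-m)
# and an assumed proportionality constant from resistivity to
# mitigated pipeline voltage.
mu, sigma = math.log(100.0), 0.8
k = 0.02   # volts per ohm-m (hypothetical)

# If voltage is proportional to local resistivity, log-normal resistivity
# makes the predicted voltage log-normal as well.
voltages = sorted(k * random.lognormvariate(mu, sigma) for _ in range(10_000))
p5, p95 = voltages[500], voltages[9500]

# Single-number prediction obtained by plugging in the mean resistivity.
mean_rho_voltage = k * math.exp(mu + sigma**2 / 2)

print(f"5th-95th percentile of predicted voltage: {p5:.2f}-{p95:.2f} V")
print(f"voltage at mean resistivity: {mean_rho_voltage:.2f} V")
```

The wide percentile band around the single mean-resistivity prediction is the statistical variability the paper identifies, and is why it recommends post-commissioning testing rather than relying on the a priori calculation.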
Signatures of initial state modifications on bispectrum statistics
Meerburg, P. Daniel; Schaar, Jan Pieter van der; Corasaniti, Pier Stefano
2009-05-15
Modifications of the initial state of the inflaton field can induce a departure from Gaussianity and leave a testable imprint on the higher-order correlations of the CMB and large-scale structures in the Universe. We focus on the bispectrum statistics of the primordial curvature perturbation and its projection on the CMB. For a canonical single-field action the three-point correlator enhancement is localized, maximizing in the collinear limit, corresponding to enfolded or squashed triangles in comoving momentum space. We show that the available local and equilateral templates are very insensitive to this localized enhancement and do not generate noteworthy constraints on initial-state modifications. On the other hand, when considering the addition of a dimension-8 higher-order derivative term, we find a dominant rapidly oscillating contribution, which had previously been overlooked and whose significantly enhanced amplitude is independent of the triangle under consideration. Nevertheless, the oscillatory nature of (the sign of) the correlation function implies the signal is nearly orthogonal to currently available observational templates, strongly reducing the sensitivity to the enhancement. Constraints on departures from the standard Bunch-Davies vacuum state can be derived, but also depend on the next-to-leading terms. We emphasize that the construction and application of especially adapted templates could lead to CMB bispectrum constraints on modified initial states already competing with those derived from the power spectrum.
Statistical properties of super-hot solar flares
Caspi, Amir; Krucker, Säm; Lin, R. P.
2014-01-20
We use Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) high-resolution imaging and spectroscopy observations from ~6 to 100 keV to determine the statistical relationships between measured parameters (temperature, emission measure, etc.) of hot, thermal plasma in 37 intense (GOES M- and X-class) solar flares. The RHESSI data, most sensitive to the hottest flare plasmas, reveal a strong correlation between the maximum achieved temperature and the flare GOES class, such that 'super-hot' temperatures >30 MK are achieved almost exclusively by X-class events; the observed correlation differs significantly from that of GOES-derived temperatures, and from previous studies. A nearly ubiquitous association with high emission measures, electron densities, and instantaneous thermal energies suggests that super-hot plasmas are physically distinct from cooler, ~10-20 MK GOES plasmas, and that they require substantially greater energy input during the flare. High thermal energy densities suggest that super-hot flares require strong coronal magnetic fields, exceeding ~100 G, and that both the plasma β and the volume filling factor f cannot be much less than unity in the super-hot region.
Fleming, P. A.; van Wingerden, J. W.; Wright, A. D.
2011-12-01
This paper presents the structure of an ongoing controller comparison experiment at NREL's National Wind Technology Center (NWTC), the design process for the two controllers compared in this phase of the experiment, and initial comparison results obtained in field testing. The intention of the study is to demonstrate the advantage of using modern multivariable methods for designing control systems for wind turbines versus conventional approaches. We will demonstrate the advantages through field-test results from experimental turbines located at the NWTC. At least two controllers are being developed side-by-side to meet an incrementally increasing number of turbine load-reduction objectives. The first, a multiple single-input, single-output (m-SISO) approach, uses separately developed, decoupled, and classically tuned controllers, which is, to the best of our knowledge, common practice in the wind industry. The remaining controllers are developed using state-space multiple-input, multiple-output (MIMO) techniques to explicitly account for coupling between loops and to optimize given known frequency structures of the turbine and disturbance. In this first publication from the study, we present the structure of the ongoing controller comparison experiment, the design process for the two controllers compared in this phase, and initial comparison results obtained in field testing.