Broader source: Energy.gov [DOE]
Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico
Beyth, M.; McInteer, C.; Broxton, D.E.; Bolivar, S.L.; Luke, M.E.
1980-06-01
Multivariate statistical analyses were carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Craig quadrangle, Colorado, to support the National Uranium Resource Evaluation and to evaluate strategic or other commercially important mineral resources. A few areas favorable for uranium mineralization are suggested for parts of the Wyoming Basin, Park Range, and Gore Range. Six potential source rocks for uranium are postulated based on factor-score mapping. Vanadium in stream sediments is suggested as a pathfinder for carnotite-type mineralization. A probable northwest trend of lead-zinc-copper mineralization associated with Tertiary intrusions is suggested. A few locations are mapped where copper is associated with cobalt. Concentrations of placer sands containing rare earth elements, probably of commercial value, are indicated for parts of the Sand Wash Basin.
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.
2010-03-01
Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
Method of multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2004-01-06
A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
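As an illustration of the constrained alternating-least-squares factorization D = CS^T described above, here is a minimal numpy sketch on an invented two-component mixture; the non-negativity clip stands in for the patent's constraints, and the weighting/unweighting steps are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented two-component example: 30 "pixels" x 4 spectral channels
true_S = np.array([[1.0, 0.2, 0.0, 0.5],
                   [0.0, 0.7, 1.0, 0.1]])       # component spectra (S^T)
true_C = rng.uniform(0.1, 1.0, size=(30, 2))    # component concentrations
D = true_C @ true_S                             # "measured" data matrix

# Constrained alternating least squares for D ~= C @ St, with
# non-negativity clips standing in for the patent's constraints.
C = rng.uniform(size=(30, 2))                   # random starting guess
for _ in range(500):
    # Solve for the spectra with C fixed, then for C with the spectra fixed
    St = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)
    C = np.clip(np.linalg.lstsq(St.T, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(C @ St - D) / np.linalg.norm(D)
print(residual)
```

Because the synthetic data are exactly rank two and non-negative, the alternating fit should drive the relative residual close to zero.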
Lloyd, K. G.
2007-07-15
Buried irregular interfaces and particulate present special challenges in terms of chemical analysis and identification, and are critical issues in the manufacture of electronic materials and devices. Cross sectioning at the right location is often difficult, and, while dual-beam scanning electron microscopy/focused ion beam instruments can often provide excellent visualization of buried defects, matching chemical analysis may be absent or problematic. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) depth profiling, with its ability to acquire spatially resolved depth profiles while collecting an entire mass spectrum at every 'voxel,' offers a way to revisit the problem of buried defects. Multivariate analysis of the overwhelming amount of data can reduce the output from essentially a depth profile at every mass to a small set of chemically meaningful factors. Data scaling is an important consideration in the application of these methods, and a comparison of scaling procedures is shown. Examples of ToF-SIMS depth profiles of relatively homogeneous layers, severely inhomogeneous layers, and buried particulate are discussed.
Classical least squares multivariate spectral analysis
Haaland, David M.
2002-01-01
An improved classical least squares multivariate spectral analysis method that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions to the CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of PACLS includes the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
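A minimal numpy sketch of the PACLS idea described above, using invented pure-component spectra and a sinusoidal stand-in for an un-modeled drift shape (not the patented algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(1)

K = rng.uniform(size=(2, 50))               # calibrated pure-component spectra
drift = np.sin(np.linspace(0.0, np.pi, 50)) # shape of an un-modeled effect

c_true = np.array([0.3, 0.7])
spectrum = c_true @ K + 0.25 * drift        # unknown sample plus interferent

# Plain CLS prediction: biased, because the drift shape is not modeled
c_plain = np.linalg.lstsq(K.T, spectrum, rcond=None)[0]

# PACLS-style prediction: augment the pure spectra with the drift shape
K_aug = np.vstack([K, drift])
c_aug = np.linalg.lstsq(K_aug.T, spectrum, rcond=None)[0][:2]

print(c_plain, c_aug)
```

With the drift shape included at prediction time, the augmented fit recovers the true concentrations, while the plain CLS fit absorbs the drift into biased concentration estimates.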
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2002-01-01
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Advanced Multivariate Analysis Tools Applied to Surface Analysis
Ohlhausen, James Anthony
2010-08-01
No abstract prepared. Conference paper, Sandia National Laboratories report SAND2010-5304C, OSTI Identifier 1022188, DOE Contract Number AC04-94AL85000.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report April - June 2014
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
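A simplified numpy illustration of the residual-augmentation idea: recover the dominant shape left in the CLS calibration residuals and add it to the prediction model. The data, dimensions, and the use of a single SVD vector are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

K = rng.uniform(size=(2, 40))             # true pure spectra, 2 components
unknown = rng.uniform(size=40)            # un-modeled constituent's spectrum
C = rng.uniform(size=(20, 2))             # known calibration concentrations
u = rng.uniform(size=(20, 1))             # its unknown concentrations
A = C @ K + u @ unknown[None, :]          # calibration spectra

K_hat = np.linalg.lstsq(C, A, rcond=None)[0]    # CLS estimate of pure spectra
R = A - C @ K_hat                               # residuals hold the unknown
shape = np.linalg.svd(R, full_matrices=False)[2][0]  # dominant residual shape

K_aug = np.vstack([K_hat, shape])               # augmented prediction model
c_true = np.array([0.4, 0.6])
s = c_true @ K + 0.8 * unknown                  # new unknown sample
c_pred = np.linalg.lstsq(K_aug.T, s, rcond=None)[0][:2]
print(c_pred)
```

Even though the CLS pure-spectra estimates are contaminated by the un-modeled constituent, the augmented model spans the right subspace and the concentration estimates come out correct.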
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-07-26
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-01-11
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Apparatus and system for multivariate spectral analysis
Keenan, Michael R.; Kotula, Paul G.
2003-06-24
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
March 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report October - December 2014 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government.
Ladd-Lively, Jennifer L
2014-01-01
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
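The PCA-via-SVD monitoring step can be sketched as follows; the six-stream process data, the two-factor structure, and the crude in-sample Q-statistic threshold are invented, not the NUCP model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Normal operation: 6 measured streams driven by 2 latent process factors
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 6))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 6))

# Base-case PCA model via singular value decomposition
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
P = Vt[:2].T                               # retained principal directions

def q_stat(x):
    """Squared residual after projecting onto the PCA subspace."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return float(r @ r)

normal_q = max(q_stat(x) for x in X)       # crude in-sample threshold
fault = X[0] + np.array([0, 0, 1.0, 0, 0, 0])   # step change on stream 3
print(q_stat(fault), normal_q)
```

The faulty sample's residual falls outside the subspace that explains normal variation, so its Q statistic exceeds anything seen in the base case; a real deployment would use a proper control limit rather than the in-sample maximum.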
Nonparametric Multivariate Anomaly Analysis in Support of HPC Resilience
Ostrouchov, George; Naughton, III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R; Scott, Stephen L
2009-01-01
Large-scale computing systems provide great potential for scientific exploration. However, the complexity that accompanies these enormous machines raises challenges for both users and operators. The effective use of such systems is often hampered by failures encountered when running applications on systems containing tens of thousands of nodes and hundreds of thousands of compute cores capable of yielding petaflops of performance. In systems of this size, failure detection is complicated and root-cause diagnosis is difficult. This paper describes our recent work on the identification of anomalies in monitoring data and system logs to provide further insight into machine status, runtime behavior, failure modes, and failure root causes. It discusses the details of an initial prototype that gathers the data and uses statistical techniques for analysis.
Systematic wavelength selection for improved multivariate spectral analysis
Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.
1995-01-01
Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g., the concentration of an analyte such as glucose in blood, or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm, where the selection of wavelength subsets improves the fitness of the model's determination of the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
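To illustrate a fitness function of the form F = f(cost, performance), the sketch below uses a greedy forward search as a simple stand-in for the patented genetic-algorithm machinery (the spectra, the cost weight, and the informative wavelengths are all invented):

```python
import numpy as np

rng = np.random.default_rng(4)

n_wl = 30
informative = [3, 7, 21]                   # wavelengths that carry signal
X = rng.normal(size=(80, n_wl))            # synthetic "spectra"
y = X[:, informative] @ np.array([1.0, -0.5, 0.8]) + 0.05 * rng.normal(size=80)

def fitness(subset):
    """Lower is better: residual error plus a cost per retained wavelength."""
    if not subset:
        return float(y @ y) / len(y)
    Xs = X[:, subset]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    perf = float(np.linalg.norm(Xs @ coef - y) ** 2) / len(y)
    return perf + 0.01 * len(subset)       # cost term penalizes subset size

# Greedy forward selection: keep adding the wavelength that most improves
# the fitness, and stop once no addition pays for its cost.
best, best_f = [], fitness([])
while True:
    candidates = [(fitness(sorted(best + [j])), sorted(best + [j]))
                  for j in range(n_wl) if j not in best]
    f, subset = min(candidates)
    if f >= best_f:
        break
    best, best_f = subset, f
print(best)
```

The cost term is what stops the search: once only noise wavelengths remain, their tiny error reductions no longer outweigh the per-wavelength penalty.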
Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D
2008-01-01
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
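A toy numpy sketch of PCA-based rock-type classification in the spirit of the PCA/SIMCA step (nearest-centroid assignment on synthetic "spectra"; not the authors' LIBS data or SIMCA proper):

```python
import numpy as np

rng = np.random.default_rng(5)

basalt_mean = rng.uniform(size=64)          # invented class-mean spectra
granite_mean = rng.uniform(size=64)
X = np.vstack([basalt_mean + 0.05 * rng.normal(size=(10, 64)),
               granite_mean + 0.05 * rng.normal(size=(10, 64))])
labels = np.array([0] * 10 + [1] * 10)      # 0 = basalt-like, 1 = granite-like

mu = X.mean(axis=0)
Vt = np.linalg.svd(X - mu, full_matrices=False)[2][:3]   # 3 principal axes
scores = (X - mu) @ Vt.T
centroids = np.array([scores[labels == k].mean(axis=0) for k in (0, 1)])

def classify(spectrum):
    """Project onto the PCA axes and pick the nearest class centroid."""
    z = (spectrum - mu) @ Vt.T
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))

unknown = granite_mean + 0.05 * rng.normal(size=64)
print(classify(unknown))
```

SIMCA proper builds a separate PCA model per class and classifies by residual distance; the shared-subspace centroid rule above is the simplest version of the same idea.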
Long, C.L.
1991-02-01
Multivariate calibration techniques can reduce the time required for routine testing and can provide new methods of analysis. Multivariate calibration is commonly used with near infrared reflectance analysis (NIRA) and Fourier transform infrared (FTIR) spectroscopy. Two feasibility studies were performed to determine the capability of NIRA, using multivariate calibration techniques, to perform analyses on the types of samples that are routinely analyzed at this laboratory. The first study included a variety of samples and indicated that NIRA would be well suited to determining selected materials properties such as water content and hydroxyl number of polyol samples, epoxy content of epoxy resins, water content of desiccants, and the amine values of various amine cure agents. A second study was performed to assess the capability of NIRA to perform quantitative analysis of hydroxyl numbers and water contents of hydroxyl-containing materials. Hydroxyl number and water content were selected for determination because these tests are frequently run on polyol materials and the hydroxyl number determination is time consuming. This study pointed out the necessity of obtaining, for each type of polyol or other material, calibration standards identical to the samples being analyzed. Multivariate calibration techniques are frequently used with FTIR data to determine the composition of a large variety of complex mixtures. A literature search indicated many applications of multivariate calibration to FTIR data. Areas identified where quantitation by FTIR would provide a new capability are quantitation of components in epoxy and silicone resins, polychlorinated biphenyls (PCBs) in oils, and additives to polymers. 19 refs., 15 figs., 6 tabs.
Reichardt, Thomas A.; Timlin, Jerilyn Ann; Jones, Howland D. T.; Sickafoose, Shane M.; Schmitt, Randal L.
2010-09-01
Laser-induced fluorescence measurements of cuvette-contained laser dye mixtures are made for evaluation of multivariate analysis techniques to optically thick environments. Nine mixtures of Coumarin 500 and Rhodamine 610 are analyzed, as well as the pure dyes. For each sample, the cuvette is positioned on a two-axis translation stage to allow the interrogation at different spatial locations, allowing the examination of both primary (absorption of the laser light) and secondary (absorption of the fluorescence) inner filter effects. In addition to these expected inner filter effects, we find evidence that a portion of the absorbed fluorescence is re-emitted. A total of 688 spectra are acquired for the evaluation of multivariate analysis approaches to account for nonlinear effects.
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R.
2007-10-16
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
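The factored-representation idea can be sketched with a truncated SVD on a synthetic low-rank image cube (Principal Components Analysis by another route); the dimensions and rank are invented:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic multivariate image: 500 pixels x 128 channels, true rank 5
n_pixels, n_channels, rank = 500, 128, 5
A = rng.uniform(size=(n_pixels, rank)) @ rng.uniform(size=(rank, n_channels))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = int((s > 1e-8 * s[0]).sum())           # keep only significant factors
scores, loadings = U[:, :k] * s[:k], Vt[:k]

# Downstream analysis can operate on (scores, loadings):
# k*(n_pixels + n_channels) numbers instead of n_pixels*n_channels.
compression = (scores.size + loadings.size) / A.size
print(k, compression)
```

Because the synthetic cube is exactly rank 5, the five retained factors reproduce it to machine precision at a small fraction of the storage, which is the premise that makes block-wise operations on the factored form efficient.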
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter
2013-11-15
Highlights:
• A multivariate Input–Output Analysis method for e-waste estimates is proposed.
• Applying multivariate analysis to consolidate data can enhance e-waste estimates.
• We examine the influence of model selection and data quality on e-waste estimates.
• Datasets of all e-waste related variables in a Dutch case study have been provided.
• Accurate modeling of time-variant lifespan distributions is critical for estimates.
Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible, and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock, and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to approaches without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach with multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.
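The three IOA pillars (sales, lifespan, stock) can be linked by a simple discard convolution; the toy numbers below are invented, and the lifespan distribution is held time-invariant for brevity, which the paper itself cautions against for real estimates:

```python
import numpy as np

sales = np.array([100.0, 120, 140, 160, 180])   # units sold in years 0..4
lifespan = np.array([0.0, 0.1, 0.3, 0.4, 0.2])  # P(discarded after t years)

# Waste in year t = sum over earlier sales of (units sold) x P(discard lag)
years = len(sales)
waste = np.zeros(years)
for t_sold in range(years):
    for dt in range(years - t_sold):
        waste[t_sold + dt] += sales[t_sold] * lifespan[dt]

# Stock (units still in use) closes the mass balance with sales and waste
stock = np.cumsum(sales) - np.cumsum(waste)
print(waste, stock)
```

Given any two of the three time series, the third follows from this relationship, which is what lets the consolidation step cross-check data of uneven quality.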
Spatial compression algorithm for the analysis of very large multivariate images
Keenan, Michael R.
2008-07-15
A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
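A one-level 2-D Haar transform is a minimal stand-in for the wavelet step described above; the synthetic image, the keep-top-25% rule, and the normalization are all choices made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(7)
# Smooth-ish synthetic image: double cumulative sum of white noise
img = np.cumsum(np.cumsum(rng.normal(size=(64, 64)), axis=0), axis=1)

def haar2d(x):
    """One-level Haar transform: average/difference rows, then columns."""
    lo, hi = (x[:, ::2] + x[:, 1::2]) / 2, (x[:, ::2] - x[:, 1::2]) / 2
    x = np.hstack([lo, hi])
    lo, hi = (x[::2] + x[1::2]) / 2, (x[::2] - x[1::2]) / 2
    return np.vstack([lo, hi])

def ihaar2d(c):
    """Exact inverse: undo the column pass, then the row pass."""
    n = c.shape[0] // 2
    lo, hi = c[:n], c[n:]
    x = np.empty_like(c)
    x[::2], x[1::2] = lo + hi, lo - hi
    m = x.shape[1] // 2
    lo, hi = x[:, :m], x[:, m:]
    out = np.empty_like(x)
    out[:, ::2], out[:, 1::2] = lo + hi, lo - hi
    return out

coeffs = haar2d(img)
thresh = np.quantile(np.abs(coeffs), 0.75)
kept = np.where(np.abs(coeffs) >= thresh, coeffs, 0)   # keep top 25%
print((kept != 0).mean())
```

For a smooth image most of the energy concentrates in the low-pass quadrant, so the retained quarter of the coefficients reconstructs the image with modest error while downstream analysis touches far fewer values.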
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analyses for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for the critical heat flux ratio (CHFR) and the onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
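The CDF-based limit-setting idea can be sketched with a small Monte Carlo propagation; the factor distributions and the nominal CHFR below are invented, not the NBSR values:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# CHFR = critical heat flux / local heat flux, perturbed by uncertain
# multiplicative hot-channel factors (distributions invented for the sketch)
nominal_chfr = 2.0
power_factor = rng.normal(1.0, 0.05, n)   # local power uncertainty
flow_factor = rng.normal(1.0, 0.03, n)    # channel flow uncertainty
chfr = nominal_chfr * flow_factor / power_factor

# The value exceeded with 95% probability is the 5th percentile of the CDF
chfr_95 = np.quantile(chfr, 0.05)
print(chfr_95)
```

Reading the 5th percentile of the sampled CDF gives the margin statement of the abstract's form: the CHFR stays above this value with 95% probability under the assumed uncertainties.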
The Multi-Isotope Process Monitor: Multivariate Analysis of Gamma Spectra
Orton, Christopher R.; Rutherford, Crystal E.; Fraga, Carlos G.; Schwantes, Jon M.
2011-10-30
The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of nuclear material are not diverted from these facilities. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). The time delay between sampling and subsequent DA provides a potential opportunity to divert the material out of the appropriate chemical stream. Leveraging new on-line nondestructive assay (NDA) techniques in conjunction with the traditional and highly precise DA methods may provide a more timely, cost-effective and resource efficient means for MC&A verification at such facilities. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including the Multi-Isotope Process (MIP) Monitor. The MIP Monitor uses gamma spectroscopy and pattern recognition software to identify off-normal conditions in process streams. Recent efforts have been made to explore the basic limits of using multivariate analysis techniques on gamma-ray spectra. This paper will provide an overview of the methods and report our on-going efforts to develop and demonstrate the technology.
Statistical analysis of random duration times
Engelhardt, M.E.
1996-04-01
This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, the reliability function, and the cumulative hazard function. These results can be applied with either complete or censored data. Several models commonly used with time data are discussed, along with methods for model checking and goodness-of-fit testing. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations with repeated durations, such as repairable systems, are also discussed.
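A standard nonparametric estimate of the kind the report covers is the Kaplan-Meier reliability (survival) function for censored durations; the duration data below are invented:

```python
import numpy as np

times = np.array([2.0, 3.0, 3.0, 5.0, 8.0, 12.0, 12.0, 15.0])
event = np.array([1, 1, 0, 1, 1, 1, 0, 1])   # 0 = censored observation

order = np.argsort(times)                    # stable sort keeps event-first ties
times, event = times[order], event[order]

# Kaplan-Meier: at each observed event, multiply the survival estimate by
# the fraction of at-risk units that survive that instant.
survival = 1.0
n_at_risk = len(times)
km = []                                      # (time, S(t)) after each event
for t, e in zip(times, event):
    if e:
        survival *= (n_at_risk - 1) / n_at_risk
        km.append((t, survival))
    n_at_risk -= 1
print(km)
```

Censored durations contribute to the at-risk count up to their censoring time but never trigger a step, which is how the estimator uses incomplete observations without biasing the curve downward.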
Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring
Dayman, Kenneth J.; Coble, Jamie B.; Orton, Christopher R.; Schwantes, Jon M.
2014-01-01
The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively. 
The errors seen in this work are artificially low, because the models were trained, optimized, and tested on simulated, noise-free data. However, these results indicate that the developed models may generalize well to new data and that the proposed approach constitutes a viable first step in developing a fuel characterization algorithm based on gamma spectra.
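The two-stage scheme described above (classify reactor type, then apply a type-specific regression for burnup) can be sketched as follows. The data are synthetic placeholders rather than ORIGEN-ARP output, a nearest-centroid rule stands in for the classifiers evaluated in the paper (k-NN, discriminant analysis, SVM), and ordinary least squares stands in for partial least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6                        # samples x nuclide-activity features (synthetic)
X = rng.normal(size=(n, p))
reactor = (X[:, 0] > 0).astype(int)  # 0 = PWR, 1 = BWR (synthetic label)
burnup = 30 + 5 * X[:, 1] + 2 * reactor * X[:, 2] + 0.1 * rng.normal(size=n)

# Stage 1: a nearest-centroid stand-in for the reactor-type classifier.
centroids = np.array([X[reactor == r].mean(axis=0) for r in (0, 1)])

def classify(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Stage 2: one regression model per reactor type (ordinary least squares
# stands in for the partial least squares models used in the paper).
A = np.hstack([X, np.ones((n, 1))])
coef = {r: np.linalg.lstsq(A[reactor == r], burnup[reactor == r], rcond=None)[0]
        for r in (0, 1)}

def predict_burnup(x):
    """Route a feature vector through the classifier, then the matching
    type-specific regression model."""
    r = classify(x)
    return float(np.append(x, 1.0) @ coef[r])

err = abs(predict_burnup(X[0]) - burnup[0])
```

The key design point mirrored here is that the burnup model is selected by the classifier's output, so classification errors propagate into the burnup prediction.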
Experimental control analysis of a fuel gas saturator. Final report. [Multivariable
Terwilliger, G.E.; Brower, A.S.; Baheti, R.S.; Smith, R.E.; Brown, D.H.
1985-01-01
The multivariable control of the clean fuel gas saturator of a coal gasification process has been demonstrated. First-principles process models described the process dynamics, from which linear models were generated and used for the actual control designs. The multivariable control was designed, its response to transients was simulated, and the controls were implemented in a computer controller for a fuel gas saturator. The test results obtained for the gas flow transients showed good correlation with the computer simulations, giving confidence in the ability of the simulation to predict the plant performance for other transients. In this study, both time- and frequency-domain multivariable design techniques were applied to provide the best possible design and to determine their relative effectiveness. No clear guidelines resulted; it appears that the selection may be made on the basis of personal preference, experience, or the availability of computer-aided design tools, rather than inherent technical differences. This EPRI/GE fuel gas saturator control demonstration has shown that multivariable design techniques can be applied to a real process and that practical controls can be developed. With suitable process models, presently available computer-aided control design software allows the control design, evaluation, and implementation to be completed in a reasonable time period. The application of these techniques to power generation processes is recommended.
A Divergence Statistics Extension to VTK for Performance Analysis.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-02-01
This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Toolkit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
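The engine itself is exposed as C++ classes in VTK; the underlying idea, quantifying the discrepancy between an observed empirical distribution and a theoretical "ideal" one, can be sketched in Python with the Kullback-Leibler divergence over a shared binning (the specific divergence the engine computes may differ):

```python
import numpy as np
from math import erf, sqrt

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for two discrete distributions on the same bins."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(1)
observed, edges = np.histogram(rng.normal(size=10_000), bins=30, range=(-4, 4))

# Theoretical ("ideal") distribution: standard normal mass per bin.
cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
ideal = np.diff([cdf(e) for e in edges])

d = kl_divergence(observed, ideal)  # small when data match the ideal model
```

A large divergence flags bins where the observed behavior departs from the model, which is the sense in which the engine measures a "distance" between distributions.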
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
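The report's examples are written in R; an analogous chi-square-style fit of a lattice two-point correlator C(t) ~ A * exp(-m * t) can be sketched in Python by linearizing the model and solving least squares (synthetic data and uniform weights, for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(1, 16)
A_true, m_true = 1.5, 0.4
# Synthetic correlator with 1% multiplicative noise.
C = A_true * np.exp(-m_true * t) * (1 + 0.01 * rng.normal(size=t.size))

# Linearized model: log C = log A - m t.
y = np.log(C)
X = np.column_stack([np.ones_like(t, dtype=float), -t.astype(float)])
logA, m_fit = np.linalg.lstsq(X, y, rcond=None)[0]

# Unweighted residual sum of squares (a full fit would weight by the
# covariance of the correlator data).
chi2 = float(np.sum((y - X @ np.array([logA, m_fit])) ** 2))
```

In practice lattice fits use the full covariance matrix of the correlator across configurations; the uniform weighting here is only to keep the sketch short.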
Statistical Software for spatial analysis of stratigraphic data sets
Energy Science and Technology Software Center (OSTI)
2003-04-08
Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages. These include the incorporation of techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.
Feature-Based Statistical Analysis of Combustion Simulation Data
Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T
2011-11-18
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
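As a toy illustration of the per-feature statistics idea (not the merge-tree machinery itself), one can threshold a 1-D scalar field, extract connected features, and build the kind of global diagnostics mentioned above, such as a CDF of feature sizes. The field and threshold are arbitrary inventions:

```python
import numpy as np

rng = np.random.default_rng(3)
field = rng.normal(size=1000)
mask = field > 1.0                      # fixed threshold; merge trees
                                        # encode all thresholds at once

# Connected components of the 1-D mask = "features".
features, current = [], []
for i, m in enumerate(mask):
    if m:
        current.append(i)
    elif current:
        features.append(current)
        current = []
if current:
    features.append(current)

sizes = np.array([len(f) for f in features])
means = [float(field[f].mean()) for f in features]   # per-feature statistic

# Empirical CDF of feature sizes: one of the global diagnostics.
size_cdf = np.cumsum(np.bincount(sizes)) / sizes.size
```

The pre-computed merge-tree meta-data in the framework above plays the role of `features` here, but for every threshold simultaneously and with richer per-feature attributes.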
Lifetime statistics of quantum chaos studied by a multiscale analysis
Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)
2012-04-30
In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.
Statistical Analysis of Abnormal Electric Power Grid Behavior
Ferryman, Thomas A.; Amidan, Brett G.
2010-10-30
Pacific Northwest National Laboratory is developing a technique to analyze Phasor Measurement Unit data to identify typical patterns, atypical events, and precursors to a blackout or other undesirable event. The approach combines a data-driven multivariate analysis with an engineering-model approach. The method identifies atypical events, provides a plain English description of each event, and offers drill-down graphics for detailed investigations. The tool can be applied to the entire grid, individual organizations (e.g., TVA, BPA), or specific substations (e.g., TVA_CUMB). The tool is envisioned for (1) event investigations, (2) overnight processing to generate a Morning Report that characterizes the previous day's activity relative to activity over the preceding 10-30 days, and (3) potentially near-real-time operation to support grid operators. This paper presents the current status of the tool and illustrations of its application to real-world PMU data collected in three 10-day periods in 2007.
Statistical analysis of cascading failures in power grids
Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin
2010-12-01
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for the automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators, and lines. Our model is quasi-static in the causal, discrete-time, and sequential resolution of individual failures. The model, in its simplest realization based on the direct current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39, and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between the average number of removed loads, generators, and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
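A minimal quasi-static cascade in the spirit of this model can be sketched as follows: solve a DC power flow, trip the most overloaded line, and repeat. The 3-bus network, susceptances, capacities, and injections below are illustrative inventions, not one of the IEEE test systems:

```python
import numpy as np

# 3-bus example: injections sum to zero; bus 0 is the angle reference.
injections = np.array([2.0, -1.5, -0.5])
lines = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}     # susceptance per line
capacity = {(0, 1): 1.0, (1, 2): 0.6, (0, 2): 1.2}  # flow limits (invented)

def dc_flows(lines):
    """DC power flow: build the susceptance matrix, solve for angles
    (bus 0 fixed at zero), return the flow on each line."""
    n = 3
    B = np.zeros((n, n))
    for (i, j), b in lines.items():
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])
    return {l: b * (theta[l[0]] - theta[l[1]]) for l, b in lines.items()}

tripped = []
while True:
    flows = dc_flows(lines)
    over = [l for l, f in flows.items() if abs(f) > capacity[l]]
    if not over:
        break
    worst = max(over, key=lambda l: abs(flows[l]))   # sequential resolution
    tripped.append(worst)
    del lines[worst]
    if len(lines) < 2:                               # grid breaks apart; stop
        break
```

Running this, the initial flow on line (0, 1) exceeds its limit, its removal overloads the rest of the network, and the cascade ends with a single surviving line, a toy version of the cascading/islanding distinction analyzed above.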
Multivariate Data EXplorer (MDX)
Energy Science and Technology Software Center (OSTI)
2012-08-01
The MDX toolkit facilitates exploratory data analysis and visualization of multivariate datasets. MDX provides an interactive graphical user interface to load, explore, and modify multivariate datasets stored in tabular form. MDX uses an extended version of the parallel coordinates plot and scatterplots to represent the data. The user can perform rapid visual queries using mouse gestures in the visualization panels to select rows or columns of interest. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots. Users can also export selected data or reconfigure the visualization panel to explore relationships between columns and rows in the data.
ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization
Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; et al.
2009-01-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. Central to these analysis tools are the histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g.
data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs.
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed, and the results of this analysis are reported. The data were generally found to follow a normal distribution, and to be homogenous across composite samples.
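For a normally distributed analyte, the UCL95 on the mean has the standard one-sided form UCL95 = xbar + t(0.95, n-1) * s / sqrt(n). A stdlib-only sketch (the measurement values are invented, and the Student-t critical value is passed in rather than computed):

```python
import math
import statistics

def ucl95_normal(x, t_crit):
    """One-sided 95% upper confidence limit on the mean for normal data.
    t_crit is the one-sided 95% Student-t critical value for n-1 degrees
    of freedom, passed in to keep the sketch stdlib-only."""
    n = len(x)
    xbar = statistics.mean(x)
    s = statistics.stdev(x)          # sample standard deviation
    return xbar + t_crit * s / math.sqrt(n)

# Three measurements per composite, matching the SRNL data layout;
# the numbers themselves are illustrative.
measurements = [10.2, 9.8, 10.4]
ucl = ucl95_normal(measurements, t_crit=2.920)   # t(0.95, 2 dof) = 2.920
```

Non-normal or MDC-censored analytes require the alternative UCL95 procedures in the Singh, Armbya, and Singh [2010] guidance rather than this formula.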
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
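The headline metric above can be sketched directly. This version normalizes the RMSE by plant capacity (other normalizations, e.g. by the mean of the measurements, are also common, and the study's exact convention is not stated here); the forecast/actual numbers are illustrative, not the study's data:

```python
import math

def nrmse(forecast, actual, capacity_kw):
    """Normalized root mean squared error, normalized by plant capacity."""
    n = len(forecast)
    mse = sum((f - a) ** 2 for f, a in zip(forecast, actual)) / n
    return math.sqrt(mse) / capacity_kw

# Illustrative hourly power values (kW) for a 51-kW plant.
forecast = [40.0, 35.0, 20.0, 5.0]
actual = [42.0, 30.0, 22.0, 4.0]
e = nrmse(forecast, actual, capacity_kw=51.0)
```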
Plutonium metal exchange program : current status and statistical analysis
Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.
2004-01-01
The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program will be discussed with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous values and/or censored values), and interesting features of the data and the results.
U.S. Energy Information Administration Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
Analysis of the Impacts of the Clean Power Plan, June 15, 2015, Washington, DC. Requested by Lamar Smith, Chairman of the House Committee on Science, Space, and Technology; the request identifies specific baselines and alternative CPP cases with increasingly stringent emission targets beyond 2030, as well as scenarios with alternative assumptions regarding specific power technologies and fuel market conditions. Analysis was conducted using
Statistical language analysis for automatic exfiltration event detection.
Robinson, David Gerald
2010-04-01
This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
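The scoring idea behind an LDA-based detector can be sketched without training a model: once topics over log tokens are learned, events whose tokens are poorly explained by the learned topic mixture get a low likelihood and are flagged. The topic-word probabilities, events, mixture weights, and threshold below are all hand-made placeholders, not a trained model:

```python
import math

# Two hand-made "topics" over a tiny vocabulary of network-log tokens.
topics = [
    {"GET": 0.5, "200": 0.4, "POST": 0.05, "dns": 0.05},
    {"dns": 0.6, "udp": 0.3, "GET": 0.05, "200": 0.05},
]

def log_likelihood(tokens, mixture):
    """Mean per-token log-likelihood under a fixed topic mixture; unseen
    tokens get a small floor probability."""
    ll = 0.0
    for tok in tokens:
        p = sum(w * t.get(tok, 1e-6) for w, t in zip(mixture, topics))
        ll += math.log(p)
    return ll / len(tokens)

normal_event = ["GET", "200", "GET", "200"]
odd_event = ["udp", "udp", "udp", "POST"]   # exfiltration-like burst

mix = [0.7, 0.3]   # topic mixture assumed for typical traffic
score_normal = log_likelihood(normal_event, mix)
score_odd = log_likelihood(odd_event, mix)
flagged = score_odd < score_normal - 1.0    # simple risk threshold (assumed)
```

Because the score is continuous rather than rule-triggered, it supports the risk-based, continuously updated detection described above.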
NEPA litigation 1988-1995: A detailed statistical analysis
Reinke, D.C.; Robitaille, P.
1997-08-01
The intent of this study was to identify trends and lessons learned from litigated NEPA documents and to compare and contrast these trends among Federal agencies. More than 350 NEPA cases were collected, reviewed, and analyzed. Of the NEPA cases reviewed, more than 170 were appeals or Supreme Court cases, mostly from the late 1980s through 1995. For this time period, the sampled documents represent the majority of the appeals court cases and all the Supreme Court cases. Additionally, over 170 district court cases were also examined as a representative sample of district court decisions on NEPA. Cases on agency actions found to need NEPA documentation (but that had no documentation) and cases on NEPA documents that were found to be inadequate were pooled and examined to determine the factors that were responsible for these findings. The inadequate documents were specifically examined to determine if there were any general trends. The results are shown in detailed statistical terms. Generally, when a Federal agency has some type of NEPA documentation (e.g., CX, EA, or EIS) and at least covers the basic NEPA procedural requirements, the agency typically wins the litigation. NEPA documents that lose generally have serious errors of omission. An awareness and understanding of the errors of omission can help Federal agencies to ensure that they produce winners a greater percentage of the time.
Templates and Examples — Statistics and Search Log Analysis
Broader source: Energy.gov [DOE]
Here you will find custom templates and EERE-specific examples you can use to plan, conduct, and report on your usability and analysis activities. These templates are examples of forms you might use, but you are not required to use them for EERE products.
U.S. Energy Information Administration Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
EIA's crude-by-rail data, EIA Energy Conference, June 16, 2015, Washington, DC. By Mindi Farber-DeAnda, Biofuels and Emerging Technologies Team, Office of Petroleum, Natural Gas, and Biofuels Analysis. Takeaways: at the end of March, EIA published monthly crude-by-rail (CBR) data along with its monthly petroleum supply balances; EIA monthly data provides credible and publicly available information on CBR movements, including historical monthly data starting in 2010. Inter-regional CBR
A Statistical Analysis of Bottom-Hole Temperature Data in the Hinton Area of West-Central Alberta
OpenEI Reference Library, Journal Article
University of Illinois at Chicago; Montana State University; Bhardwaj, Chhavi; Cui, Yang; Hofstetter, Theresa; Liu, Suet Yi; Bernstein, Hans C.; Carlson, Ross P.; Ahmed, Musahid; Hanley, Luke
2013-04-01
Vacuum ultraviolet (VUV) photon energies from 7.87 to 10.5 eV were used in laser desorption postionization mass spectrometry (LDPI-MS) to analyze biofilms comprised of binary cultures of interacting microorganisms. The effect of photon energy was examined using both tunable synchrotron and laser sources of VUV radiation. Principal components analysis (PCA) was applied to the MS data to differentiate species in Escherichia coli-Saccharomyces cerevisiae coculture biofilms. PCA of LDPI-MS also differentiated individual E. coli strains in a biofilm comprised of two interacting gene deletion strains, even though these strains differed from the wild type K-12 strain by no more than four gene deletions each out of approximately 2000 genes. PCA treatment of 7.87 eV LDPI-MS data separated the E. coli strains into three distinct groups: two "pure" groups and a mixed region. Furthermore, the "pure" regions of the E. coli cocultures showed greater variance by PCA when analyzed by 7.87 eV photon energies than by 10.5 eV radiation. Comparison of the 7.87 and 10.5 eV data is consistent with the expectation that the lower photon energy selects a subset of low ionization energy analytes, while 10.5 eV is more inclusive, detecting a wider range of analytes. These two VUV photon energies therefore give different spreads via PCA, and their respective use in LDPI-MS constitutes an additional experimental parameter to differentiate strains and species.
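The PCA step used above can be sketched on synthetic "mass spectra": center the intensity matrix and project onto the leading principal components via the SVD. The spectra, strain labels, and strain-specific peaks are invented to show how two strains separate in the score space:

```python
import numpy as np

rng = np.random.default_rng(4)
n_spectra, n_mz = 60, 100
# Two synthetic strains differing in a handful of peak intensities.
strain = np.repeat([0, 1], n_spectra // 2)
spectra = rng.normal(size=(n_spectra, n_mz))
spectra[strain == 1, :5] += 3.0         # strain-specific peaks (invented)

X = spectra - spectra.mean(axis=0)      # mean-center the intensity matrix
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * S[:2]               # PC1/PC2 scores per spectrum

# The strains separate along PC1: their score means differ clearly.
gap = abs(scores[strain == 0, 0].mean() - scores[strain == 1, 0].mean())
```

The "pure" versus "mixed" regions reported above correspond to clusters and overlaps of such score points in the PC1/PC2 plane.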
STATISTICAL ANALYSIS OF CURRENT SHEETS IN THREE-DIMENSIONAL MAGNETOHYDRODYNAMIC TURBULENCE
Zhdankin, Vladimir; Boldyrev, Stanislav; Uzdensky, Dmitri A.; Perez, Jean C.
2013-07-10
We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.
HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Pettit, Frank K.; Bowie, James U. [DOE-Molecular Biology Institute
HotPatch finds unusual patches on the surface of proteins and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temperature factors) are replaced by patch indices, with statistical scores given in the file's header remarks, and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]
Statistical analysis of multipole components in the magnetic field of the RHIC arc regions
Beebe-Wang,J.; Jain, A.
2009-05-04
The existence of multipolar components in the dipole and quadrupole magnets is one of the factors limiting the beam stability in RHIC operations. Therefore, the statistical properties of the non-linear fields are crucial for understanding the beam behavior and for achieving superior performance in RHIC. In an earlier work [1], the field quality analysis of the RHIC interaction regions (IR) was presented, and a procedure for developing non-linear IR models constructed from measured multipolar data of RHIC IR magnets was described. However, the field quality in the regions outside of the RHIC IR had not yet been addressed. In this paper, we present the statistical analysis of multipolar components in the magnetic fields of the RHIC arc regions. The emphasis is on the lower-order components, especially the sextupole in the arc dipoles and the 12-pole in the quadrupole magnets, since they are shown to have the strongest effects on the beam stability. Finally, the inclusion of the measured multipolar component data of the RHIC arc regions and their statistical properties into tracking models is discussed.
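The multipole components analyzed above enter through the standard 2-D field expansion By + i*Bx = sum_n (b_n + i*a_n) * (z / R_ref)^(n-1), with coefficients quoted in "units" of 1e-4 of the main field. A short sketch, with an invented reference radius and invented sextupole coefficients (not RHIC measurement data):

```python
# Reference radius in meters (assumed value, for illustration only).
R_ref = 0.025

# (b_n, a_n) in units of 1e-4 of the main field; n = 1 is the dipole,
# n = 3 the sextupole. A 2-unit normal / 0.5-unit skew sextupole is invented.
units = {1: (1e4, 0.0), 3: (2.0, 0.5)}

def field(z):
    """Complex field By + i*Bx at transverse position z (meters),
    normalized so the main dipole field is 1."""
    total = 0j
    for n, (b, a) in units.items():
        total += (b + 1j * a) * 1e-4 * (z / R_ref) ** (n - 1)
    return total

# The sextupole error grows quadratically with radius: negligible on
# axis, full strength at the reference radius.
on_axis = field(0.0)
at_ref = field(R_ref)
```

This radial scaling is why the low-order terms (sextupole in dipoles, 12-pole in quadrupoles) dominate the beam-stability impact at realistic beam sizes.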
Kareem, A.; Zhao, J.
1994-12-31
The nonlinearities in the wind and wave loadings of compliant offshore platforms, and in their structural characteristics, result in response statistics that deviate from a Gaussian distribution. This paper focuses on the analysis of the response of these structures to random nonlinear wind and wave loads. As an improvement over the commonly used linearization approach, an equivalent statistical quadratization (ESQ) and cubicization (ESC) approach is presented. The nonlinear loading or structural characteristics can be expressed in terms of an equivalent polynomial that contains terms up to quadratic or cubic order, depending on the type of nonlinearity. The response statistics and cumulants are based on Volterra theory. A direct integration scheme is utilized to evaluate the response cumulants. The results compare well with simulation. It is noted that the ESQ provides an accurate description of systems with asymmetrical nonlinearities, whereas for symmetrical nonlinearities the ESC provides a good representation. Based on the information on higher-order cumulants, the response pdf, crossing rates, and peak value distributions can be derived.
An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-11-01
In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared-memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit, and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units
Ranjan, Niloo; Sanyal, Jibonananda; New, Joshua Ryan
2013-08-01
Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data, covering monthly, daily, hourly, and 15-minute time steps, are produced for each annual simulation. A total of more than 270 TB of data has been produced. In this project, the simulation data is statistically analyzed in-situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation, leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
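The in-situ summaries described above (sum, mean, variance, standard deviation) can be sketched on the CPU with a one-pass (Welford) update; this is a minimal stand-in for illustration only, assuming a simple stream of scalar values, and does not reproduce the project's CUDA/MPI kernels.

```python
import math

def running_stats(values):
    """One-pass (Welford) sum, mean, population variance, and standard
    deviation of an iterable; never holds the full data in memory,
    which is the property that makes this style of update attractive
    for in-situ analysis."""
    n = 0
    total = 0.0
    mean = 0.0
    m2 = 0.0                      # running sum of squared deviations
    for x in values:
        n += 1
        total += x
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    variance = m2 / n if n else float("nan")
    return total, mean, variance, math.sqrt(variance)

total, mean, var, std = running_stats([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

Each GPU thread block in the real workflow would reduce its own partition of the output files and the partial moments would then be merged; the single-stream version above shows only the update rule.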
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
Kobilarov, R. G.
2014-11-18
Statistical analysis of the data set consisting of the activity concentrations of Cs-137 in soils in the Bansko-Razlog region is carried out in order to establish the dependence of the deposition and migration of Cs-137 on the soil type. The descriptive statistics and the test of normality show that the data set does not follow a normal distribution: a positively skewed distribution and possible outlying values of the Cs-137 activity in soils were observed. After reduction of the effects of outliers, the data set is divided into two parts depending on the soil type. Tests of normality of the two new data sets show that both follow a normal distribution. The ordinary kriging technique is used to characterize the spatial distribution of the Cs-137 activity over an area covering 40 km² (the whole Razlog valley). The result, a map of the spatial distribution of the Cs-137 activity concentration, can be used as a reference point for future studies on the assessment of radiological risk to the population and of soil erosion in the study area.
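The first analysis step described, checking a positively skewed data set and reducing the effect of outliers, can be sketched as follows. This is a hedged illustration on synthetic numbers: sample skewness and Tukey fences stand in for the formal normality tests and outlier treatment the study itself used, which are not reproduced here.

```python
def skewness(xs):
    """Fisher-Pearson sample skewness g1 = m3 / m2^(3/2)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def trim_outliers(xs, k=1.5):
    """Drop values outside the Tukey fences (Q1 - k*IQR, Q3 + k*IQR)."""
    s = sorted(xs)
    q1, q3 = s[len(s) // 4], s[(3 * len(s)) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in xs if lo <= x <= hi]
```

On a right-skewed sample, trimming the high outliers pulls the skewness back toward zero, which is the qualitative behavior the abstract reports before splitting the data by soil type.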
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
Nguyen, Hoa T.; Stone, Daithi; Bethel, E. Wes
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of the data, or projections of the data, or both. These approaches still have limitations, such as high computational cost or loss of accuracy. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
Eason, E.D.; Merton, A.A.; Wright, J.E.
1996-05-01
The effects of Li, pH, and H₂ on primary water stress corrosion cracking (PWSCC) of Alloy 600 were investigated for temperatures between 320 and 330°C. Specimens included in the study were reverse U-bends (RUBs) made from several different heats of Alloy 600. The characteristic life, η, which represents the time until 63.2% of the population initiates PWSCC, was computed using a modified Weibull statistical analysis algorithm and was analyzed for effects of the water chemistry variables mentioned above. It was determined that the response is less sensitive to the water chemistry variables than to the metallurgical characteristics defined by the heat, heat treatment, and initial stress state of the specimen (diameter and style of RUB); the maximum impact of chemistry effects was 0.13 to 0.59 standard deviations, compared to a range of three standard deviations for all variables. A first-order model was generated to estimate the effect of changes in pH, Li, and H₂ concentrations on the characteristic life. The characteristic time to initiate cracks, η, is not sensitive to Li and H₂ concentrations in excess of 3.5 ppm and 25 ml/kg, respectively. Below these values, (1) η decreases by approximately 20% when [Li] is increased from 0.7 to 3.5 ppm; (2) η decreases by approximately 9% when [H₂] is increased from 13.1 to 25.0 ml/kg; and (3) η decreases by approximately 14% when pH is increased from 7.0 to 7.4, in each case holding the other two variables constant.
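The characteristic life described above, the time by which 63.2% of specimens have cracked, is the Weibull scale parameter. A basic way to estimate it from ordered failure times is median-rank regression on the linearized Weibull CDF; the sketch below illustrates only that idea on synthetic uncensored data and is not the modified algorithm the report used.

```python
import math

def weibull_fit(times):
    """Estimate (eta, beta) from uncensored failure times by least squares on
    ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta), using Benard's median-rank
    approximation F_i ~= (i - 0.3) / (n + 0.4) for the i-th ordered failure."""
    ts = sorted(times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    beta = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))
    eta = math.exp(xbar - ybar / beta)   # F(eta) = 1 - e^-1 = 63.2%
    return eta, beta
```

On data generated from a known Weibull distribution the fit recovers the scale η (the 63.2% point) and shape β to within a few percent.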
Frome, EL
2005-09-20
Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on maximum likelihood for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with its upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
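The censored-lognormal likelihood at the heart of the method can be written down directly: detects contribute the lognormal density, and each non-detect contributes the probability of falling below its detection limit. The sketch below is an illustration under simplified assumptions, with a coarse grid search standing in for a proper numerical optimizer; the report's R-based implementation is not reproduced.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def cens_loglik(mu, sigma, detects, limits):
    """Log-likelihood for left-censored lognormal data.
    detects: measured values; limits: detection limits of the non-detects."""
    ll = 0.0
    for x in detects:                  # density of a detected value
        z = (math.log(x) - mu) / sigma
        ll += -math.log(x * sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
    for dl in limits:                  # P(X < detection limit) for a non-detect
        ll += math.log(max(phi((math.log(dl) - mu) / sigma), 1e-300))
    return ll

def fit(detects, limits):
    """Crude grid-search MLE over mu in [-3, 3], sigma in [0.1, 3]."""
    best = max(((cens_loglik(m / 100.0, s / 100.0, detects, limits),
                 m / 100.0, s / 100.0)
                for m in range(-300, 301, 5)
                for s in range(10, 301, 5)))
    return best[1], best[2]            # (mu_hat, sigma_hat)
```

From the fitted (μ, σ) the quantities the report works with follow in closed form, e.g. the mean exposure exp(μ + σ²/2) and the exceedance fraction 1 − Φ((ln L − μ)/σ) for a limit L.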
Nazemi, A.H.; Smith, E.P.
1985-11-01
Metal loss from in-bed heat exchangers has been a persistent problem in FBC systems. As part of its program in FBC technology development, the US Department of Energy/Morgantown Energy Technology Center (DOE/METC) supports a number of projects directed toward providing both theoretical and experimental results which will guide solutions to the metal loss problem. As a part of this effort, METC and The MITRE Corporation began a project in 1984 to collect and analyze metal loss data from various experimental, pilot-scale, and full-scale coal-fired FBC systems worldwide. The specific objective of this effort is to investigate the effects of unit design parameters and operating conditions on metal loss through the use of regression and analysis of variance techniques. To date, forty-one FBC systems worldwide have been identified and most of the data sets which characterize the metal loss occurrences in those units have been developed. The results of MITRE's effort to date were reported earlier (Interim Report No. DOE/MC/21351-1930, August 1985). This report describes the statistical procedures that MITRE will follow to analyze FBC metal loss data. The data will be analyzed using several regression techniques to find variables related to metal loss. Correlation and single variable regressions will be used to indicate important relationships. The joint relationships between the explanatory variables and metal loss will be examined by building multiple regression models. In order to prevent erroneous conclusions, diagnostics will be performed based on partial residual plots, residual analysis, and multicollinearity statistics. 7 refs.
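The planned workflow, multiple regression plus multicollinearity diagnostics, can be sketched briefly. The variables below are synthetic placeholders, not the MITRE data sets; ordinary least squares and variance inflation factors (VIF) illustrate the kind of screening the report describes.

```python
import numpy as np

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining explanatory variables; large values
    flag multicollinearity."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        pred = np.column_stack([np.ones(len(X)), others]) @ ols(others, X[:, j])
        resid = np.sum((X[:, j] - pred) ** 2)
        total = np.sum((X[:, j] - X[:, j].mean()) ** 2)
        out.append(1.0 / (resid / total))
    return out
```

A VIF near 1 means a variable carries independent information; values in the tens or higher mean its effect on metal loss cannot be separated from the other regressors, which is exactly the erroneous-conclusion risk the diagnostics are meant to prevent.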
Mathematical and statistical analysis of the effect of boron on yield parameters of wheat
Rawashdeh, Hamzeh; Sala, Florin; Boldea, Marius
2015-03-10
The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on yield and yield parameters of wheat. The contribution of boron to the yield parameters is described by second-degree polynomial equations with high statistical confidence (p < 0.01; F theoretical < F calculated according to the ANOVA test, for alpha = 0.05). Regression analysis, based on the R² values obtained, made it possible to evaluate the particular contribution of boron to each yield parameter. This was lower for spike length (R² = 0.812) and thousand-seed weight (R² = 0.850), and higher for the number of spikelets (R² = 0.936) and the number of seeds per spike (R² = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to flower fertilization is well known. As regards productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R² = 0.868.
Statistical Analysis of Baseline Load Models for Non-Residential Buildings
Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila
2008-11-10
Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
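The morning adjustment evaluated in the paper can be sketched in a few lines. This is a simplified illustration, assuming hourly profiles and a plain average over the proxy days; real programs use specific day-selection rules (e.g. "10 of 10") and adjustment windows that are not reproduced here.

```python
def baseline(proxy_days, event_day, morning_hours=range(0, 12)):
    """Morning-adjusted baseline load profile.
    proxy_days: list of 24-value load profiles for comparable non-event days.
    event_day:  24-value load profile for the event day.
    The raw baseline (mean of the proxy days) is scaled by the ratio of
    event-day to baseline load over the pre-event morning hours."""
    n = len(proxy_days)
    raw = [sum(day[h] for day in proxy_days) / n for h in range(24)]
    adj = (sum(event_day[h] for h in morning_hours)
           / sum(raw[h] for h in morning_hours))   # morning adjustment factor
    return [adj * v for v in raw]
```

If the event-day morning runs 20% hotter than the proxy days (say, due to weather), the whole baseline is scaled up by 1.2, which is the mechanism the paper finds improves accuracy substantially.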
Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.
2009-01-01
Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such a distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that potentially induces post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely and consistently approximates the empirical dose histogram within the injury volume. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
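The EM fit described above can be sketched for the one-dimensional, two-component case. The dose values below are synthetic stand-ins for the voxel doses inside an injury region; the clinical data and the full pipeline (deformable registration, KL comparison) are not reproduced.

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def em_two_gaussians(data, mu1, mu2, sd=5.0, iters=200):
    """EM for the weights, means, and stds of a two-component 1-D
    Gaussian mixture, started from user-supplied mean guesses."""
    w1, w2, sd1, sd2 = 0.5, 0.5, sd, sd
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = [w1 * normal_pdf(x, mu1, sd1)
             / (w1 * normal_pdf(x, mu1, sd1) + w2 * normal_pdf(x, mu2, sd2))
             for x in data]
        # M-step: responsibility-weighted moments
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        sd1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1)
        sd2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2)
        w1, w2 = n1 / len(data), n2 / len(data)
    return (w1, mu1, sd1), (w2, mu2, sd2)
```

In the study, the mean of the lower fitted component (about 35 Gy, or 70% of prescription) is the quantity read off as the candidate critical dose.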
Widen, Joakim; Waeckelgaard, Ewa; Paatero, Jukka; Lund, Peter
2010-03-15
The trend of increasing application of distributed generation with solar photovoltaics (PV-DG) suggests that widespread integration in existing low-voltage (LV) grids is possible in the future. With massive integration in LV grids, a major concern is the possible negative impact of excess power injection from on-site generation. For power-flow simulations of such grid impacts, an important consideration is the time resolution of the demand and generation data. This paper investigates the impact of time averaging on high-resolution data series of domestic electricity demand and PV-DG output, and on voltages in a simulated LV grid. Effects of 10-minute and hourly averaging on descriptive statistics and duration curves were determined. Although time averaging has a considerable impact on statistical properties of the demand in individual households, the impact is smaller on aggregate demand, already smoothed by random coincidence, and on PV-DG output. Consequently, the statistical distribution of simulated grid voltages was also robust against time averaging. The overall judgement is that statistical investigation of voltage variations in the presence of PV-DG does not require a resolution higher than hourly.
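The time-averaging effect studied here is easy to demonstrate on a toy profile: block-averaging a spiky high-resolution series preserves the mean (and hence energy) but shrinks the variance, flattening the duration curve. The profile below is synthetic, chosen only to make the effect visible.

```python
def block_average(series, k):
    """Average consecutive blocks of k samples (len(series) % k == 0)."""
    return [sum(series[i:i + k]) / k for i in range(0, len(series), k)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# A spiky "per-minute" household demand: short appliance bursts on a base load.
minute = [0.2, 0.2, 3.0, 0.2, 0.2, 0.2] * 10
hourly = block_average(minute, 6)   # coarser resolution
```

For a single household the variance loss is large, which is why the paper's finding, that aggregate demand and grid voltages are robust to hourly averaging, is the non-obvious part of the result.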
Time varying, multivariate volume data reduction
Ahrens, James P.; Fout, Nathaniel; Ma, Kwan-Liu
2010-01-01
Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. Our solution addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. It employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs.
temporal compression is data dependent, although temporal compression has the advantage of offering smooth animations, while spatial compression can handle volumes of larger dimensions.
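The vector-quantization core of such a scheme can be sketched compactly: blocks of samples are replaced by indices into a learned codebook, and decompression is a table lookup, which is what makes on-the-fly GPU decoding cheap. The sketch below trains the codebook with plain k-means on 2-sample blocks and is only an illustration of the idea; the paper's multidimensional, hardware-decoded scheme is considerably more elaborate.

```python
def kmeans(blocks, k, iters=20):
    """Learn a k-entry codebook of block centroids (naive initialization
    from the first k blocks)."""
    codebook = blocks[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for b in blocks:
            j = min(range(k), key=lambda i: sum((x - y) ** 2
                                                for x, y in zip(b, codebook[i])))
            groups[j].append(b)
        codebook = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else c
                    for g, c in zip(groups, codebook)]
    return codebook

def encode(blocks, codebook):
    """Replace each block by the index of its nearest codebook entry."""
    return [min(range(len(codebook)),
                key=lambda i: sum((x - y) ** 2 for x, y in zip(b, codebook[i])))
            for b in blocks]

def decode(indices, codebook):
    """Decompression is a pure table lookup."""
    return [codebook[i] for i in indices]
```

The rate-distortion trade-off mentioned in the abstract corresponds to the choice of block size and codebook size k: larger codebooks reconstruct more faithfully but compress less.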
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit; the distribution providing the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by Pearson VI, Pearson V, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, Pearson VI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
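The ranking step described, fitting candidate distributions and ordering them by their chi-square statistic, can be sketched as below. Only two candidate CDFs are shown against synthetic data; the study fitted 25 candidates, and the bin edges here are illustrative.

```python
import math

def normal_cdf(x, mu, sd):
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def expon_cdf(x, rate):
    return 1.0 - math.exp(-rate * x) if x > 0 else 0.0

def chi_square(data, cdf, edges):
    """Pearson chi-square statistic: sum of (O - E)^2 / E over the bins
    defined by consecutive edge pairs."""
    n = len(data)
    stat = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        observed = sum(lo <= x < hi for x in data)
        expected = n * (cdf(hi) - cdf(lo))
        stat += (observed - expected) ** 2 / expected
    return stat

def rank(data, candidates, edges):
    """candidates: {name: cdf}; returns the names sorted best-fit first
    (lowest chi-square), mirroring the ranking rule in the study."""
    return sorted(candidates, key=lambda name: chi_square(data, candidates[name], edges))
```

In practice each candidate's parameters would first be estimated from the data (as done below for the rate and the normal moments) before the statistic is computed.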
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D
2014-06-01
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC), and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC, and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
Dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear methodology, which reflects respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
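Sample entropy, one of the complexity measures named above, counts how often similar length-m patterns remain similar when extended by one sample: SampEn(m, r) = -ln(A/B), where B and A are the numbers of template pairs of lengths m and m+1 whose Chebyshev distance stays within tolerance r. The sketch below is a compact variant (it counts over all templates rather than the usual N-m subset) on synthetic signals, not the patients' RPM traces.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Compact SampEn variant: -ln(A/B) with A, B counted over all
    template pairs of lengths m+1 and m, Chebyshev tolerance r."""
    def count_matches(length):
        templates = [series[i:i + length] for i in range(len(series) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")
```

A perfectly regular signal scores near zero (almost every 2-pattern match extends to a 3-pattern match), while an irregular or chaotic signal scores higher, which is the sense in which SampEn gauges reproducibility of a breathing trace.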
Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics
Greene, David L
2010-01-01
U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size, and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested, and similar results are obtained.
Wolfrum, E. J.; Sluiter, A. D.
2009-01-01
We have studied rapid calibration models to predict the composition of a variety of biomass feedstocks by correlating near-infrared (NIR) spectroscopic data to compositional data produced using traditional wet chemical analysis techniques. The rapid calibration models are developed using multivariate statistical analysis of the spectroscopic and wet chemical data. This work discusses the latest versions of the NIR calibration models for corn stover feedstock and dilute-acid pretreated corn stover. Measures of the calibration precision and uncertainty are presented. No statistically significant differences (p = 0.05) are seen between NIR calibration models built using different mathematical pretreatments. Finally, two common algorithms for building NIR calibration models are compared; no statistically significant differences (p = 0.05) are seen for the major constituents glucan, xylan, and lignin, but the algorithms did produce different predictions for total extractives. A single calibration model combining the corn stover feedstock and dilute-acid pretreated corn stover samples gave less satisfactory predictions than the separate models.
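The calibration idea, relating spectra (X) to wet-chemistry constituent values (y) with a multivariate linear model, can be sketched as below. Plain ridge-regularized least squares stands in for the PLS-style models used in the actual work, and the spectra are synthetic; constituent names such as "glucan" are taken from the abstract but the numbers are invented for illustration.

```python
import numpy as np

def calibrate(X, y, lam=1e-6):
    """Ridge solution b = (X^T X + lam*I)^-1 X^T y for a linear
    spectra-to-composition calibration; predict with X @ b."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8))                 # 40 "spectra", 8 wavelengths
true_b = np.array([0.5, 0.0, 1.2, 0.0, -0.7, 0.0, 0.3, 0.0])
y = X @ true_b                               # synthetic "glucan" values
b = calibrate(X, y)
```

Once calibrated, predicting the composition of a new sample is a single matrix-vector product, which is what makes NIR models "rapid" relative to wet chemistry.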
Analysis of hyper-spectral data derived from an imaging Fourier transform: A statistical perspective
Sengupta, S.K.; Clark, G.A.; Fields, D.J.
1996-01-10
Fourier transform spectrometers (FTS) using optical sensors are increasingly being used in various branches of science. Typically, an FTS generates a three-dimensional data cube with two spatial dimensions and one frequency/wavelength dimension. The number of frequency dimensions in such data cubes is generally very large, often in the hundreds, making data-analysis procedures extremely complex. In the present report, the problem is viewed from a statistical perspective. A set of procedures based on the high degree of inter-channel correlation structure often present in such hyper-spectral data has been identified and applied to an example data set of dimension 100 x 128 x 128 comprising 128 spectral bands. It is shown that in this case the special eigen-structure of the correlation matrix has allowed the authors to extract just a few linear combinations of the channels (the significant principal vectors) that effectively contain almost all of the spectral information in the data set analyzed. This in turn enables them to segment the objects in the given spatial frame using, in a parsimonious yet highly effective way, most of the information contained in the data set.
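The dimensionality-reduction step described, forming the inter-channel correlation matrix and keeping the few leading principal vectors, can be sketched as follows. The data here are synthetic stand-ins for spectral channels; the report's segmentation procedures are not reproduced.

```python
import numpy as np

def leading_principal_vectors(channels, frac=0.99):
    """channels: (n_pixels, n_channels) array. Standardize each channel,
    form the inter-channel correlation matrix, eigendecompose it, and
    return the leading eigenvectors that capture `frac` of the total
    variance, plus all eigenvalues (sorted descending)."""
    Z = (channels - channels.mean(0)) / channels.std(0)
    corr = (Z.T @ Z) / len(Z)
    vals, vecs = np.linalg.eigh(corr)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), frac)) + 1
    return vecs[:, :k], vals
```

When the channels are driven by only a few underlying spectral signatures, as the abstract reports, nearly all the variance collapses onto a correspondingly small number of principal vectors.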
High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions
Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A
2009-03-23
We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and a Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20³ × 128 with an anisotropy parameter ζ = b_s/b_t = 3.5, a spatial lattice spacing of b_s = 0.1227 ± 0.0008 fm, and a pion mass of M_π ≈ 390 MeV. Ground-state baryon masses are extracted with fully quantified uncertainties that are at or below the ≈0.2% level in lattice units. The lowest-lying negative-parity states are also extracted, albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the Nπ threshold and, therefore, the isospin-1/2 πN s-wave scattering phase shift can be extracted using Lüscher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time slices. This is due to backward-propagating states arising from the anti-periodic boundary conditions imposed on the quark propagators in the time direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single-baryon observables.
Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems
Liou, T.S.
1999-12-01
Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subject to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces and, hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to account specifically for the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures with the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of the flow simulations showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. The seepage pattern is dominated by the fraction of asperity contacts and by their shape, size, and spatial correlation.
However, the correlation structure of the permeability field is less important than the spatial correlation of asperity contacts. A faster breakthrough was observed in fractures subjected to higher normal stress, accompanied by a nonlinearly decreasing trend of the effective permeability. Interestingly, seepage dispersion is generally higher in fractures with an intermediate fraction of asperity contacts and lower for small or large fractions of asperity contacts, although it may become higher if ponding becomes significant. Transport simulations indicate that tracers bypass dead-end pores and travel along flow paths that have less flow resistance. Accordingly, tracer breakthrough curves generally show more spreading than breakthrough curves for water. Further analyses suggest that the log-normal time model generally fails to fit the breakthrough curves for water, but is a good approximation for the breakthrough curves of the tracer.
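The simulated-annealing generation of conditioned, spatially correlated contact fields described above can be sketched in a few lines. The following toy Python version (not Liou's simulator; the grid size, objective function, and perturbation rule are illustrative assumptions) anneals a binary contact/open grid toward a target short-range correlation while holding conditioning asperity contacts fixed and preserving the overall contact fraction:

```python
import random, math

def neighbor_pairs_like(grid):
    """Fraction of horizontally/vertically adjacent cell pairs with equal values
    (a crude surrogate for short-range spatial correlation)."""
    n = len(grid)
    like = total = 0
    for i in range(n):
        for j in range(n):
            for di, dj in ((0, 1), (1, 0)):
                ii, jj = i + di, j + dj
                if ii < n and jj < n:
                    total += 1
                    like += grid[i][j] == grid[ii][jj]
    return like / total

def anneal_field(n=16, contact_frac=0.3, target_like=0.85, fixed=(), steps=4000, t0=0.2, seed=1):
    """Simulated annealing: swap a contact/open cell pair (preserving the contact
    fraction) and accept moves that bring the like-neighbor fraction toward
    target_like. 'fixed' lists (i, j) conditioning cells that are never moved."""
    rng = random.Random(seed)
    cells = [(i, j) for i in range(n) for j in range(n)]
    grid = [[0] * n for _ in range(n)]
    for (i, j) in rng.sample(cells, int(contact_frac * n * n)):
        grid[i][j] = 1
    fixedset = set(fixed)
    for (i, j) in fixedset:          # conditioning asperity contacts
        grid[i][j] = 1
    movable = [c for c in cells if c not in fixedset]
    cost = abs(neighbor_pairs_like(grid) - target_like)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9      # linear cooling schedule
        a, b = rng.sample(movable, 2)
        if grid[a[0]][a[1]] == grid[b[0]][b[1]]:
            continue                              # swap would change nothing
        grid[a[0]][a[1]], grid[b[0]][b[1]] = grid[b[0]][b[1]], grid[a[0]][a[1]]
        new = abs(neighbor_pairs_like(grid) - target_like)
        # Metropolis acceptance: always take improvements, sometimes take worse moves
        if new < cost or rng.random() < math.exp((cost - new) / t):
            cost = new
        else:                                     # undo rejected swap
            grid[a[0]][a[1]], grid[b[0]][b[1]] = grid[b[0]][b[1]], grid[a[0]][a[1]]
    return grid, cost
```

The swap move preserves the contact fraction, mirroring the abstract's idea of a perturbation mechanism that respects conditioning data.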
Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2012-10-01
As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. The crucial test fuel conditions (e.g., temperature, fast neutron fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in the model calibration effort to reduce the inherent uncertainty of the simulation results. This paper focuses on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends of measured data observed from statistical analysis may be caused either by measuring-instrument deterioration or by physical mechanisms in the capsules that may have shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5 revealed by the capsule metrology measurements in PIE helps attribute the reduction in TC readings to a physical mechanism rather than to TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, which increases confidence in, and reduces the uncertainty of, the thermal simulation results for the AGR-1 test.
Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2014-05-01
As part of the High Temperature Reactors (HTR) R&D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.
Davis, Adam Christopher; Booth, Steven Richard
2015-08-20
Voluntary Protection Program (VPP) surveys were conducted in 2013 and 2014 to assess the degree to which workers at Los Alamos National Laboratory feel that their safety is valued by their management and peers. The goal of this analysis is to determine whether the difference between the VPP survey scores in 2013 and 2014 is significant, and to present the data in a way such that it can help identify either positive changes or potential opportunities for improvement. Data for several questions intended to identify the demographic groups of the respondent are included in both the 2013 and 2014 VPP survey results. These can be used to identify any significant differences among groups of employees as well as to identify any temporal trends in these cohorts.
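A significance test of the kind described, comparing favorable-response rates between the 2013 and 2014 surveys, can be sketched with a standard two-proportion z-test (the counts below are hypothetical illustrations, not LANL data):

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2, using the pooled
    standard error; x = favorable responses, n = respondents in each year."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF (via math.erf)
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval
```

With, say, 820/1000 favorable in 2014 versus 780/1000 in 2013, the difference is significant at the 5% level but not at the 1% level.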
Brandt, Timothy D.; Spiegel, David S.; McElwain, Michael W.; Grady, C. A.; Turner, Edwin L.; Mede, Kyle; Kuzuhara, Masayuki; Schlieder, Joshua E.; Brandner, W.; Feldt, M.; Wisniewski, John P.; Abe, L.; Biller, B.; Carson, J.; Currie, T.; Egner, S.; Golota, T.; Guyon, O.; Goto, M.; Hashimoto, J.; and others
2014-10-20
We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ~60 M_J brown dwarf companions in the Pleiades, PZ Tel B, and CD−35 2722 B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504 b and GJ 758 B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ~30–100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ~5 M_J, with a single power-law distribution. We find that p(M, a) ∝ M^(−0.65±0.60) a^(−0.85±0.39) (1σ errors) provides an adequate fit to our data, with 1.0%–3.1% (68% confidence) of stars hosting 5–70 M_J companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.
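A toy version of fitting a power law to a substellar sample can be sketched as follows. This is a brute-force grid-search maximum-likelihood fit of a single bounded power law p(a) ∝ a^(−β), not the paper's analytic-integral method; the bounds, sample size, and exponent are illustrative assumptions:

```python
import math, random

def loglike(beta, data, amin, amax):
    """Log-likelihood of a power law p(a) ∝ a**(-beta) truncated to [amin, amax]."""
    if abs(beta - 1.0) < 1e-9:
        norm = math.log(amax / amin)
    else:
        norm = (amax ** (1 - beta) - amin ** (1 - beta)) / (1 - beta)
    return sum(-beta * math.log(a) for a in data) - len(data) * math.log(norm)

def fit_beta(data, amin, amax, grid=None):
    """Maximum-likelihood exponent by exhaustive search over a coarse grid."""
    grid = grid or [i / 100 for i in range(0, 301)]
    return max(grid, key=lambda b: loglike(b, data, amin, amax))

def sample_power_law(beta, amin, amax, n, seed=0):
    """Inverse-CDF sampling of the truncated power law (valid for beta != 1)."""
    rng = random.Random(seed)
    lo, hi = amin ** (1 - beta), amax ** (1 - beta)
    return [((hi - lo) * rng.random() + lo) ** (1 / (1 - beta)) for _ in range(n)]
```

Sampling a synthetic population at β = 0.85 over 10–100 AU and refitting recovers the input exponent to within the statistical error.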
Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Report (Abbreviated) October - December 2015 April 2016 This report was prepared by the U.S. Energy ...
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
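The levelization arithmetic underlying such contract-price comparisons is simple: divide the present value of the payment stream by the present value of the delivered energy. A minimal sketch (the discount rate and cost figures are illustrative assumptions, not the paper's data):

```python
def levelized_price(annual_costs, annual_mwh, discount_rate):
    """Levelized price ($/MWh): discounted total cost over discounted energy."""
    pv_cost = sum(c / (1 + discount_rate) ** t for t, c in enumerate(annual_costs, 1))
    pv_mwh = sum(e / (1 + discount_rate) ** t for t, e in enumerate(annual_mwh, 1))
    return pv_cost / pv_mwh

# Hypothetical 100 MW plant at an 80% capacity factor over 20 years
mwh = [100 * 0.80 * 8760] * 20
costs = [48_000_000] * 20              # assumed all-in annual cost, $
price = levelized_price(costs, mwh, 0.08)   # $/MWh
```

With these invented numbers the result lands near $68/MWh, i.e. about $0.068/kWh, the same order as the combined-cycle average quoted in the abstract.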
Klenzing, J. H.; Earle, G. D.; Heelis, R. A.; Coley, W. R. [William B. Hanson Center for Space Sciences, University of Texas at Dallas, 800 W. Campbell Rd. WT15, Richardson, Texas 75080 (United States)
2009-05-15
The use of biased grids as energy filters for charged particles is common in satellite-borne instruments such as a planar retarding potential analyzer (RPA). Planar RPAs are currently flown on missions such as the Communications/Navigation Outage Forecast System and the Defense Meteorological Satellites Program to obtain estimates of geophysical parameters including ion velocity and temperature. It has been shown previously that the use of biased grids in such instruments creates a nonuniform potential in the grid plane, which leads to inherent errors in the inferred parameters. A simulation of ion interactions with various configurations of biased grids has been developed using a commercial finite-element analysis software package. Using a statistical approach, the simulation calculates collected flux from Maxwellian ion distributions with three-dimensional drift relative to the instrument. Perturbations in the performance of flight instrumentation relative to expectations from the idealized RPA flux equation are discussed. Both single grid and dual-grid systems are modeled to investigate design considerations. Relative errors in the inferred parameters for each geometry are characterized as functions of ion temperature and drift velocity.
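The idealized planar-RPA flux the authors compare against can be written in closed form for a drifting Maxwellian passing an ideal retarding grid: the collected flux is the first velocity moment of the distribution above the cutoff speed set by the grid bias. A sketch in SI units (the default O⁺ ion mass and the parameter values are assumptions for illustration, and real instruments deviate from this ideal, which is the paper's point):

```python
import math

def rpa_flux(n, v_d, T, V, q=1.602e-19, m=2.657e-26):
    """Ion flux (m^-2 s^-1) collected behind an ideal planar retarding grid at
    potential V, for a drifting Maxwellian with density n (m^-3), drift v_d
    (m/s), and temperature T (K). Default mass is atomic oxygen, O+.
    Closed form of integral_{v_c}^{inf} v f(v) dv for a 1-D drifting Maxwellian."""
    k = 1.381e-23
    v_th = math.sqrt(2 * k * T / m)           # thermal speed
    v_c = math.sqrt(2 * q * max(V, 0) / m)    # cutoff speed set by the grid bias
    u0 = (v_c - v_d) / v_th
    return n * (0.5 * v_d * math.erfc(u0)
                + v_th / (2 * math.sqrt(math.pi)) * math.exp(-u0 * u0))
```

For a cold, fast beam the flux at zero bias reduces to n·v_d, and it falls off steeply once the retarding energy exceeds the drift energy.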
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Methods for Environmental Pollution Monitoring. Richard O. Gilbert, Pacific Northwest Laboratory. Van Nostrand Reinhold Company, New York. Copyright © 1987 by Van Nostrand Reinhold Company Inc. Library of Congress Catalog Card Number: 86-26758. ISBN 0-442-23050-8. Work supported by the U.S. Department of
Extracting bb Higgs Decay Signals using Multivariate Techniques
Smith, W Clarke; /George Washington U. /SLAC
2012-08-28
For low-mass Higgs boson production at ATLAS at √s = 7 TeV, the hard subprocess gg → h⁰ → bb̄ dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in bb̄-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics; that is, the algorithms cannot identify every jet that results from bb̄ Higgs decay. We combine these algorithms using the neural net (NN) and boosted regression tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg → h⁰ → bb̄ are generated using Monte Carlo methods with Pythia, and the Toolkit for Multivariate Analysis (TMVA) is then used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the m_h⁰-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow more advanced m_h⁰ reconstructions to be created in the future using multivariate methods.
ORISE: Statistical Analyses of Worker Health
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
appropriate methods of statistical analysis to a variety of problems in occupational health and other areas. Our expertise spans a range of capabilities essential for statistical...
Smith, Steven G.; Skalski, John R.; Schelechte, J. Warren
1994-12-01
Program SURPH is the culmination of several years of research to develop a comprehensive computer program to analyze survival studies of fish and wildlife populations. Development of this software was motivated by the advent of PIT-tag (Passive Integrated Transponder) technology that permits the detection of salmonid smolt as they pass through hydroelectric facilities on the Snake and Columbia Rivers in the Pacific Northwest. Repeated detections of individually tagged smolt and analysis of their capture histories permit estimates of downriver survival probabilities. Eventual installation of detection facilities at adult fish ladders will also permit estimation of ocean survival and upstream survival of returning salmon using the statistical methods incorporated in SURPH.1. However, the utility of SURPH.1 extends far beyond the analysis of salmonid tagging studies. Release-recapture and radiotelemetry studies from a wide range of terrestrial and aquatic species have been analyzed using SURPH.1 to estimate discrete-time survival probabilities and investigate survival relationships. The interactive computing environment of SURPH.1 was specifically developed to allow researchers to investigate the relationship between survival and capture processes and environmental, experimental, and individual-based covariates. Program SURPH.1 represents a significant advancement in the ability of ecologists to investigate the interplay between morphologic, genetic, environmental, and anthropogenic factors on the survival of wild species. It is hoped that this better understanding of risk factors affecting survival will lead to greater appreciation of the intricacies of nature and to improvements in the management of wild resources. This technical report is an introduction to SURPH.1 and provides a user guide for both the UNIX and MS-Windows® applications of the SURPH software.
Chang, Wen-Kuei; Hong, Tianzhen
2013-01-01
Occupancy profile is one of the driving factors behind discrepancies between the measured and simulated energy consumption of buildings. The frequencies of occupants leaving their offices and the corresponding durations of absences have significant impact on energy use and the operational controls of buildings. This study used statistical methods to analyze the occupancy status, based on measured lighting-switch data in five-minute intervals, for a total of 200 open-plan (cubicle) offices. Five typical occupancy patterns were identified based on the average daily 24-hour profiles of the presence of occupants in their cubicles. These statistical patterns were represented by a one-square curve, a one-valley curve, a two-valley curve, a variable curve, and a flat curve. The key parameters that define the occupancy model are the average occupancy profile together with probability distributions of absence duration, and the number of times an occupant is absent from the cubicle. The statistical results also reveal that the number of absence occurrences decreases as total daily presence hours decrease, and the duration of absence from the cubicle decreases as the frequency of absence increases. The developed occupancy model captures the stochastic nature of occupants moving in and out of cubicles, and can be used to generate a more realistic occupancy schedule. This is crucial for improving the evaluation of the energy saving potential of occupancy based technologies and controls using building simulations. Finally, to demonstrate the use of the occupancy model, weekday occupant schedules were generated and discussed.
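A stochastic occupancy schedule of the kind described, one that reproduces an average daily presence profile while generating discrete absence events with random durations, can be sketched with a two-state Markov chain (the profile, leave probability, and five-minute step below are illustrative assumptions, not the paper's fitted model):

```python
import random

def daily_schedule(profile, p_leave=0.05, step_min=5, seed=0):
    """Generate one stochastic day of 0/1 presence at step_min resolution.
    profile: 24 hourly mean-presence probabilities (the average occupancy curve).
    A two-state Markov chain is used; the return probability is set each hour so
    that the chain's stationary occupancy tracks the hourly profile, while
    p_leave controls the frequency (and hence mean duration) of absences."""
    rng = random.Random(seed)
    steps_per_hour = 60 // step_min
    state = rng.random() < profile[0]
    out = []
    for h in range(24):
        occ = min(max(profile[h], 1e-6), 1 - 1e-6)
        # detailed balance: occ * p_leave == (1 - occ) * p_return
        p_return = min(1.0, p_leave * occ / (1 - occ))
        for _ in range(steps_per_hour):
            if state:
                state = rng.random() >= p_leave   # stay with prob 1 - p_leave
            else:
                state = rng.random() < p_return
            out.append(int(state))
    return out
```

Averaging many generated days recovers the input profile, while each individual day shows the stochastic comings and goings the abstract describes.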
Using multivariate analyses to compare subsets of electrodes...
Office of Scientific and Technical Information (OSTI)
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy
Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.
2013-03-01
NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.
Arrowood, L.F.; Tonn, B.E.
1992-02-01
This report presents recommendations relative to the use of expert systems and machine learning techniques by the Bureau of Labor Statistics (BLS) to substantially automate product substitution decisions associated with the Consumer Price Index (CPI). Thirteen commercially available, PC-based expert system shells have received in-depth evaluations. Various machine learning techniques were also reviewed. Two recommendations are given: (1) BLS should use the expert system shell LEVEL5 OBJECT and establish a software development methodology for expert systems; and (2) BLS should undertake a small study to evaluate the potential of machine learning techniques to create and maintain the approximately 350 ELI-specific knowledge bases to be used in CPI product substitution review.
Intrinsic alignments of galaxies in the MassiveBlack-II simulation: Analysis of two-point statistics
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Tenneti, Ananth; Singh, Sukhdeep; Mandelbaum, Rachel; Matteo, Tiziana Di; Feng, Yu; Khandai, Nishikanta
2015-03-11
The intrinsic alignment of galaxies with the large-scale density field is an important astrophysical contaminant in upcoming weak lensing surveys. We present detailed measurements of galaxy intrinsic alignments and the associated ellipticity-direction (ED) and projected shape (wg₊) correlation functions for galaxies in the cosmological hydrodynamic MassiveBlack-II (MB-II) simulation. We carefully assess the effects of iterative, mass- and luminosity-weighted definitions of the (reduced and unreduced) inertia tensor on galaxy shapes, on the misalignment of the stellar component with the dark matter shape, and on the two-point statistics. We find that iterative procedures must be adopted for a reliable measurement of the reduced tensor, but that luminosity versus mass weighting has only negligible effects. Both ED and wg₊ correlations increase in amplitude with subhalo mass (in the range 10¹⁰–6.0 × 10¹⁴ h⁻¹ M⊙), with a weak redshift dependence (from z = 1 to z = 0.06) at fixed mass. At z ~ 0.3, we predict a wg₊ that is in reasonable agreement with SDSS LRG measurements and that decreases in amplitude by a factor of ~5–18 for galaxies in the LSST survey. We also compare the intrinsic alignment of centrals and satellites, with clear detection of satellite radial alignments within the host halos. Finally, we show that wg₊ (using subhalos as tracers of density) and wδ (using the dark matter density) predictions from the simulations agree with those of the non-linear alignment (NLA) model at scales where the 2-halo term dominates the correlations (and we tabulate the associated NLA fitting parameters). The 1-halo term induces a scale-dependent bias at small scales that is not captured by the NLA model.
Multivariate volume visualization through dynamic projections
Liu, Shusen; Wang, Bei; Thiagarajan, Jayaraman J.; Bremer, Peer -Timo; Pascucci, Valerio
2014-11-01
We propose a multivariate volume visualization framework that tightly couples dynamic projections with a high-dimensional transfer function design for interactive volume visualization. We assume that the complex, high-dimensional data in the attribute space can be well-represented through a collection of low-dimensional linear subspaces, and embed the data points in a variety of 2D views created as projections onto these subspaces. Through dynamic projections, we present animated transitions between different views to help the user navigate and explore the attribute space for effective transfer function design. Our framework not only provides a more intuitive understanding of the attribute space but also allows the design of the transfer function under multiple dynamic views, which is more flexible than being restricted to a single static view of the data. For large volumetric datasets, we maintain interactivity during the transfer function design via intelligent sampling and scalable clustering. As a result, using examples in combustion and climate simulations, we demonstrate how our framework can be used to visualize interesting structures in the volumetric space.
Computing contingency statistics in parallel.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-09-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics, where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
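The map-reduce structure described above, per-chunk contingency counts merged by addition and derived statistics computed from the global table, can be sketched as follows (a serial toy illustration, not the VTK implementation):

```python
from collections import Counter
from itertools import product

def count_chunk(pairs):
    """Map step: contingency counts for one chunk of (x, y) observations."""
    return Counter(pairs)

def merge(tables):
    """Reduce step: contingency tables merge by simple element-wise addition,
    which is why the computation parallelizes cleanly."""
    total = Counter()
    for t in tables:
        total.update(t)
    return total

def chi2(table):
    """Pearson chi-squared independence statistic from a joint count table."""
    n = sum(table.values())
    rx, ry = Counter(), Counter()
    for (x, y), c in table.items():      # marginal counts
        rx[x] += c
        ry[y] += c
    stat = 0.0
    for x, y in product(rx, ry):
        expected = rx[x] * ry[y] / n
        stat += (table.get((x, y), 0) - expected) ** 2 / expected
    return stat
```

Perfectly proportional counts give χ² = 0, while a deterministic association gives χ² equal to the sample size, bracketing the statistic's range for a 2×2 table.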
Deep Data Analysis of Conductive Phenomena on Complex Oxide Interfaces: Physics from Data Mining
Strelcov, Evgheni; Belianinov, Alex; Hsieh, Ying-Hui; Jesse, Stephen; Baddorf, Arthur P; Chu, Ying Hao; Kalinin, Sergei V
2014-01-01
Spatial variability of electronic transport in BiFeO3-CoFe2O4 (BFO-CFO) self-assembled heterostructures is explored using spatially resolved first order reversal curve (FORC) current-voltage (IV) mapping. Multivariate statistical analysis of FORC-IV data classifies statistically significant behaviors and maps characteristic responses spatially. In particular, regions of grain, matrix, and grain boundary responses are clearly identified. K-means and Bayesian demixing analysis suggests that the characteristic response can be separated into four components, with hysteretic-type behavior localized at the BFO-CFO tubular interfaces. The conditions under which Bayesian components allow direct physical interpretation are explored, and transport mechanisms at the grain boundaries and in the individual phases are analyzed. This approach conjoins multivariate statistical analysis with physics-based interpretation, actualizing a robust, universal, data-driven approach to problem solving that can be applied to the exploration of local transport and other functional phenomena in other spatially inhomogeneous systems.
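As a reminder of the clustering step used in such analyses, a plain k-means pass over feature vectors (e.g., a per-pixel response curve) can be sketched as follows; this is generic k-means, not the authors' pipeline, and the data are invented:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on a list of numeric tuples; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by squared Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centroids[c])))
        # update step: each centroid becomes the mean of its members
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = tuple(sum(col) / len(members)
                                     for col in zip(*members))
    return centroids, labels
```

On two well-separated blobs the algorithm recovers the obvious grouping regardless of the random initialization.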
Demkin, V. P.; Mel'nichuk, S. V.
2014-09-15
In the present work, we report results of investigations into the dynamics of secondary electrons interacting with helium atoms in the presence of the reverse electric field that arises in the flare of a high-voltage pulsed beam-type discharge and leads to degradation of the primary electron beam. The electric field in a discharge of this type at moderate pressures can reach several hundred V/cm and leads to considerable changes in the kinetics of secondary electrons created during propagation of the electron beam generated in the accelerating gap with a grid anode. Moving in the accelerating electric field toward the anode, secondary electrons create the so-called compensating current to the anode. The character of electron motion and the compensating current itself are determined by the ratio of the field strength to the concentration of atoms (E/n). The energy and angular spectra of secondary electrons are calculated by the Monte Carlo method for different ratios E/n of the electric field strength to the helium atom concentration. The motion of secondary electrons with threshold energy is studied for inelastic collisions with helium atoms, and a differential analysis is carried out of the collisional processes causing energy losses of electrons in helium for different E/n values. The mechanism of creation and accumulation of slow electrons as a result of inelastic collisions of secondary electrons with helium atoms and selective population of metastable states of helium atoms is considered. It is demonstrated that over a wide range of E/n values the motion of secondary electrons in the beam-type discharge flare has the character of drift. At E/n values characteristic of this type of discharge, the drift velocity of these electrons is calculated and compared with the available experimental data.
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes the existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
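The auto-correlative statistic itself is the usual lagged, mean-removed correlation. A minimal serial Python sketch (the VTK engines are C++ and parallel; this only illustrates the quantity being computed):

```python
import math

def autocorrelation(x, max_lag):
    """Sample autocorrelation r_k for lags 0..max_lag: mean-removed lagged
    covariance normalized by the lag-0 variance (the usual time-series
    convention), so r_0 = 1 by construction."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    r = []
    for k in range(max_lag + 1):
        cov = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k))
        r.append(cov / var)
    return r

# a pure sine wave: r is 1 at lag 0, near 0 at a quarter period,
# and near -1 half a period later
series = [math.sin(2 * math.pi * t / 40) for t in range(400)]
r = autocorrelation(series, 20)
```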
Compositional Analysis Laboratory (Poster), NREL (National Renewable...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
with multivariate statistics to produce calibration models for several different biomass types including feedstocks and pretreated materials * Models dramatically decrease the...
Automated Multivariate Optimization Tool for Energy Analysis: Preprint
Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.
2006-07-01
Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis
Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.
2011-10-17
Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention on ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures for a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined simulation method is meant to augment, and provide new insight into, bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one aim of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyzing extreme events and identifying vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ARMFacility Statistics 2015 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Historical Statistics Field Campaigns Operational...
Using multivariate analyses to compare subsets of electrodes and potentials
Office of Scientific and Technical Information (OSTI)
within an electrode array for predicting sugar concentrations in mixed solutions. (Journal Article | SciTech Connect)
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
SPRING 2008 MEETING OF THE AMERICAN STATISTICAL ASSOCIATION COMMITTEE ON ENERGY STATISTICS WITH THE ENERGY INFORMATION ADMINISTRATION, Washington, D.C., Wednesday, April 9, 2008. PARTICIPANTS, COMMITTEE ON ENERGY STATISTICS: NAGARAJ K. NEERCHAL, Department of Mathematics and Statistics, University of Maryland; EDWARD A. BLAIR, University of Houston; BARBARA FORSYTH, University of Maryland; DEREK R. BINGHAM, University of Michigan; CALVIN A. KENT
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectrometry (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation, followed by refinement with MCR-ALS, to yield highly interpretable results.
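The PCA step of such a pipeline can be illustrated with a power iteration that extracts the dominant eigenvector of a covariance matrix (the first principal-component direction). This is a generic minimal sketch of that one step, not the patented method; the names and the fixed iteration count are assumptions.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

using Matrix = std::vector<std::vector<double>>;

// Dominant eigenvector of a symmetric positive semi-definite matrix
// (e.g. a covariance matrix) via power iteration: repeatedly multiply a
// start vector by the matrix and renormalize until it aligns with the
// eigenvector of the largest eigenvalue.
std::vector<double> dominant_eigenvector(const Matrix& a, int iters = 200) {
    const std::size_t d = a.size();
    std::vector<double> v(d, 1.0);  // deterministic start vector
    for (int it = 0; it < iters; ++it) {
        std::vector<double> w(d, 0.0);
        for (std::size_t i = 0; i < d; ++i)
            for (std::size_t j = 0; j < d; ++j)
                w[i] += a[i][j] * v[j];
        double norm = 0.0;
        for (double c : w) norm += c * c;
        norm = std::sqrt(norm);
        for (std::size_t i = 0; i < d; ++i) v[i] = w[i] / norm;
    }
    return v;
}
```

For the covariance matrix {{2,1},{1,2}}, the iteration converges to the direction (1,1)/sqrt(2), the axis of greatest variance.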
Improved Geothermometry Through Multivariate Reaction Path Modeling and
Broader source: Energy.gov (indexed) [DOE]
Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators | Department of Energy. Presentation at the April 2013 peer review meeting held in Denver, Colorado (geothermometry_cooper_peer2013.pdf).
Multivariate classification of infrared spectra of cell and tissue samples
Haaland, David M. (Albuquerque, NM); Jones, Howland D. T. (Albuquerque, NM); Thomas, Edward V. (Albuquerque, NM)
1997-01-01
Multivariate classification techniques are applied to spectra from cell and tissue samples irradiated with infrared radiation to determine whether the samples are normal or abnormal (cancerous). Mid- and near-infrared radiation can be used for in vivo and in vitro classifications based on measurements at multiple wavelengths.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Table of Contents: Summary, p. 1; Mandatory Funding, p. 3; Energy Supply, p. 4; Non-Defense site acceleration
ARM - Historical Visitor Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Visitor Statistics As a national user facility, ARM is required to report...
International Energy Statistics - EIA
Gasoline and Diesel Fuel Update (EIA)
International > International Energy Statistics International Energy Statistics Petroleum Production | Annual Monthly/Quarterly Consumption | Annual Monthly/Quarterly Capacity | Bunker Fuels | Stocks | Annual Monthly/Quarterly Reserves | Imports | Annual Monthly/Quarterly Exports | CO2 Emissions | Heat Content Natural Gas All Flows | Production | Consumption | Reserves | Imports | Exports | Carbon Dioxide Emissions | Heat Content Coal All Flows | Production | Consumption | Reserves | Imports
Various forms of indexing HDMR for modelling multivariate classification problems
Aksu, Çağrı; Tunga, M. Alper
2014-12-10
The Indexing HDMR method was recently developed for modelling multivariate interpolation problems. The method uses the Plain HDMR philosophy in partitioning the given multivariate data set into less-variate data sets and then constructing an analytical structure through these partitioned data sets to represent the given multidimensional problem. Indexing HDMR makes HDMR applicable to classification problems having real-world data. In most cases we do not know all possible class values in the domain of the given problem; that is, we have a non-orthogonal data structure, whereas Plain HDMR needs an orthogonal data structure in the problem to be modelled. In this sense, the main idea of this work is to offer various forms of Indexing HDMR to successfully model these real-life classification problems. To test these different forms, several well-known multivariate classification problems from the UCI Machine Learning Repository were used, and the observed accuracy results lie between 80% and 95%, which is very satisfactory.
Statistical design of a uranium corrosion experiment
Wendelberger, Joanne R; Moore, Leslie M
2009-01-01
This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008), and the work described here focuses on development of a statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned where results can be analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
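A parametric study over the three factors named above (hydrogen pressure, gap size, abrasion) is commonly organized as a factorial design. The sketch below generates a generic full 2^k factorial design matrix at coded low/high levels; it is an illustration of the idea, not the study's actual experiment plan, and the function name is an assumption.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Full 2^k factorial design: one row per experimental run, one column per
// factor, each factor coded at its low (-1) or high (+1) level. Bit f of
// the run index selects the level of factor f.
std::vector<std::vector<int>> full_factorial(std::size_t k) {
    const std::size_t runs = std::size_t{1} << k;
    std::vector<std::vector<int>> design(runs, std::vector<int>(k));
    for (std::size_t r = 0; r < runs; ++r)
        for (std::size_t f = 0; f < k; ++f)
            design[r][f] = ((r >> f) & 1u) ? 1 : -1;
    return design;
}
```

With k = 3 factors this yields 8 runs, balanced so that each factor appears at each level equally often, which is what lets individual effects and interactions be estimated efficiently.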
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
COMMITTEE ON ENERGY STATISTICS, FALL MEETING, FRIDAY, OCTOBER 17, 2003. The Committee met in Room 8E089 in the Forrestal Building, 1000 Independence Avenue, S.W., Washington, D.C., at 8:30 a.m., Jay Breidt, Chair, presiding. PRESENT: F. JAY BREIDT, Chair; NICOLAS HENGARTNER, Vice Chair; JOHNNY BLAIR, Committee Member; MARK BURTON, Committee Member; JAE EDMONDS, Committee Member; MOSHE FEDER, Committee Member; JAMES K. HAMMITT, Committee
AMERICAN STATISTICAL ASSOCIATION (ASA)
U.S. Energy Information Administration (EIA) Indexed Site
AMERICAN STATISTICAL ASSOCIATION (ASA) MEETING OF THE COMMITTEE ON ENERGY STATISTICS WITH THE ENERGY INFORMATION ADMINISTRATION (EIA) Washington, D.C. Friday, April 29, 2005 COMMITTEE MEMBERS: NICOLAS HENGARTNER, Chair Los Alamos National Laboratory MARK BERNSTEIN RAND Corporation CUTLER CLEVELAND Center for Energy and Environmental Studies JAE EDMONDS Pacific Northwest National Laboratory MOSHE FEDER Research Triangle Institute BARBARA FORSYTH Westat WALTER HILL St. Mary's College of Maryland
Statistical methods for environmental pollution monitoring
Gilbert, R.O.
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
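As one example of the nonparametric trend tests the book covers, the Mann-Kendall S statistic can be computed as below. This is a generic illustrative sketch, not the book's Appendix B code, and it omits the variance correction for ties needed for a full significance test.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Mann-Kendall S statistic: the number of concordant pairs minus the
// number of discordant pairs over all i < j in time order. S > 0 suggests
// an upward trend, S < 0 a downward one; tied pairs contribute zero.
int mann_kendall_s(const std::vector<double>& x) {
    int s = 0;
    for (std::size_t i = 0; i + 1 < x.size(); ++i)
        for (std::size_t j = i + 1; j < x.size(); ++j) {
            if (x[j] > x[i]) ++s;
            else if (x[j] < x[i]) --s;
        }
    return s;
}
```

A strictly increasing series of length n attains the maximum S = n(n-1)/2, and a strictly decreasing one its negative.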
Candidate Assembly Statistical Evaluation
Energy Science and Technology Software Center (OSTI)
1998-07-15
The Savannah River Site (SRS) receives aluminum-clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels, and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted, it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Gunst, R. F.
2013-05-01
Phase 3 of the EPAct/V2/E-89 Program investigated the effects of 27 program fuels and 15 program vehicles on exhaust emissions and fuel economy. All vehicles were tested over the California Unified Driving Cycle (LA-92) at 75 degrees F. The program fuels differed in T50, T90, ethanol, Reid vapor pressure, and aromatics. The vehicles tested were new, low-mileage 2008 model year Tier 2 vehicles. A total of 956 test runs were made. Comprehensive statistical modeling and analyses were conducted on methane, carbon dioxide, carbon monoxide, fuel economy, non-methane hydrocarbons, non-methane organic gases, oxides of nitrogen, particulate matter, and total hydrocarbons. In general, the fitted models determined that emissions and fuel economy were complicated functions of the five fuel parameters. An extensive evaluation of alternative model fits produced a number of competing model fits. Many of these alternative fits produce similar estimates of mean emissions for the 27 program fuels but should be carefully evaluated for use with emerging fuels with combinations of fuel parameters not included here. The program includes detailed databases on each of the 27 program fuels on each of the 15 vehicles and on each of the vehicles on each of the program fuels.
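The simplest instance of the kind of model fitting described is ordinary least squares for a single predictor. The actual program fit far richer models over five fuel parameters, so the routine below is purely illustrative; the struct and function names are assumptions.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Fit { double intercept, slope; };

// Ordinary least squares for one predictor: fits y = a + b*x using the
// closed-form solution of the normal equations.
Fit ols(const std::vector<double>& x, const std::vector<double>& y) {
    const double n = static_cast<double>(x.size());
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sx += x[i]; sy += y[i];
        sxx += x[i] * x[i]; sxy += x[i] * y[i];
    }
    const double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    const double a = (sy - b * sx) / n;
    return {a, b};
}
```

In practice, multi-parameter fits like those in the report generalize this to a design matrix with one column per fuel parameter (and interaction terms), solved by the same least-squares principle.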
Derrien, H.; Harvey, J.A.; Larson, N.M.; Leal, L.C.; Wright, R.Q.
2000-05-01
The average {sup 235}U neutron total cross sections were obtained in the energy range 2 keV to 330 keV from high-resolution transmission measurements of a 0.033 atom/b sample [1]. The experimental data were corrected for the contribution of isotope impurities and for resonance self-shielding effects in the sample. The results are in very good agreement with the experimental data of Poenitz et al. [4] in the energy range 40 keV to 330 keV and are the only available accurate experimental data in the energy range 2 keV to 40 keV. ENDF/B-VI evaluated data are 1.7% larger. The SAMMY/FITACS code [2] was used for a statistical model analysis of the total cross section, selected fission cross sections and data in the energy range 2 keV to 200 keV. SAMMY/FITACS is an extended version of SAMMY which allows consistent analysis of the experimental data in the resolved and unresolved resonance regions. The Reich-Moore resonance parameters were obtained [3] from SAMMY Bayesian fits of high-resolution experimental neutron transmission and partial cross section data below 2.25 keV, and the corresponding average parameters and covariance data were used in the present work as input for the statistical model analysis of the high-energy range of the experimental data. The result of the analysis shows that the average resonance parameters obtained from the analysis of the unresolved resonance region are consistent with those obtained in the resolved energy region. Another important result is that the ENDF/B-VI capture cross section could be too small by more than 10% in the energy range 10 keV to 200 keV.
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
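The first-order approximation behind transmitted variation can be written down directly: for a response y = f(x1, ..., xd) with independent inputs, the variance transmitted to y is approximately the gradient-weighted sum of the input variances. The generic "delta method" routine below illustrates that concept; it is a sketch, not the method of the presentation.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// First-order ("delta method") approximation of transmitted variation:
// for y = f(x1, ..., xd) with independent inputs, evaluated at the input
// means mu,
//   Var(y) ~= sum_i [df/dxi(mu)]^2 * Var(xi).
double transmitted_variance(const std::vector<double>& grad_at_mu,
                            const std::vector<double>& input_variances) {
    double v = 0.0;
    for (std::size_t i = 0; i < grad_at_mu.size(); ++i)
        v += grad_at_mu[i] * grad_at_mu[i] * input_variances[i];
    return v;
}
```

For example, a response y = 3x with Var(x) = 2 transmits a variance of 9 * 2 = 18, and steeper response functions transmit proportionally more input variation, which is why this matters for designed experiments.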
Statistically significant relational data mining :
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2004 FY 2005 FY 2006 Comparable Comparable Request to FY 2006 vs. FY 2005 Approp Approp Congress Discretionary Summary By Appropriation Energy And Water Development Appropriation Summary: Energy Programs Energy supply Operation and maintenance................................................. 787,941 909,903 862,499 -47,404 -5.2% Construction......................................................................... 6,956
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2005 FY 2006 FY 2007 Current Current Congressional Approp. Approp. Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy supply and conservation Operation and maintenance............................................ 1,779,399 1,791,372 1,917,331 +125,959 +7.0%
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2006 FY 2007 FY 2008 Current Congressional Congressional Approp. Request Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy supply and conservation Operation and maintenance........................................... 1,781,242 1,917,331 2,187,943 +270,612 +14.1%
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2011 FY 2012 FY 2013 Current Enacted Congressional Approp. Approp. * Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy efficiency and renewable energy........................................ 1,771,721 1,809,638 2,337,000 +527,362 +29.1% Electricity delivery and energy reliability.........................................
Topology for statistical modeling of petascale data.
Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice
2011-07-01
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
ORISE: Statistical Analyses of Worker Health
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Statistical Analyses Statistical analyses at the Oak Ridge Institute for Science and Education (ORISE) support ongoing programs involving medical surveillance of workers and other populations, as well as occupational epidemiology and research. ORISE emphasizes insightful and accurate analysis, practical interpretation of results and clear, easily read reports. All analyses are preceded by extensive data scrubbing and verification. ORISE's approach relies on applying appropriate methods of
Key World Energy Statistics-2010 | Open Energy Information
World Energy Statistics-2010. Agency/Company/Organization: International Energy Agency. Sector: Energy. Topics: Market analysis. Resource Type: Dataset, Maps. Website: www.iea.org...
Implementing Bayesian Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Implementing Bayesian Statistics and a Full Systematic Uncertainty Propagation with the Soft X-Ray Tomography Diagnostic on the Madison Symmetric Torus, by Jay Johnson. A thesis submitted in partial fulfillment of the requirements for the degree of Bachelor of Science (Physics) at the University of Wisconsin - Madison, 2013. Abstract: The Madison Symmetric Torus uses multiple diagnostics to measure electron temperature (Te). The soft x-ray (SXR) diagnostic measures Te from x-ray emission in
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2007 FY 2008 FY 2009 Current Current Congressional Op. Plan Approp. Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy efficiency and renewable energy.......................... -- 1,722,407 1,255,393 -467,014 -27.1% Electricity delivery and energy reliability........................... -- 138,556 134,000 -4,556 -3.3% Nuclear
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2008 FY 2009 FY 2009 FY 2010 Current Current Current Congressional Approp. Approp. Recovery Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy efficiency and renewable energy....................................... 1,704,112 2,178,540 16,800,000 2,318,602 +140,062 +6.4% Electricity delivery and energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) FY 2010 FY 2011 FY 2011 FY 2012 Current Congressional Annualized Congressional Approp. Request CR Request $ % Discretionary Summary By Appropriation Energy And Water Development, And Related Agencies Appropriation Summary: Energy Programs Energy efficiency and renewable energy....................................... 2,216,392 2,355,473 2,242,500 3,200,053 +983,661 +44.4% Electricity delivery and energy
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Primout, M.; Babonneau, D.; Jacquet, L.; Gilleron, F.; Peyrusse, O.; Fournier, K. B.; Marrs, R.; May, M. J.; Heeter, R. F.; Wallace, R. J.
2015-11-10
We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the Ne, Te and Ti characteristics of the target plasma from the spectral analysis (ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
International petroleum statistics report
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
International petroleum statistics report
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Big-Data RHEED analysis for understanding epitaxial film growth processes
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V
2014-10-28
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite its widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of RHEED image sequences. This approach is illustrated for growth of La(x)Ca(1-x)MnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors and the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big-data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
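The k-means step of such an analysis can be illustrated with a minimal one-dimensional Lloyd iteration. This is a generic sketch with deterministic initialization and a fixed iteration count, not the authors' analysis code; real RHEED work clusters high-dimensional image features rather than scalars.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal 1-D k-means (Lloyd's algorithm, k >= 2). Initial centers are
// spread over the sorted data range; each iteration assigns every point
// to its nearest center and recomputes the centers as cluster means.
// Returns the final centers in ascending order.
std::vector<double> kmeans1d(std::vector<double> x, std::size_t k,
                             int iters = 50) {
    std::sort(x.begin(), x.end());
    std::vector<double> c(k);
    for (std::size_t j = 0; j < k; ++j)
        c[j] = x[j * (x.size() - 1) / (k - 1)];
    for (int it = 0; it < iters; ++it) {
        std::vector<double> sum(k, 0.0);
        std::vector<std::size_t> cnt(k, 0);
        for (double v : x) {
            std::size_t best = 0;
            for (std::size_t j = 1; j < k; ++j)
                if (std::fabs(v - c[j]) < std::fabs(v - c[best])) best = j;
            sum[best] += v;
            ++cnt[best];
        }
        for (std::size_t j = 0; j < k; ++j)
            if (cnt[j] > 0) c[j] = sum[j] / cnt[j];
    }
    return c;
}
```

Two well-separated groups of intensities, for instance, converge to centers at the two group means, which is the mechanism that lets k-means separate growth regimes in a feature space.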
Statistical physics "Beyond equilibrium"
Ecke, Robert E
2009-01-01
The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.
Independent Statistics & Analysis Drilling Productivity Report
Gasoline and Diesel Fuel Update (EIA)
with +/- signs and color-coded arrows to highlight the growth or decline in oil (brown) or natural gas (blue). New-well oil/gas production per rig charts present historical...
International petroleum statistics report
1996-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.
International petroleum statistics report
1996-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report
1995-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
1997-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report
1995-07-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
OPEN SESSION Washington, D.C. Thursday, October 23, 2008 PARTICIPANTS: Committee Members: NAGARAJ K. NEERCHAL, Chair University of Maryland Baltimore County DEREK BINGHAM Simon Fraser University EDWARD A. BLAIR University of Houston STEPHEN BROWN Energy Economics and Microeconomic Policy Analysis CUTLER CLEVELAND Center for Energy and Environmental Studies MOSHE FEDER Research Triangle Institute International WALTER W. HILL St. Mary's College of Maryland VINCE IANNACCHIONE RTI International
AMERICAN STATISTICAL ASSOCIATION
U.S. Energy Information Administration (EIA) Indexed Site
Washington, D.C. Friday, October 24, 2008 PARTICIPANTS: Committee Members: NAGARAJ K. NEERCHAL, Chair University of Maryland Baltimore County DEREK BINGHAM Simon Fraser University EDWARD A. BLAIR University of Houston STEPHEN BROWN Energy Economics and Microeconomic Policy Analysis CUTLER CLEVELAND Center for Energy and Environmental Studies MOSHE FEDER Research Triangle Institute International BARBARA FORSYTH University of Maryland Center for Advanced Study of Language WALTER W. HILL St.
Angular-momentum nonclassicality by breaking classical bounds on statistics
Luis, Alfredo; Rivas, Angel
2011-10-15
We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimum and definite conclusions are obtained without evaluation of moments, or any other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.
ARM - Historical Field Campaign Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Historical Field Campaign Statistics ARM Climate Research Facility users regularly conduct...
Design and performance of a scalable, parallel statistics toolkit.
Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre
2010-11-01
Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last three years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics clearer. Furthermore, it employs a design pattern specifically targeted for distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question is continuous or quasi-diffuse, but do scale well when the data is discrete and compact. We confirm and extend our earlier results by now establishing near-optimal scalability with up to 10,000 processes.
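The distributed-memory design pattern for moment-based statistics mentioned above can be sketched generically: each process reduces its partition to a small summary (count, mean, sum of squared deviations), and summaries are merged pairwise with the standard numerically stable update formulas. This is an assumed illustration of the pattern, not the toolkit's actual API; the partitioning into four lists stands in for four processes.

```python
import math
import random

def summarize(xs):
    """Single-pass count/mean/M2 (Welford) for one data partition."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return n, mean, m2

def merge(a, b):
    """Combine two partial summaries without touching the raw data."""
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(10000)]
parts = [data[i::4] for i in range(4)]          # 4 simulated "processes"

total = summarize(parts[0])
for p in parts[1:]:
    total = merge(total, summarize(p))

n, mean, m2 = total
variance = m2 / (n - 1)

# The merged result agrees with a serial computation on the full data.
ns, means, m2s = summarize(data)
assert n == ns
assert math.isclose(mean, means, abs_tol=1e-9)
assert math.isclose(m2, m2s, rel_tol=1e-9)
print("sample variance:", round(variance, 3))
```

Because `merge` is associative up to rounding, the same summaries can be combined in a tree reduction, which is what gives the near-linear scaling the abstract reports for moment-based statistics.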
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Lupoi, Jason S.; Smith-Moritz, Andreia; Singh, Seema; McQualter, Richard; Scheller, Henrik V.; Simmons, Blake A.; Henry, Robert J.
2015-07-10
Background: Slow-degrading, fossil fuel-derived plastics can have deleterious effects on the environment, especially marine ecosystems. The production of bio-based, biodegradable plastics from or in plants can assist in supplanting those manufactured using fossil fuels. Polyhydroxybutyrate (PHB) is one such biodegradable polyester that has been evaluated as a possible candidate for relinquishing the use of environmentally harmful plastics. Results: PHB, possessing similar properties to polyesters produced from non-renewable sources, has been previously engineered in sugarcane, thereby creating a high-value co-product in addition to the high biomass yield. This manuscript illustrates the coupling of a Fourier-transform infrared microspectrometer, equipped with a focal plane array (FPA) detector, with multivariate imaging to successfully identify and localize PHB aggregates. Principal component analysis imaging facilitated the mining of the abundant quantity of spectral data acquired using the FPA for distinct PHB vibrational modes. PHB was measured in the chloroplasts of mesophyll and bundle sheath cells, acquiescent with previously evaluated plant samples. Conclusion: This study demonstrates the power of IR microspectroscopy to rapidly image plant sections to provide a snapshot of the chemical composition of the cell. While PHB was localized in sugarcane, this method is readily transferable to other value-added co-products in different plants.
High Performance Multivariate Visual Data Exploration for Extremely Large Data
Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,
2008-08-22
One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.
2012-02-01
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
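One of the automated aids such tools provide, correlation-driven axis arrangement for parallel coordinates, can be illustrated with a small sketch. The greedy ordering on |Pearson r| below is a generic approach under assumed data, not MDX's actual algorithm; the variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
a = rng.normal(size=n)
# Four variables; x2 is deliberately made strongly correlated with x0.
data = np.column_stack([a,
                        rng.normal(size=n),
                        0.9 * a + 0.1 * rng.normal(size=n),
                        rng.normal(size=n)])
names = ["x0", "x1", "x2", "x3"]

# Absolute correlation matrix; zero the diagonal so argmax finds a pair.
corr = np.abs(np.corrcoef(data, rowvar=False))
np.fill_diagonal(corr, 0.0)

# Greedy chain: start with the most correlated pair, then repeatedly
# append the unused axis most correlated with the current chain end.
i, j = np.unravel_index(corr.argmax(), corr.shape)
order = [int(i), int(j)]
while len(order) < len(names):
    remaining = [k for k in range(len(names)) if k not in order]
    order.append(max(remaining, key=lambda k: corr[order[-1], k]))
print([names[k] for k in order])
```

The correlated pair (x0, x2) ends up adjacent, which is exactly the property that makes correlation structure visible as parallel line bundles on a parallel-coordinates canvas.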
Exploratory Data analysis ENvironment eXtreme scale (EDENx)
Energy Science and Technology Software Center (OSTI)
2015-07-01
EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.
Topological Cacti: Visualizing Contour-based Statistics
Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio
2011-05-26
Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology, often represented in the form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as the width of the cactus trunk and the length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.
Web Analytics and Statistics | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
User Experience Research & Statistics » Web Analytics and Statistics Web Analytics and Statistics EERE uses Google Analytics to capture statistics on its websites. These statistics help website managers measure and report on users, sessions, most visited pages, and more. The Web Template Coordinator can provide you with EERE's username and password and answer questions about your site statistics. Adding Google Analytics to EERE Websites In order for Google Analytics to capture statistics on
Key China Energy Statistics 2012
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-05-01
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
Key China Energy Statistics 2011
Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia
2012-01-15
The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.
QUANTUM MECHANICS WITHOUT STATISTICAL POSTULATES
G. GEIGER; ET AL
2000-11-01
The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way without a reduction postulate. Due to the chaotic motion of the hidden classical particle all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single system theory.
Ideas for Effective Communication of Statistical Results
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Anderson-Cook, Christine M.
2015-03-01
Effective presentation of statistical results to audiences with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for presenting statistical results effectively.
VTPI-Transportation Statistics | Open Energy Information
Area: Transportation Resource Type: Dataset Website: www.vtpi.orgtdmtdm80.htm Cost: Free VTPI-Transportation Statistics Screenshot References: VTPI-Transportation Statistics1...
IEA Energy Statistics | Open Energy Information
Statistics Jump to: navigation, search Tool Summary LAUNCH TOOL Name: IEA Energy Statistics AgencyCompany Organization: International Energy Agency Sector: Energy Topics: GHG...
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE...
Office of Scientific and Technical Information (OSTI)
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014 Citation Details In-Document Search Title: STATISTICAL PERFORMANCE ...
On the ability of Order Statistics to distinguish different models for continuum gamma decay
Sandoval, J. J.; Cristancho, F.
2007-10-26
A simulation procedure is presented for calculating parameters important to the application of Order Statistics in the analysis of continuum gamma decay.
Office of Survey Development and Statistical Integration
U.S. Energy Information Administration (EIA) Indexed Site
Steve Harvey April 27, 2011 | Washington, D.C. Tough Choices in U.S. EIA's Data Programs Agenda * Office of Oil, Gas, and Coal Supply Statistics * Office of Petroleum and Biofuels Statistics * Office of Electricity, Renewables, and Uranium Statistics * Office of Energy Consumption and Efficiency Statistics * Office of Survey Development and Statistical Integration 2 Presenter name, Presentation location, Presentation date Coal Data Collection Program 3 James Kendell Washington, DC, April 27,
Moore honored with American Statistical Association award
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
American Statistical Association Award Moore honored with American Statistical Association award Lisa Moore is the recipient of the 2013 Don Owen Award presented by the American Statistical Association, San Antonio Chapter. May 24, 2013 Leslie "Lisa" Moore Leslie "Lisa" Moore The American Statistical Association (ASA) is the world's largest community of statisticians. It was founded in Massachusetts in 1839. Leslie "Lisa" Moore of the Laboratory's Statistical
Moore named an American Statistical Society Fellow
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Moore named an American Statistical Society Fellow Moore named an American Statistical Society Fellow The ASA inducted Leslie (Lisa) Moore as a Fellow at the 2014 Joint Statistical Meetings. October 8, 2014 Leslie (Lisa) Moore Leslie (Lisa) Moore ASA cited Moore for "seminal and creative research on the design of computer experiments; for statistical collaboration on a wide range of problems of scientific and national importance; and for mentoring statisticians and statistical
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environment impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns? 
How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
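The two directions contrasted in these notes can be made concrete on the dice example. A minimal sketch, with assumed parameters: forward probability (fair dice assumed a priori) and the inverse, statistical problem (inferring face probabilities from observed rolls, where the maximum-likelihood estimate is the observed relative frequency).

```python
from fractions import Fraction
from collections import Counter
import random

# Forward problem: with fair dice assumed a priori, compute the
# probability that two dice sum to 7 by enumerating all 36 outcomes.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
p_seven = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
print(p_seven)  # 1/6

# Inverse problem (statistics): given only observed rolls, estimate the
# probability of each face. The MLE for a categorical distribution is
# the observed relative frequency of each face.
random.seed(0)
rolls = [random.choice([1, 2, 3, 4, 5, 6]) for _ in range(6000)]
counts = Counter(rolls)
mle = {face: counts[face] / len(rolls) for face in range(1, 7)}
```

As the notes emphasize, the inverse direction is harder: the estimates fluctuate around 1/6 and only converge as the number of rolls grows, and quantifying that uncertainty is where the real statistical work begins.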
Tomography and weak lensing statistics
Munshi, Dipak; Coles, Peter; Kilbinger, Martin E-mail: peter.coles@astro.cf.ac.uk
2014-04-01
We provide generic predictions for the lower order cumulants of weak lensing maps, and their correlators for tomographic bins as well as in three dimensions (3D). Using the small-angle approximation, we derive the corresponding one- and two-point probability distribution functions for the tomographic maps from different bins and for 3D convergence maps. The modelling of weak lensing statistics is obtained by adopting a detailed prescription for the underlying density contrast that involves the hierarchical ansatz and a lognormal distribution. We study the dependence of our results on cosmological parameters and source distributions corresponding to realistic surveys such as LSST and DES. We briefly outline how photometric redshift information can be incorporated in our results. We also show how topological properties of convergence maps can be quantified using our results.
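The lower-order cumulants discussed here can be illustrated on a mock map. The sketch below draws a synthetic lognormal "convergence" field (the one-point model adopted in the abstract), shifts it to zero mean as appropriate for convergence, and measures its second and third cumulants; the map size and sigma are illustrative assumptions, not survey values.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 0.3

# Mock convergence map with lognormal one-point statistics, shifted to
# zero mean (the convergence field averages to zero over the sky).
kappa = rng.lognormal(mean=0.0, sigma=sigma, size=(512, 512))
kappa -= kappa.mean()

k2 = np.mean(kappa**2)      # second cumulant (variance)
k3 = np.mean(kappa**3)      # third cumulant
S3 = k3 / k2**2             # reduced skewness, as used in hierarchical models
print("positively skewed:", bool(k3 > 0))
```

A lognormal map is positively skewed, so the third cumulant is nonzero even though the mean vanishes; it is exactly this non-Gaussian information that the cumulant correlators of the paper are designed to capture.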
Quantum chaos and statistical nuclear physics
Not Available
1986-01-01
This book contains 33 selections. Some of the titles are: Chaotic motion and statistical nuclear theory; Test of spectrum and strength fluctuations with proton resonances; Nuclear level densities and level spacing distributions; Spectral statistics of scale invariant systems; and Antiunitary symmetries and energy level statistics.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
I/O Statistics Last 30 Days These plots show the daily statistics for the last 30 days for the storage systems at NERSC, in terms of the amount of data transferred and the number of files transferred. Daily I/O Volume Daily I/O Count
STORM: A STatistical Object Representation Model
Rafanelli, M. ); Shoshani, A. )
1989-11-01
In this paper we explore the structure and semantic properties of the entities stored in statistical databases. We call such entities "statistical objects" (SOs) and propose a new "statistical object representation model" based on a graph representation. We identify a number of SO representational problems in current models and propose a methodology for their solution. 11 refs.
Statistics and Discoveries at the LHC (3/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (1/4)
None
2011-10-06
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Beyth, M.; Broxton, D.; McInteer, C.; Averett, W.R.; Stablein, N.K.
1980-06-01
Multivariate statistical analysis to support the National Uranium Resource Evaluation and to evaluate strategic and other commercially important mineral resources was carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Montrose quadrangle, Colorado. The analysis suggests that: (1) the southern Colorado Mineral Belt is an area favorable for uranium mineral occurrences; (2) carnotite-type occurrences are likely in the nose of the Gunnison Uplift; (3) uranium mineral occurrences may be present along the western and northern margins of the West Elk crater; (4) a base-metal mineralized area is associated with the Uncompahgre Uplift; and (5) uranium and base metals are associated in some areas, and both are often controlled by faults trending west-northwest and north.
On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization
Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang
2015-02-01
The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. Extending the BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to linear model of coregionalization (LMC) cross-covariances. We develop different strategies to improve the MCMC mixing and to invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with existing multiple BTGP and BTMGP models in test cases and in a multiphase flow computer experiment for a full-scale regenerator of a carbon capture unit. The BTMGP with the LMC cross-covariance predicted the computer experiments better than the existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. For computer experiments we also develop an adaptive sampling strategy for the BTMGP with the LMC cross-covariance function.
Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik; Sun, Xin; Lin, Guang
2014-05-16
Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. High-resolution simulations are often computationally expensive and become impractical for parametric studies at different input values. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and prior distributions facilitates the different Markov chain Monte Carlo (MCMC) moves. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and the BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
Statistics for characterizing data on the periphery
Theiler, James P; Hush, Donald R
2010-01-01
We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
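The Gaussian-ellipsoid baseline that the abstract above builds on can be sketched briefly; this is an illustrative example of Mahalanobis-distance scoring with synthetic data, not the authors' code or their proposed peripheral statistics.

```python
# Sketch of the covariance-based anomaly-scoring baseline: points on the
# periphery of an ellipsoidal data distribution get large Mahalanobis scores.
import numpy as np

rng = np.random.default_rng(0)
data = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.5], [0.5, 1.0]], size=1000)

mean = data.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(data, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of x from the sample mean."""
    d = x - mean
    return float(d @ cov_inv @ d)

# An obvious outlier scores far higher than a typical point from the sample.
print(mahalanobis_sq(np.array([10.0, 10.0])) > mahalanobis_sq(data[0]))
```

The paper's point is that the sample covariance used here can be replaced by alternatives that model the periphery more efficiently; the scoring mechanism stays the same.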
Statistical assessment of Monte Carlo distributional tallies
Kiedrowski, Brian C; Solomon, Clell J
2010-12-09
Four tests are developed to assess the statistical reliability of distributional or mesh tallies. To this end, the relative variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality.
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014
Office of Scientific and Technical Information (OSTI)
Statistics for Industry Groups and Industries, 2003
2009-01-18
Statistics for the U.S. Department of Commerce including types of manufacturing, employees, and products as outlined in the Annual Survey of Manufacturers (ASM).
Structure Learning and Statistical Estimation in Distribution...
Office of Scientific and Technical Information (OSTI)
Part I of this paper discusses the problem of learning the operational structure of the ...
Statistical methods for nuclear material management
Bowen W.M.; Bennett, C.A.
1988-12-01
This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.
Moore honored with American Statistical Association award
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Before his death in 1991, Professor Owen was the Distinguished Professor of Statistics at Southern Methodist University in Dallas, Texas. His illustrious career serves as the ...
EERE Web Site Statistics - Social Media
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
EERE Web Site Statistics - Social Media Custom View: 10110 - 93011 October 1, 2010 ... site compels visitors to return. Updating web site content is one way to draw return ...
ANNUAL FEDERAL EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT...
National Nuclear Security Administration (NNSA)
STATISTICAL REPORT OF DISCRIMINATION COMPLAINTS (REPORTING PERIOD BEGINS OCTOBER 1ST AND ... COUNSELOR/INVESTIGATOR F. COMPLAINTS IN LINE E CLOSED DURING REPORT PERIOD a. FULL-TIME b. ...
ANNUAL FEDERAL EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT...
National Nuclear Security Administration (NNSA)
4 EEOC FORM 462 (REVISED APR 2011) Report Status: Finalized, 11/06/2014 6:14 PM 1 PART I - ... EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT OF DISCRIMINATION COMPLAINTS (REPORTING ...
Detailed Monthly and Annual LNG Import Statistics (2004-2012...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Detailed Monthly and Annual LNG Import Statistics (2004-2012)
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai, Alex; Banta, Larry; Tucker, David; Gemmen, Randall
2010-08-01
This work presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation, built by the National Energy Technology Laboratory, comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The public facility provides for the testing and simulation of different fuel cell models that in turn help identify the key difficulties encountered in the transient operation of such systems. An empirical model of the facility, comprising a simulated fuel cell cathode volume and balance-of-plant components, is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in transfer function form. A multivariate system is then built from the individual transfer functions, creating a matrix that serves as the nominal plant in an H{sub {infinity}} robust control algorithm. The controller's main objective is to track and maintain hybrid operational constraints on the fuel cell's cathode airflow and on the turbomachinery states of temperature and speed under transient disturbances. This algorithm is then tested on a Simulink/MATLAB platform for various perturbations of load and fuel cell heat effluence. As a complementary tool to the aforementioned empirical plant, a nonlinear analytical model faithful to the existing process and instrumentation arrangement is designed and evaluated in the Simulink environment. This parallel task is intended to serve as a building block for scalable hybrid configurations that might require a more detailed nonlinear representation for a wide variety of controller schemes and hardware implementations.
Federal offshore statistics: leasing - exploration - production - revenue
Essertier, E.P.
1984-01-01
Federal Offshore Statistics is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the Federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Statistics are presented under the following topics: (1) highlights, (2) leasing, (3) exploration and development, (4) production and revenue, (5) federal offshore production by ranking operator, 1983, (6) reserves and undiscovered recoverable resources, and (7) oil pollution in the world's oceans.
MAVTgsa: An R Package for Gene Set (Enrichment) Analysis
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; Chen, James J.
2014-01-01
Gene set analysis methods aim to determine whether an a priori defined set of genes shows statistically significant differences in expression for either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
2010-06-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
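The derived statistics the abstract mentions are easy to illustrate on a small example; the sketch below is a plain serial computation (not the paper's parallel, map-reduce-style implementation) of a contingency table, the chi-squared independence statistic, and pointwise mutual information, using made-up categorical data.

```python
# Illustrative sketch: contingency table from paired categorical observations,
# plus two statistics derivable from it (chi-squared independence, PMI).
import math
from collections import Counter

pairs = [("a", "x"), ("a", "x"), ("a", "y"), ("b", "y"), ("b", "y"), ("b", "x")]
table = Counter(pairs)                      # joint counts per (row, column) cell
n = sum(table.values())
row = Counter(r for r, _ in pairs)          # row marginal counts
col = Counter(c for _, c in pairs)          # column marginal counts

# Chi-squared statistic: sum over cells of (observed - expected)^2 / expected.
chi2 = 0.0
for (r, c), observed in table.items():
    expected = row[r] * col[c] / n
    chi2 += (observed - expected) ** 2 / expected

# Pointwise mutual information for one cell: log p(r,c) / (p(r) p(c)).
pmi_ax = math.log((table[("a", "x")] / n) / ((row["a"] / n) * (col["x"] / n)))
print(round(chi2, 3), round(pmi_ax, 3))
```

The communication cost the paper analyzes comes from the fact that `table` grows with the number of distinct category pairs, unlike the fixed-size accumulators of moment-based statistics.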
Computing contingency statistics in parallel : design trade-offs and limiting cases.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2010-03-01
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Last 30 Days These plots show the daily statistics for the last 30 days for the storage systems at NERSC in terms of the amount of data transferred and the number of files...
EERE Web Site Engagement Statistics: FY09
Broader source: Energy.gov (indexed) [DOE]
WEB SITE ENGAGEMENT STATISTICS TECHNOLOGY ADVANCEMENT AND OUTREACH | 01 TABLE OF CONTENTS ... Views 02 Average Visit Duration 03 Top 20 Web Sites by Visits 03 Top 20 Visited Pages 04 ...
Statistical criteria for characterizing irradiance time series.
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or simulated using models. Simulations of irradiance are often calibrated to, or generated from, statistics for observed irradiance, and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
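The general idea of characterizing an irradiance time series by summary statistics can be sketched as follows; the series, the statistic names, and the ramp-rate measure are illustrative stand-ins, not the three criteria the report actually defines.

```python
# Hedged sketch: summarize a (synthetic) hourly irradiance series by its mean,
# its variability, and the distribution of step-to-step changes ("ramps").
import statistics

irradiance = [0, 120, 340, 560, 720, 810, 790, 640, 430, 210, 50, 0]  # W/m^2

mean_ghi = statistics.fmean(irradiance)
stdev_ghi = statistics.stdev(irradiance)

# Ramp between consecutive samples; large ramps matter for PV power systems.
ramps = [b - a for a, b in zip(irradiance, irradiance[1:])]
max_ramp = max(abs(r) for r in ramps)

print(round(mean_ghi, 1), max_ramp)
```

Comparing such statistics between a simulated series and an observed one is the kind of validation criterion the abstract describes.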
Statistical analysis of variations in impurity ion heating at...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
... heating of majority ions has been studied; 16 and most recently, the charge and mass dependency of impurity ion heating has been investigated. 17 Despite many laboratory ...
Statistical Analysis of Variation in the Human Plasma Proteome
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Corzett, Todd H.; Fodor, Imola K.; Choi, Megan W.; Walsworth, Vicki L.; Turteltaub, Kenneth W.; McCutchen-Maloney, Sandra L.; Chromy, Brett A.
2010-01-01
Quantifying the variation in the human plasma proteome is an essential prerequisite for disease-specific biomarker detection. We report here on the longitudinal and individual variation in human plasma characterized by two-dimensional difference gel electrophoresis (2-D DIGE) using plasma samples from eleven healthy subjects collected three times over a two week period. Fixed-effects modeling was used to remove dye and gel variability. Mixed-effects modeling was then used to quantitate the sources of proteomic variation. The subject-to-subject variation represented the largest variance component, while the time-within-subject variation was comparable to the experimental variation found in a previous technical variability study where one human plasma sample was processed eight times in parallel and each was then analyzed by 2-D DIGE in triplicate. Here, 21 protein spots had larger than 50% CV, suggesting that these proteins may not be appropriate as biomarkers and should be carefully scrutinized in future studies. Seventy-eight protein spots showing differential protein levels between different individuals or individual collections were identified by mass spectrometry and further characterized using hierarchical clustering. The results present a first step toward understanding the complexity of longitudinal and individual variation in the human plasma proteome, and provide a baseline for improved biomarker discovery.
U.S. Energy Information Administration Independent Statistics & Analysis
U.S. Energy Information Administration (EIA) Indexed Site
Tight Oil Production Trends in a Low Price Environment For EIA Conference June 15, 2015 | Washington, DC by Grant Nülle, Upstream Oil & Gas Economist Office of Petroleum, Natural Gas and Biofuels Outline * Effects low prices are having on U.S. oil production * EIA's short-term outlook for production * Reasons why U.S. output may be more resilient than otherwise thought Nülle | U.S. Tight Oil Trends, June 15, 2015 2 U.S. oil supply: reasons for reduced output in the short term * Oil price
Statistical Analysis of Transient Cycle Test Results in a 40...
Broader source: Energy.gov (indexed) [DOE]
Effects of "new" engine testing procedures (40 CFR Part 1065) with respect to repeatability of transient engine dynamometer tests were examined, as well as the effects of ...
Canadian National Energy Use Database: Statistics and Analysis...
Combined statistical and dynamical assessment of simulated vegetation-rainfall in North Africa during the mid-Holocene
Office of Scientific and Technical Information (OSTI)
A negative feedback of vegetation cover on subsequent annual precipitation is simulated for the mid-Holocene over ...
FY 2015 Statistical Table by Appropriation
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring) Statistical Table by Appropriation Page 1 FY 2015 Congressional Request FY 2013 FY 2014 FY 2014 FY 2014 FY 2015 Current Enacted Adjustment Current Congressional Approp. Approp. Approp. Request Discretionary Summary By Appropriation Energy And Water Development And Related Agencies Appropriation Summary: Energy Programs Energy efficiency and renewable energy............................... 1,691,757 1,900,641 ---- 1,900,641
FY 2015 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
of Energy FY 2015 Statistical Table by Organization (dollars in thousands - OMB Scoring) Statistical Table by Organization Page 1 FY 2015 Congressional Request FY 2013 FY 2014 FY 2014 FY 2014 FY 2015 Current Enacted Adjustments Current Congressional Approp. Approp. Approp. Request Discretionary Summary By Organization National Nuclear Security Administration Weapons Activities........................................................................... 6,966,855 7,781,000 ---- 7,781,000 8,314,902
Statistical Fault Detection & Diagnosis Expert System
Energy Science and Technology Software Center (OSTI)
1996-12-18
STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
Complex statistics and diffusion in nonlinear disordered particle chains
Antonopoulos, Ch. G.; Bountis, T.; Skokos, Ch.; Drossos, L.
2014-06-15
We investigate dynamically and statistically diffusive motion in a Klein-Gordon particle chain in the presence of disorder. In particular, we examine a low-energy (subdiffusive) and a higher-energy (self-trapping) case and verify that subdiffusive spreading is always observed. We then carry out a statistical analysis of the motion, in both cases, in the sense of the Central Limit Theorem and present evidence of different chaotic behaviors for various groups of particles. Integrating the equations of motion for times as long as 10{sup 9}, we find that our probability distribution functions always tend to Gaussians, and we show that the dynamics does not relax onto a quasi-periodic Kolmogorov-Arnold-Moser torus and that diffusion continues to spread chaotically for arbitrarily long times.
Statistics of particle time-temperature histories.
Hewson, John C.; Lignell, David O.; Sun, Guangyuan
2014-10-01
Particles in non-isothermal turbulent flow are subject to a stochastic environment that produces a distribution of particle time-temperature histories. This distribution is a function of the dispersion of the non-isothermal (continuous) gas phase and the distribution of particles relative to that gas phase. In this work we extend the one-dimensional turbulence (ODT) model to predict the joint dispersion of a dispersed particle phase and a continuous phase. The ODT model predicts the turbulent evolution of continuous scalar fields with a model for the cascade of fluctuations to smaller scales (the 'triplet map') at a rate that is a function of the fully resolved one-dimensional velocity field. Stochastic triplet maps also drive Lagrangian particle dispersion at finite Stokes numbers, with inertial and eddy trajectory-crossing effects included. Two distinct approaches to this coupling between triplet maps and particle dispersion are developed and implemented, along with a hybrid approach. An 'instantaneous' particle displacement model matches the tracer particle limit and provides an accurate description of particle dispersion. A 'continuous' particle displacement model translates triplet maps into a continuous velocity field to which particles respond. Particles can alter the turbulence, and modifications to the stochastic rate expression are developed for two-way coupling between particles and the continuous phase. Each aspect of model development is evaluated in canonical flows (homogeneous turbulence, free-shear flows, and wall-bounded flows) for which quality measurements are available. ODT simulations of non-isothermal flows provide statistics for particle heating. These simulations show the significance of accurately predicting the joint statistics of particle and fluid dispersion. Inhomogeneous turbulence, coupled with the influence of the mean flow fields on particles of varying properties, alters particle dispersion.
The joint particle-temperature dispersion leads to a distribution of temperature histories predicted by the ODT model. Predictions are shown for the lower moments and the full distributions of the particle positions, particle-observed gas temperatures, and particle temperatures. An analysis of the time scales affecting particle-temperature interactions covers Lagrangian integral time scales based on temperature autocorrelations, rates of temperature change associated with particle motion relative to the temperature field, and rates of diffusional change of temperatures. These latter two time scales have not been investigated previously; they are shown to be strongly intermittent, having peaked distributions with long tails. The logarithm of the absolute value of these time scales exhibits a distribution closer to normal. Acknowledgements: This work is supported by the Defense Threat Reduction Agency (DTRA) under their Counter-Weapons of Mass Destruction Basic Research Program in the area of Chemical and Biological Agent Defeat under award number HDTRA1-11-4503I to Sandia National Laboratories. The authors would like to express their appreciation for the guidance provided by Dr. Suhithi Peiris to this project and to the Science to Defeat Weapons of Mass Destruction program.
NREL: Energy Analysis - Michael Gleason
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Gleason Michael Gleason is a member of the Data Analysis and Visualization Group in the Strategic Energy Analysis Center. Scientist III - GIS On staff since April 2014 Phone number: 303-275-4109 E-mail: Michael.Gleason@nrel.gov Areas of expertise Geospatial analysis and modeling Scientific programming Multivariate data visualization Technical writing and editing Primary research interests Modeling technical resource potential for renewable technologies Modeling market diffusion of distributed
n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator
Energy Science and Technology Software Center (OSTI)
2012-09-12
nSIGHTS (n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator) is a comprehensive well test analysis software package. It provides a user interface, a well test analysis model, and many tools to analyze both field and simulated data. The well test analysis model simulates a single-phase, one-dimensional, radial/non-radial flow regime, with a borehole at the center of the modeled flow system. nSIGHTS solves the radially symmetric n-dimensional forward flow problem using a solver based on a graph-theoretic approach. The results of the forward simulation are pressure and flow rate, given all the input parameters. The parameter estimation portion of nSIGHTS uses a perturbation-based approach to interpret the best-fit well and reservoir parameters, given an observed dataset of pressure and flow rate.
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1983-01-01
The statistics in this update of the Outer Continental Shelf Statistics publication document what has happened since federal leasing began on the Outer Continental Shelf (OCS) in 1954. Highlights note that of the 29.8 million acres actually leased from 175.6 million acres offered for leasing, 20.1% were in frontier areas. Total revenues for the 1954-1982 period were $58.9 billion with about 13% received in 1982. The book is divided into six parts covering highlights, leasing, exploration and development, production and revenue, reserves and undiscovered recoverable resources, and pollution problems from well and tanker accidents. 5 figures, 59 tables.
DOE - NNSA/NFO -- FOIA Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The FOIA has become a useful tool for researchers, news media, and the general public. In October 1996, Congress enacted the Electronic Freedom of Information Act Amendments of 1996, one provision of which (5 U.S.C. 552(a)(6)(A)(i)) extended the agency response period from 10 days to 20 days, and another of which (5 U.S.C. 552(e)) requires agencies to make fiscal year FOIA Annual Reports available to the public. This
FRAMES Software System: Linking to the Statistical Package R
Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.
2006-12-11
This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.
Summary Statistics for Homemade 'Play Dough' -- Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Martz, A; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough{trademark}-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU{sub D} at 100kVp to a low of about 1200 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 10. LLNL prepared about 50mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images.
(A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first order statistics;' those of the gradient image, 'second order statistics.'
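As a rough illustration of the procedure described above, the following sketch computes first-order statistics (mean, standard deviation, KDE-based entropy) for a 2D array of LAC values, and second-order statistics for its one-voxel horizontal gradient. This is a minimal sketch in Python, not the report's IMGREC pipeline; the function name, the 256-point evaluation grid, and the use of SciPy's gaussian_kde are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def first_and_second_order_stats(image, bins=256):
    """First-order stats of an image and second-order stats of its
    horizontal digital gradient (illustrative sketch only)."""
    def stats(vals):
        # KDE-based estimate of the value distribution, then discrete entropy
        kde = gaussian_kde(vals)
        grid = np.linspace(vals.min(), vals.max(), bins)
        p = kde(grid)
        p /= p.sum()                               # normalize to a discrete pmf
        entropy = -np.sum(p * np.log2(p + 1e-12))  # entropy in bits
        return vals.mean(), vals.std(), entropy

    first = stats(image.ravel())
    # Gradient image: |image - the same image offset by one voxel along rows|
    grad = np.abs(image[:, 1:] - image[:, :-1])
    second = stats(grad.ravel())
    return first, second
```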
Summary Statistics for Fun Dough Data Acquired at LLNL
Kallman, J S; Morales, K E; Whipple, R E; Huber, R D; Brown, W D; Smith, J A; Schneberk, D J; Martz, Jr., H E; White, III, W T
2010-03-11
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a Play Dough{trademark}-like product, Fun Dough{trademark}, designated as PD. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2100 LMHU{sub D} at 100kVp to a low of about 1100 LMHU{sub D} at 300kVp. The standard deviation of each measurement is around 1% of the mean. The entropy covers the range from 3.9 to 4.6. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z{sub eff}, to be near 8.5. LLNL prepared about 50mL of the Fun Dough{trademark} in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. Still, layers can plainly be seen in the reconstructed images, indicating that the bulk density of the material in the container is affected by voids and bubbles. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. 
We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first order statistics;' those of the gradient image, 'second order statistics.'
Statistics of dislocation pinning at localized obstacles
Dutta, A.; Bhattacharya, M.; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at a fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
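The geometric-statistics starting point for counting arguments of this kind is simple: spherical obstacles of radius r and volume density n intersect a randomly placed glide plane with areal density 2rn, because a sphere is cut whenever its center lies within r of the plane. A Monte Carlo sketch of that 2r fraction in a unit box (an illustration of the geometry only, not the paper's framework):

```python
import numpy as np

def plane_cut_fraction(radius, seed=0, n_trials=200_000):
    """Fraction of randomly placed sphere centers (unit box) whose sphere
    of the given radius is cut by the mid-plane z = 0.5.
    Geometry predicts 2*radius for radius << 1."""
    rng = np.random.default_rng(seed)
    z = rng.uniform(0.0, 1.0, n_trials)  # center coordinate along the plane normal
    return float(np.mean(np.abs(z - 0.5) < radius))
```

Multiplying this fraction by the obstacle volume density recovers the 2rn areal density of candidate pinning sites on the glide plane.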
Baseballs and Barrels: World Statistics Day
Broader source: Energy.gov [DOE]
Statistics don’t just help us answer trivia questions – they also help us make intelligent decisions. For example, if I heat my home with natural gas, I’m probably interested in what natural gas prices are likely to be this winter.
Multifragmentation: New dynamics or old statistics?
Moretto, L.G.; Delis, D.N.; Wozniak, G.J.
1993-10-01
The understanding of the fission process as it has developed over the last fifty years has been applied to multifragmentation. Two salient aspects have been discovered: 1) a strong decoupling of the entrance and exit channels, with the formation of well-characterized sources; 2) a statistical competition between two-, three-, four-, five-, ... n-body decays.
Picard, R.R.
1987-01-01
Many aspects of the MUF-D statistic, used for verification of accountability data, have been examined in the safeguards literature. In this paper, basic MUF-D results are extended to more general environments than are usually considered. These environments include arbitrary measurement error structures, various sampling regimes that could be imposed by the inspectorate, and the attributes/variables framework.
Statistical surrogate models for prediction of high-consequence climate change
Office of Scientific and Technical Information (OSTI)
An overview of component qualification using Bayesian statistics and energy methods
Office of Scientific and Technical Information (OSTI)
Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data
Office of Scientific and Technical Information (OSTI)
Masked Areas in Shear Peak Statistics: A Forward Modeling Approach
Office of Scientific and Technical Information (OSTI)
Statistical Behavior of Formation Process of Magnetic Vortex State in Ni80Fe20 Nanodisks
Office of Scientific and Technical Information (OSTI)
Fact #602: December 21, 2009 Freight Statistics by Mode, 2007 Commodity Flow Survey
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Office of Scientific and Technical Information (OSTI)
User Statistics Collection Practices Archives | U.S. DOE Office...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
RITA-Bureau of Transportation Statistics | Open Energy Information
Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique
Office of Scientific and Technical Information (OSTI)
FY 2014 Budget Request Statistical Table | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical and Domain Analytics Applied to PV Module Lifetime and Degradation Science
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
ARM Climate Modeling Best Estimate Lamont, OK Statistical Summary (ARMBE-CLDRAD SGPC1)
Office of Scientific and Technical Information (OSTI)
A statistical perspective of validation and UQ (Conference)
Office of Scientific and Technical Information (OSTI)
Evolution in Cloud Population Statistics of the MJO. From AMIE Field Observations to ... (Technical Report)
Office of Scientific and Technical Information (OSTI)
Distributed Design and Analysis of Computer Experiments
Energy Science and Technology Software Center (OSTI)
2002-11-11
DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an algorithm by Michael McKay to compute variable correlations. DDACE can also be used to carry out a main-effects analysis to calculate the sensitivity of an output variable to each of the varied inputs taken individually.
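The flavor of such a workflow can be sketched in a few lines: stratified (Latin hypercube) sampling over user-specified input ranges, followed by a simple input/output correlation. This is an illustrative Python sketch, not DDACE's C++ API; all names here are hypothetical.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin hypercube sample over box bounds [(lo, hi), ...]: one point
    per equal-probability stratum in each dimension. Illustrative of the
    kind of sampler a design-of-experiments library provides."""
    rng = np.random.default_rng(rng)
    d = len(bounds)
    # Permute the stratum indices independently per dimension, then jitter
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.random((n_samples, d))) / n_samples
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

def input_output_correlation(X, y):
    """Pearson correlation of each input column with the output vector."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    return (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc**2).sum(axis=0)) * np.sqrt((yc**2).sum()))
```

Running an application code at each sampled point and correlating inputs with outputs gives the kind of sensitivity screening described above.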
Statistical approach to nuclear level density
Sen'kov, R. A.; Horoi, M.; Zelevinsky, V. G.
2014-10-15
We discuss the level density in a finite many-body system with strong interaction between the constituents. Our primary object of applications is the atomic nucleus but the same techniques can be applied to other mesoscopic systems. We calculate and compare nuclear level densities for given quantum numbers obtained by different methods, such as nuclear shell model (the most successful microscopic approach), our main instrument - moments method (statistical approach), and Fermi-gas model; the calculation with the moments method can use any shell-model Hamiltonian excluding the spurious states of the center-of-mass motion. Our goal is to investigate statistical properties of nuclear level density, define its phenomenological parameters, and offer an affordable and reliable way of calculation.
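The Fermi-gas model mentioned above has the familiar closed form due to Bethe, rho(E) = (sqrt(pi)/12) a^{-1/4} E^{-5/4} exp(2 sqrt(aE)), with E the excitation energy and a the level-density parameter. A minimal sketch of that formula (back-shift and spin-cutoff refinements omitted):

```python
import numpy as np

def fermi_gas_level_density(E, a):
    """Bethe Fermi-gas level density rho(E); E is excitation energy (MeV),
    a the level-density parameter (1/MeV). Back-shift and spin-cutoff
    refinements are omitted in this sketch."""
    return (np.sqrt(np.pi) / 12.0) * a**-0.25 * E**-1.25 * np.exp(2.0 * np.sqrt(a * E))
```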
Robust statistical reconstruction for charged particle tomography
2013-10-08
Systems and methods for charged particle detection including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data to determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of object volume scattering density using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
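The expectation-maximization step referenced above follows the classic ML/EM multiplicative update. As a toy stand-in for the patent's muon-scattering reconstruction (which uses a multiple-scattering model that this sketch omits), the following implements ML/EM for a generic linear Poisson model y ~ Poisson(Ax):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """Classic ML/EM iteration for a Poisson linear model y ~ Poisson(A x).
    Each update rescales x by the back-projected ratio of measured to
    predicted data; values stay nonnegative by construction."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                      # sensitivity term A^T 1
    for _ in range(n_iter):
        proj = A @ x                          # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

On noiseless, identifiable data the iteration converges to the true solution, as the test below illustrates.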
Mathematical and Statistical Opportunities in Cyber Security
Office of Scientific and Technical Information (OSTI)
Juan Meza, Scott Campbell, David Bailey. Abstract: The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question "What fundamental problems exist within cyber security research that can be helped by advanced ...
FY 2011 Statistical Table by Appropriation
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Columns: FY 2009 Current Approp., FY 2009 Recovery, FY 2010 Current Approp., FY 2011 Congressional Request, $ change, % change. Discretionary Summary by Appropriation - Energy and Water Development, and Related Agencies - Energy Programs: Energy efficiency and renewable energy: 2,156,865; 16,771,907; 2,242,500; 2,355,473; +112,973; +5.0%. Electricity delivery and energy ...
FY 2017 Statistical Table by Organization
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Statistical Table by Organization (dollars in thousands - OMB Scoring), FY 2017 Congressional Budget Justification. Columns: FY 2015 Enacted Approp., FY 2015 Current Approp., FY 2016 Enacted Approp., FY 2017 Congressional Request, $ change, % change. Discretionary Summary by Organization - National Nuclear Security Administration: Weapons Activities: 8,180,359; 8,180,609; 8,846,948; 9,243,147; +396,199; +4.5%. Defense Nuclear ...
Federal offshore statistics: leasing, exploration, production, revenue
Essertier, E.P.
1984-09-01
This publication is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Some of the highlights are: of the 329.5 million acres offered for leasing, 37.1 million acres were actually leased; total revenues for the 1954 to 1983 period were $68,173,112,563 and for 1983 $9,161,435,540; a total of 22,095 wells were drilled in federal waters and 10,145 wells were drilled in state waters; from 1954 through 1983, federal offshore areas produced 6.4 billion barrels of oil and condensate, and 62.1 trillion cubic feet of natural gas; in 1983 alone production was 340.7 million barrels of oil and condensate, and 3.9 trillion cubic feet of gas; and for the second straight year, no oil was lost in 1983 as a result of blowouts in federal waters. 8 figures, 66 tables.
Wang, Hongmei [Department of Radiation Oncology, Nanfang Hospital, Southern Medical University, Guangzhou, Guangdong Province, P.R. of China (China); Liao, Zhongxing, E-mail: zliao@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhuang, Yan; Xu, Ting; Nguyen, Quynh-Nhu; Levy, Lawrence B.; O'Reilly, Michael [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Gold, Kathryn A. [Department of Thoracic Medical Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Gomez, Daniel R. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)
2013-12-01
Purpose: Preclinical studies have suggested that angiotensin-converting enzyme inhibitors (ACEIs) can mitigate radiation-induced lung injury. We sought here to investigate possible associations between ACEI use and the risk of symptomatic radiation pneumonitis (RP) among patients undergoing radiation therapy (RT) for non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively identified patients who received definitive radiation therapy for stages I to III NSCLC between 2004 and 2010 at a single tertiary cancer center. Patients must have received a radiation dose of at least 60 Gy for a single primary lung tumor and have had imaging and dosimetric data available for analysis. RP was quantified according to Common Terminology Criteria for Adverse Events, version 3.0. A Cox proportional hazard model was used to assess potential associations between ACEI use and risk of symptomatic RP. Results: Of 413 patients analyzed, 65 were using ACEIs during RT. In univariate analysis, the rate of RP grade ≥2 seemed lower in ACEI users than in nonusers (34% vs 46%), but this apparent difference was not statistically significant (P=.06). In multivariate analysis of all patients, ACEI use was not associated with the risk of symptomatic RP (hazard ratio [HR] = 0.66; P=.07) after adjustment for sex, smoking status, mean lung dose (MLD), and concurrent carboplatin and paclitaxel chemotherapy. Subgroup analysis showed that ACEI use did have a protective effect from RP grade ≥2 among patients who received a low (≤20-Gy) MLD (P<.01) or were male (P=.04). Conclusions: A trend toward reduction in symptomatic RP among patients taking ACEIs during RT for NSCLC was not statistically significant on univariate or multivariate analyses, although certain subgroups may benefit from use (ie, male patients and those receiving low MLD). The evidence at this point is insufficient to establish whether the use of ACEIs does or does not reduce the risk of RP.
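For reference, the Cox proportional hazards model used in the analysis above relates a covariate vector x (here, ACEI use, sex, smoking status, MLD, and chemotherapy) to the hazard of symptomatic RP through an unspecified baseline hazard h_0(t); the reported hazard ratio of 0.66 for ACEI use corresponds to exp(beta) for that covariate:

```latex
h(t \mid x) = h_0(t)\,\exp\!\left(\beta^{\top} x\right),
\qquad
\mathrm{HR}_j = \exp(\beta_j)
```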
Weatherization Assistance Program - Background Data and Statistics
Eisenberg, Joel Fred
2010-03-01
This technical memorandum is intended to provide readers with information that may be useful in understanding the purposes, performance, and outcomes of the Department of Energy's (DOE's) Weatherization Assistance Program (Weatherization). Weatherization has been in operation for over thirty years and is the nation's largest single residential energy efficiency program. Its primary purpose, established by law, is 'to increase the energy efficiency of dwellings owned or occupied by low-income persons, reduce their total residential energy expenditures, and improve their health and safety, especially low-income persons who are particularly vulnerable such as the elderly, the handicapped, and children.' The American Recovery and Reinvestment Act, PL 111-5 (ARRA), passed and signed into law in February 2009, committed $5 Billion over two years to an expanded Weatherization Assistance Program. This has created substantial interest in the program, the population it serves, the energy and cost savings it produces, and its cost-effectiveness. This memorandum is intended to address the need for this kind of information. Statistically valid answers to many of the questions surrounding Weatherization and its performance require comprehensive evaluation of the program. DOE is undertaking precisely this kind of independent evaluation in order to ascertain program effectiveness and to improve its performance. Results of this evaluation effort will begin to emerge in late 2010 and 2011, but they require substantial time and effort. In the meantime, the data and statistics in this memorandum can provide reasonable and transparent estimates of key program characteristics. The memorandum is laid out in three sections. The first deals with some key characteristics describing low-income energy consumption and expenditures. The second section provides estimates of energy savings and energy bill reductions that the program can reasonably be presumed to be producing.
The third section deals with estimates of program cost-effectiveness and societal impacts such as carbon reduction and reduced national energy consumption. Each of the sections is brief, containing statistics, explanatory graphics and tables as appropriate, and short explanations of the statistics in order to place them in context for the reader. The companion appendices at the back of the memorandum explain the methods and sources used in developing the statistics.
Lightweight and Statistical Techniques for Petascale Debugging
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extend previous work on the Stack Trace Analysis Tool (STAT), that has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased.
We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces, application-specific classification parameters, such as global variables, statistical data acquisition techniques and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories, new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability, and Dyninst binary analysis and instrumentation toolkits.
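The equivalence-class idea described above can be shown in toy form: group ranks by identical stack trace and pick one representative per class as a debug target. This is a minimal Python sketch of the concept, not the STAT/MRNet implementation; all names are hypothetical.

```python
import numpy as np

def debug_targets(stack_traces, seed=0):
    """Group task ranks by identical stack trace and sample one
    representative rank per equivalence class (toy version of the
    scale-reduction strategy described above)."""
    rng = np.random.default_rng(seed)
    classes = {}
    for rank, trace in enumerate(stack_traces):
        classes.setdefault(tuple(trace), []).append(rank)
    # One randomly chosen representative per distinct trace
    return {trace: int(rng.choice(ranks)) for trace, ranks in classes.items()}
```

Attaching a traditional debugger only to the returned representatives reduces a run of many thousands of tasks to a handful of attach targets.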
Cosmology constraints from shear peak statistics in Dark Energy Survey Science Verification data
Office of Scientific and Technical Information (OSTI)
Shear peak statistics has gained a lot of attention recently as a practical alternative to the two-point statistics for constraining cosmological parameters. We perform a shear peak statistics ...
Chapter 11. Community analysis-based methods
Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.
2010-05-01
Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
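The multivariate comparison of community profiles described above is often an ordination step: samples become rows, taxa or fragment abundances become columns, and principal components expose the dominant between-sample structure. A generic PCA-via-SVD sketch (an illustrative stand-in, not the specific methods used in the chapter):

```python
import numpy as np

def pca_scores(X, k=2):
    """Principal-component ordination of community profiles
    (rows = samples, columns = taxa/fragment abundances).
    Returns the first k component scores per sample."""
    Xc = X - X.mean(axis=0)                      # center each column
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * S[:k]                      # sample scores
```

Samples from distinct sources (e.g., different host fecal communities) separate along the leading components when their profiles differ systematically.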
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles
Prohaska, Robert; Duran, Adam; Ragatz, Adam; Kelly, Kenneth
2015-05-03
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy's (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed.
Results are presented in both graphical and tabular formats illustrating a number of key relationships between parameters observed within the data set that relate to medium duty EVs.
Statistical simulation of the magnetorotational dynamo
Squire, J.; Bhattacharjee, A.
2014-08-01
We analyze turbulence and dynamo induced by the magnetorotational instability (MRI) using quasi-linear statistical simulation methods. We find that homogeneous turbulence is unstable to a large scale dynamo instability, which saturates to an inhomogeneous equilibrium with a very strong dependence on the magnetic Prandtl number (Pm). Despite its enormously reduced nonlinearity, the quasi-linear model exhibits the same qualitative scaling of angular momentum transport with Pm as fully nonlinear turbulence. This demonstrates the relationship of recent convergence problems to the large scale dynamo and suggests possible methods for studying astrophysically relevant regimes at very low or high Pm.
EERE Web Site Statistics - Information Center
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Custom View: October 1, 2010 12:00:00 AM - September 30, 2011 11:59:59 PM. Table of Contents: Overview Dashboard; By Number of Visits; Domain Names; Top-Level Domain Types; Countries; Visits Trend; Visits by Number of Pages Viewed; Visit Duration by Visits; Visit Duration by Page Views; Pages; Page Views Trend; File Downloads; Entry Pages; Exit Pages; Single-Page Visits; Paths, Forward; Referring Site
Correlating sampling and intensity statistics in nanoparticle diffraction experiments
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Öztürk, Hande; Yan, Hanfei; Hill, John P.; Noyan, I. Cevdet
2015-07-28
It is shown in a previous article [Öztürk, Yan, Hill & Noyan (2014). J. Appl. Cryst. 47, 1016–1025] that the sampling statistics of diffracting particle populations within a polycrystalline ensemble depended on the size of the constituent crystallites: broad X-ray peak breadths enabled some nano-sized particles to contribute more than one diffraction spot to Debye–Scherrer rings. Here it is shown that the equations proposed by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] (AKK) to link diffracting particle and diffracted intensity statistics are not applicable if the constituent crystallites of the powder are below 10 nm. In this size range, (i) the one-to-one correspondence between diffracting particles and Laue spots assumed in the AKK analysis is not satisfied, and (ii) the crystallographic correlation between Laue spots originating from the same grain invalidates the assumption that all diffracting plane normals are randomly oriented and uncorrelated. Such correlation produces unexpected results in the selection of diffracting grains. For example, three or more Laue spots from a given grain for a particular reflection can only be observed at certain wavelengths. In addition, correcting the diffracted intensity values by the traditional Lorentz term, 1/cos θ, to compensate for the variation of particles sampled within a reflection band does not maintain fidelity to the number of poles contributing to the diffracted signal. A new term, cos θ_B/cos θ, corrects this problem.
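The difference between the two intensity-correction terms can be compared numerically. The following sketch is illustrative only: the function names and angles are invented for this example and are not from the paper.

```python
import math

def lorentz_traditional(theta):
    """Traditional Lorentz sampling term, 1/cos(theta)."""
    return 1.0 / math.cos(theta)

def lorentz_corrected(theta, theta_b):
    """Corrected term cos(theta_B)/cos(theta), proposed for powders with
    crystallites below ~10 nm, where theta_B is the Bragg angle."""
    return math.cos(theta_b) / math.cos(theta)

# At the exact Bragg angle the corrected term reduces to 1,
# while away from it the two terms differ.
theta, theta_b = 0.30, 0.25  # radians, arbitrary illustrative values
print(lorentz_traditional(theta), lorentz_corrected(theta, theta_b))
```

At theta == theta_b the corrected term is exactly 1, so the correction only departs from unity across the breadth of the reflection band.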
International energy indicators. [Statistical tables and graphs
Bauer, E.K.
1980-05-01
International statistical tables and graphs are given for the following: (1) Iran - Crude Oil Capacity, Production and Shut-in, June 1974-April 1980; (2) Saudi Arabia - Crude Oil Capacity, Production, and Shut-in, March 1974-April 1980; (3) OPEC (Ex-Iran and Saudi Arabia) - Capacity, Production and Shut-in, June 1974-March 1980; (4) Non-OPEC Free World and US Production of Crude Oil, January 1973-February 1980; (5) Oil Stocks - Free World, US, Japan, and Europe (Landed, 1973-1st Quarter, 1980); (6) Petroleum Consumption by Industrial Countries, January 1973-December 1979; (7) USSR Crude Oil Production and Exports, January 1974-April 1980; and (8) Free World and US Nuclear Generation Capacity, January 1973-March 1980. Similar statistical tables and graphs for the United States include: (1) Imports of Crude Oil and Products, January 1973-April 1980; (2) Landed Cost of Saudi Oil in Current and 1974 Dollars, April 1974-January 1980; (3) US Trade in Coal, January 1973-March 1980; (4) Summary of US Merchandise Trade, 1976-March 1980; and (5) US Energy/GNP Ratio, 1947 to 1979.
International petroleum statistics report, July 1999
1999-07-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 44 tabs.
International petroleum statistics report, March 1998
1998-03-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
Lectures on probability and statistics. Revision
Yost, G.P.
1985-06-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried "best" solution - "best" according to every criterion.
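The a priori dice calculation the lectures open with can be made concrete. This brief sketch (written for this listing, not taken from the lectures) enumerates the equally likely outcomes of fair dice to compute the probability of a specified result:

```python
from itertools import product
from fractions import Fraction

def prob_sum(target, n_dice=2, sides=6):
    """A priori probability that n_dice fair dice sum to target,
    computed by enumerating all equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    hits = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(hits, len(outcomes))

print(prob_sum(7))  # two fair dice sum to 7 in 6 of 36 outcomes: 1/6
```

The inverse (statistical) problem the lectures close with runs the other way: given observed rolls, infer whether the dice are in fact fair, for which no single "best" procedure exists.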
Statistics and geometry of cosmic voids
Gaite, José
2009-11-01
We introduce new statistical methods for the study of cosmic voids, focusing on the statistics of the largest voids. We distinguish three different types of distributions of voids, namely, Poisson-like, lognormal-like and Pareto-like distributions. The last two distributions are connected with two types of fractal geometry of the matter distribution. Scaling voids with Pareto distribution appear in fractal distributions with box-counting dimension smaller than three (its maximum value), whereas the lognormal void distribution corresponds to multifractals with box-counting dimension equal to three. Moreover, voids of the former type persist in the continuum limit, namely, as the number density of observable objects grows, giving rise to lacunar fractals, whereas voids of the latter type disappear in the continuum limit, giving rise to non-lacunar (multi)fractals. We propose both lacunar and non-lacunar multifractal models of the cosmic web structure of the Universe. A non-lacunar multifractal model is supported by current galaxy surveys as well as cosmological N-body simulations. This model suggests, in particular, that small dark matter halos and, arguably, faint galaxies are present in cosmic voids.
Statistical Methods and Tools for Hanford Staged Feed Tank Sampling
Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.
2013-10-01
This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).
International Petroleum Statistics Report, January 1994
Not Available
1994-01-31
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, November 1993
Not Available
1993-11-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
Statistical correlations in the Moshinsky atom
Laguna, H. G.; Sagar, R. P.
2011-07-15
We study the influence of the interparticle and confining potentials on statistical correlation via the correlation coefficient and mutual information in ground and some excited states of the Moshinsky atom in position and momentum space. The magnitude of the correlation between positions and between momenta is equal in the ground state. In excited states, the correlation between the momenta of the particles is greater than between their positions when they interact through an attractive potential whereas for repulsive interparticle potentials the opposite is true. Shannon entropies, and their sums (entropic formulations of the uncertainty principle), are also analyzed, showing that the one-particle entropy sum is dependent on the interparticle potential and thus able to detect the correlation between particles.
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes the software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
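The patent abstract describes the mechanism only at a high level. As a loose, simplified sketch of the timing-baseline idea (the function, threshold rule, and numbers below are hypothetical illustrations, not the actual patented system), one might compare observed execution times against a statistical baseline collected on a known-clean machine:

```python
import statistics

def is_anomalous(baseline_times, observed_times, k=3.0):
    """Flag the observed run if its mean execution time deviates from the
    clean-machine baseline mean by more than k baseline standard deviations.
    A simplified, hypothetical stand-in for the statistical comparison
    described in the abstract."""
    mu = statistics.mean(baseline_times)
    sigma = statistics.stdev(baseline_times)
    return abs(statistics.mean(observed_times) - mu) > k * sigma

# Hypothetical timings (seconds) for the same instrumented function:
baseline = [1.00, 1.01, 0.99, 1.02, 0.98]   # known-pedigree machine
print(is_anomalous(baseline, [1.00, 1.01])) # consistent with baseline
print(is_anomalous(baseline, [1.50, 1.60])) # suspiciously slow
```

In practice such a scheme would need many instrumented functions and a careful noise model, since load and hardware variation also shift execution times.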
International petroleum statistics report, October 1997
1997-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International Petroleum Statistics Report, July 1994
Not Available
1994-07-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993. Data for the United States are developed by the Energy Information Administration's (EIA) Office of Oil and Gas. Data for other countries are derived largely from published sources, including International Energy Agency publications, the EIA International Energy Annual, and the trade press. (See sources after each section.) All data are reviewed by the International Statistics Branch of EIA. All data have been converted to units of measurement familiar to the American public. Definitions of oil production and consumption are consistent with other EIA publications.
Microsoft PowerPoint - Belianinov_2015_StaffScienceHighlight...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
channel capacity limit. The use of multivariate statistical methods such as Principal Component Analysis, allows one to guide a data sampling strategy and attain insight into...
Natural Contamination from the Mancos Shale | Department of Energy
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Application of Environmental Isotopes to the Evaluation of the Origin of Contamination in a Desert Arroyo: Many Devils Wash, Shiprock, New Mexico Multivariate Statistical Analysis ...
Geology and Groundwater Investigation Many Devils Wash, Shiprock...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
in a Desert Arroyo: Many Devils Wash, Shiprock, New Mexico Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils ...
EU Pocketbook - European Vehicle Market Statistics | Open Energy...
- European Vehicle Market Statistics AgencyCompany Organization: International Council on Clean Transportation Website: eupocketbook.theicct.org Transport Toolkit...
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.; Chilton, Lawrence
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Bonne, François; Bonnay, Patrick; Bradu, Benjamin
2014-01-29
In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast disturbance rejection, such as that induced by a turbine or a compressor stop, a key aspect in the case of large scale cryogenic refrigeration. The proposed control scheme can be used to have precise control of every pressure in normal operation or to stabilize and control the cryoplant under high variation of thermal loads, such as the pulsed heat loads expected in the cryogenic cooling systems of future fusion reactors like the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced fusion experiment (JT-60SA). The paper details how to set the WCS model up to synthesize the Linear Quadratic Optimal feedback gain and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller has been implemented on a Schneider PLC and fully tested first on CERN's real-time simulator. Then, it was experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of start and stop of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
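The abstract names a Linear Quadratic Optimal feedback gain but does not give the synthesis. As a hedged, minimal illustration of how such a gain is computed in the scalar discrete-time case (the plant numbers here are invented and have nothing to do with actual WCS dynamics), one can iterate the Riccati recursion to a fixed point:

```python
def scalar_lqr_gain(a, b, q, r, iters=500):
    """LQR gain for the scalar plant x[t+1] = a*x[t] + b*u[t] with cost
    sum(q*x^2 + r*u^2), found by iterating the discrete Riccati recursion.
    The optimal control is then u = -K*x."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

# Hypothetical unstable plant (a > 1); the LQR gain stabilizes it.
a, b, q, r = 1.2, 1.0, 1.0, 1.0
K = scalar_lqr_gain(a, b, q, r)
print(K, abs(a - b * K))  # closed-loop pole magnitude |a - b*K| < 1
```

The multivariable case used for the WCS replaces the scalars with matrices and the division with a matrix inverse, but the fixed-point structure is the same.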
A structural analysis of natural gas consumption by income class from 1987 to 1993
Poyer, D.A.
1996-12-01
This study had two major objectives: (1) assess and compare changes in natural gas consumption between 1987 and 1993 by income group and (2) assess the potential influence of energy policy on observed changes in natural gas consumption over time and across income groups. This analysis used U.S. Department of Energy (DOE) data files and involved both the generation of simple descriptive statistics and the use of multivariate regression analysis. The consumption of natural gas by the groups was studied over a six-year period. The results showed that: (1) natural gas use was substantially higher for the highest income group than for the two lower income groups and (2) natural gas consumption declined for the lowest and middle income quintiles and increased for the highest income quintile between 1987 and 1990; between 1990 and 1993, consumption increased for the lowest and middle income quintile, but remained relatively constant for the highest income quintile. The relative importance of the structural and variable factors in explaining consumption changes between survey periods varies by income group. The analysis provides two major energy policy implications: (1) natural gas intensity has been the highest for the lowest income group, indicating that this group is more vulnerable to sudden changes in demand-indicator variables, in particular weather-related variables, that increase natural gas consumption, and (2) the fall in natural gas intensity between 1987 and 1993 may indicate that energy policy has had some impact on reducing natural gas consumption. 11 refs., 4 figs., 16 tabs.
Random paths and current fluctuations in nonequilibrium statistical mechanics
Gaspard, Pierre
2014-07-15
An overview is given of recent advances in nonequilibrium statistical mechanics about the statistics of random paths and current fluctuations. Although statistics is carried out in space for equilibrium statistical mechanics, statistics is considered in time or spacetime for nonequilibrium systems. In this approach, relationships have been established between nonequilibrium properties such as the transport coefficients, the thermodynamic entropy production, or the affinities, and quantities characterizing the microscopic Hamiltonian dynamics and the chaos or fluctuations it may generate. This overview presents results for classical systems in the escape-rate formalism, stochastic processes, and open quantum systems.
International petroleum statistics report, June 1999
1999-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 46 tabs.
International petroleum statistics report, February 1996
1996-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, July 1998
1998-07-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, December 1998
1998-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, March 1999
1999-03-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years and annually for the three years prior to that. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, March 1994
1994-03-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, February 1998
1998-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 48 tabs.
International petroleum statistics report, April 1999
1999-05-04
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, February 1997
1997-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, January 1999
1999-01-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, August 1995
1995-08-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report, August 1998
1998-08-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, June 1998
1998-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, August 1994
Not Available
1994-08-26
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, September 1998
1998-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, March 1995
1995-03-30
The International Petroleum Statistics Report presents data for March 1995 on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, November 1994
Not Available
1994-11-25
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, April 1998
1998-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, December 1997
1997-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. The balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 4 figs., 46 tabs.
International petroleum statistics report, November 1998
1998-11-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, October 1998
1998-10-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 46 tabs.
International petroleum statistics report, June 1997
1997-06-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996. 46 tabs.
International petroleum statistics report, May 1998
1998-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. It presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997. 4 figs., 48 tabs.
International petroleum statistics report, September 1995
1995-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994. 4 figs., 45 tabs.
International petroleum statistics report, September 1994
Not Available
1994-09-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1983 through 1993.
International petroleum statistics report, May 1995
1995-05-30
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1983 through 1993.
International petroleum statistics report, October 1993
Not Available
1993-10-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, December 1993
Not Available
1993-12-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992. 41 tabs.
International petroleum statistics report, April 1994
Not Available
1994-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1993; OECD stocks from 1973 through 1993; and OECD trade from 1982 through 1992. 41 tables.
International petroleum statistics report, February 1994
Not Available
1994-02-28
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1992; OECD stocks from 1973 through 1992; and OECD trade from 1982 through 1992.
International petroleum statistics report, September 1996
1996-09-27
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report, April 1997
1997-04-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995. 4 figs., 47 tabs.
International petroleum statistics report, May 1999
1999-05-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1990, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1998; OECD stocks from 1973 through 1998; and OECD trade from 1988 through 1998. 4 figs., 48 tabs.
International petroleum statistics report, February 1999
1999-02-01
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1997; OECD stocks from 1973 through 1997; and OECD trade from 1987 through 1997.
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, while at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research, all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
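The segmentation the abstract describes can be sketched as a dynamic program: score every candidate final block with a Poisson fitness minus a per-block prior penalty, and keep the best partition ending at each cell. The function name, the `ncp_prior` penalty value, and the binned-count fitness below follow the general recipe the paper describes, but this is an illustrative assumption, not the authors' released implementation.

```python
import numpy as np

def bayesian_blocks(counts, widths, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of binned count data,
    in the dynamic-programming style of Bayesian Blocks.
    Returns the sorted start indices of the optimal blocks."""
    n = len(counts)
    # Cumulative sums make every block total an O(1) lookup.
    csum = np.concatenate(([0.0], np.cumsum(counts)))
    wsum = np.concatenate(([0.0], np.cumsum(widths)))
    best = np.zeros(n)              # best[j]: fitness of optimal partition of cells 0..j
    last = np.zeros(n, dtype=int)   # last[j]: start index of the final block in it
    for j in range(n):
        # Poisson log-likelihood fitness of a block spanning cells i..j,
        # evaluated for every candidate start i = 0..j at once.
        N = csum[j + 1] - csum[:j + 1]      # counts in block
        T = wsum[j + 1] - wsum[:j + 1]      # length of block
        fit = N * (np.log(np.maximum(N, 1.0)) - np.log(T)) - ncp_prior
        total = fit + np.concatenate(([0.0], best[:j]))
        last[j] = int(np.argmax(total))
        best[j] = total[last[j]]
    # Backtrack through last[] to recover the change points.
    cps, j = [], n - 1
    while True:
        cps.append(last[j])
        if last[j] == 0:
            break
        j = last[j] - 1
    return sorted(cps)
```

On a toy series whose rate jumps from 1 to 10 halfway through, `bayesian_blocks([1]*4 + [10]*4, [1.0]*8)` recovers the change point at index 4; raising `ncp_prior` suppresses weaker, spurious blocks, which is how the penalty trades detection efficiency against false positives.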
Spatial Statistical Procedures to Validate Input Data in Energy Models
Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.
2006-01-01
Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.
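As an illustration of one technique named above (predicting missing data at unobserved locations), here is a hedged sketch using inverse-distance weighting. It is a simple stand-in for the geostatistical models (e.g., kriging) the paper has in mind; the function and parameter names are ours:

```python
import numpy as np

def idw_predict(xy_obs, z_obs, xy_new, power=2.0, eps=1e-12):
    """Inverse-distance-weighted prediction at unobserved sites xy_new."""
    xy_obs = np.asarray(xy_obs, float)
    z_obs = np.asarray(z_obs, float)
    out = np.empty(len(xy_new))
    for i, p in enumerate(np.asarray(xy_new, float)):
        d = np.hypot(*(xy_obs - p).T)      # distances to the observed sites
        if d.min() < eps:                  # exactly at an observed site
            out[i] = z_obs[d.argmin()]
            continue
        w = 1.0 / d ** power               # closer observations weigh more
        out[i] = np.sum(w * z_obs) / np.sum(w)
    return out
```

The same weighted-average machinery underlies simple forms of spatial aggregation and disaggregation; production energy models would normally use model-based variants with uncertainty estimates.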
Synchrotron IR microspectroscopy for protein structure analysis: Potential and questions
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Yu, Peiqiang
2006-01-01
Synchrotron radiation-based Fourier transform infrared microspectroscopy (S-FTIR) has been developed as a rapid, direct, non-destructive, bioanalytical technique. This technique takes advantage of synchrotron light brightness and small effective source size and is capable of exploring the molecular chemical make-up within microstructures of a biological tissue, without destruction of inherent structures, at ultra-spatial resolutions within cellular dimension. To date there has been very little application of this advanced technique to the study of pure protein inherent structure at a cellular level in biological tissues. In this review, a novel approach is introduced to show the potential of the newly developed, advanced synchrotron-based analytical technology, which can be used to localize relatively "pure" protein in plant tissues and reveal protein inherent structure and protein molecular chemical make-up within intact tissue at cellular and subcellular levels. Several complex protein IR spectral data analysis techniques (Gaussian and Lorentzian multi-component peak modeling, univariate and multivariate analysis, principal component analysis (PCA), and hierarchical cluster analysis (CLA)) are employed to reveal relative features of protein inherent structure and to distinguish protein inherent structure differences between varieties/species and treatments in plant tissues. By using a multi-peak modeling procedure, relative estimates (but not exact determinations) of protein secondary structure can be made for comparison purposes. The arguments for and against the multi-peak modeling/fitting procedure for relative estimation of protein structure are discussed. By using the PCA and CLA analyses, plant molecular structures can be qualitatively separated, one group from another, statistically, even though the spectral assignments are not known.
The synchrotron-based technology provides a new approach for protein structure research in biological tissues at ultra-spatial resolutions.
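The multi-component peak-modeling step described above can be sketched numerically: fit a sum of Gaussian bands to a spectrum and compare the components' relative areas. The synthetic band positions and amplitudes below are invented for illustration (loosely evoking amide I sub-bands), not taken from the review:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    """Sum of two Gaussian components (amplitude, center, width)."""
    return (a1 * np.exp(-0.5 * ((x - c1) / w1) ** 2)
            + a2 * np.exp(-0.5 * ((x - c2) / w2) ** 2))

# synthetic spectrum with two overlapping bands (positions invented)
x = np.linspace(1600.0, 1700.0, 400)
true = (1.0, 1655.0, 8.0, 0.6, 1630.0, 7.0)
rng = np.random.default_rng(1)
y = two_gaussians(x, *true) + rng.normal(0.0, 0.01, x.size)

# fit from rough initial guesses, as in multi-peak modeling
popt, _ = curve_fit(two_gaussians, x, y, p0=(0.8, 1650.0, 10.0, 0.5, 1635.0, 10.0))
# relative areas (amplitude * width): usable only for comparison, not absolutes
areas = np.array([popt[0] * popt[2], popt[3] * popt[5]])
frac = areas / areas.sum()
```

Note that, exactly as the review cautions, only the relative fractions `frac` are meaningful for comparing spectra; the decomposition itself is not unique.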
Statistical theory of turbulent incompressible multimaterial flow
Kashiwa, B.
1987-10-01
Interpenetrating motion of incompressible materials is considered. ''Turbulence'' is defined as any deviation from the mean motion. Accordingly a nominally stationary fluid will exhibit turbulent fluctuations due to a single, slowly moving sphere. Mean conservation equations for interpenetrating materials in arbitrary proportions are derived using an ensemble averaging procedure, beginning with the exact equations of motion. The result is a set of conservation equations for the mean mass, momentum and fluctuational kinetic energy of each material. The equation system is at first unclosed due to integral terms involving unknown one-point and two-point probability distribution functions. In the mean momentum equation, the unclosed terms are clearly identified as representing two physical processes. One is transport of momentum by multimaterial Reynolds stresses, and the other is momentum exchange due to pressure fluctuations and viscous stress at material interfaces. Closure is approached by combining careful examination of multipoint statistical correlations with the traditional physical technique of kappa-epsilon modeling for single-material turbulence. This involves representing the multimaterial Reynolds stress for each material as a turbulent viscosity times the rate of strain based on the mean velocity of that material. The multimaterial turbulent viscosity is related to the fluctuational kinetic energy kappa, and the rate of fluctuational energy dissipation epsilon, for each material. Hence a set of kappa and epsilon equations must be solved, together with mean mass and momentum conservation equations, for each material. Both kappa and the turbulent viscosities enter into the momentum exchange force. The theory is applied to (a) calculation of the drag force on a sphere fixed in a uniform flow, (b) calculation of the settling rate in a suspension and (c) calculation of velocity profiles in the pneumatic transport of solid particles in a pipe.
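For orientation, the per-material closure sketched above mirrors the standard single-material kappa-epsilon relations; in conventional notation (the constant value quoted is the usual single-phase one, not a number from this report):

```latex
\nu_t^{(m)} \;=\; C_\mu \,\frac{\kappa_m^{2}}{\epsilon_m}, \qquad C_\mu \approx 0.09,
\qquad
\tau_{ij}^{(m)} \;\approx\; \nu_t^{(m)}\left(\partial_j \bar{u}_i^{(m)} + \partial_i \bar{u}_j^{(m)}\right),
```

where \(\kappa_m\) and \(\epsilon_m\) are the fluctuational kinetic energy and its dissipation rate for material \(m\), and \(\bar{u}^{(m)}\) is that material's mean velocity.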
A Statistical Perspective on Highly Accelerated Testing.
Thomas, Edward V.
2015-02-01
Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure.
The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the assumed relationship between the stress level and performance. In addition, this document contains recommendations for conducting more informative accelerated tests.
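The quantitative "reliability demonstration" claim discussed above is usually based on the success-run (zero-failure) binomial bound. A small sketch makes the arithmetic, and the hidden assumption, explicit: the bound is only meaningful if the tested stress condition is representative of use, which is precisely what the report questions.

```python
def demonstrated_reliability(n_units, confidence):
    """Lower confidence bound on reliability after n_units tested with zero failures.

    Success-run theorem: R_low = (1 - C) ** (1 / n).
    Assumes the test condition is representative of normal use.
    """
    return (1.0 - confidence) ** (1.0 / n_units)

# e.g., 22 units with no failures "demonstrate" R >= 0.90 at 90% confidence
r = demonstrated_reliability(22, 0.90)
```

Note how weakly the bound tightens with sample size; uncertainty in the stress-to-use extrapolation, the report's central concern, is not captured at all by this formula.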
Robustness analysis of an air heating plant and control law by using polynomial chaos
Colón, Diego; Ferreira, Murillo A. S.; Bueno, Átila M.; Balthazar, José M.; Rosa, Suélia S. R. F. de
2014-12-10
This paper presents a robustness analysis of an air heating plant with a multivariable closed-loop control law by using the polynomial chaos methodology (MPC). The plant consists of a PVC tube with a fan in the air input (that forces the air through the tube) and a mass flux sensor in the output. A heating resistance warms the air as it flows inside the tube, and a thermo-couple sensor measures the air temperature. The plant thus has two inputs (the fan's rotation intensity and the heat generated by the resistance, both measured in percent of the maximum value) and two outputs (air temperature and air mass flux, also in percent of the maximum value). The mathematical model is obtained by system identification techniques. The mass flux sensor, which is nonlinear, is linearized, and the delays in the transfer functions are properly approximated by non-minimum-phase transfer functions. The resulting model is transformed to a state-space model, which is used for control design purposes. The multivariable robust control design technique used is LQG/LTR, and the controllers are validated in simulation software and in the real plant. Finally, the MPC is applied by considering some of the system's parameters as random variables (one at a time), and the system's stochastic differential equations are solved by expanding the solution (a stochastic process) in an orthogonal basis of polynomial functions of the basic random variables. This method transforms the stochastic equations into a set of deterministic differential equations, which can be solved by traditional numerical methods (this is the MPC). Statistical data for the system (such as expected values and variances) are then calculated. The effects of randomness in the parameters are evaluated in the open-loop and closed-loop pole positions.
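The expansion step described above can be sketched in one dimension: a non-intrusive polynomial chaos computation of the mean and variance of a response depending on one standard normal parameter, using probabilists' Hermite polynomials and Gauss-Hermite quadrature. This illustrates the orthogonal-basis machinery only; it is not the paper's plant model, and all names are ours.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_moments(f, order=6, quad_pts=20):
    """Mean and variance of f(xi), xi ~ N(0,1), via a Hermite chaos expansion."""
    x, w = He.hermegauss(quad_pts)          # nodes/weights for weight exp(-x^2/2)
    w = w / math.sqrt(2.0 * math.pi)        # normalize to the Gaussian measure
    fx = f(x)
    coef = []
    for k in range(order + 1):
        ek = np.zeros(k + 1)
        ek[k] = 1.0                          # coefficient vector selecting He_k
        hek = He.hermeval(x, ek)
        # c_k = E[f(xi) He_k(xi)] / k!   (since E[He_k^2] = k!)
        coef.append(np.sum(w * fx * hek) / math.factorial(k))
    mean = coef[0]
    var = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, order + 1))
    return mean, var
```

For the test case f(xi) = xi^2, the exact moments are mean 1 and variance 2, which the truncated expansion reproduces because the integrand is polynomial.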
Energy-Efficient and Comfortable Buildings through Multivariate Integrated Control (ECoMIC)
Birru, Dagnachew; Wen, Yao-Jung; Rubinstein, Francis M.; Clear, Robert D.
2013-10-28
This project aims to develop an integrated control solution for enhanced energy efficiency and user comfort in commercial buildings. The developed technology is a zone-based control framework that minimizes energy usage while maintaining occupants’ visual and thermal comfort through control of electric lights, motorized venetian blinds and thermostats. The control framework is designed following a modular, scalable and flexible architecture to facilitate easy integration with existing building management systems. The control framework contains two key algorithms: 1) the lighting load balancing algorithm and 2) the thermostat control algorithm. The lighting load balancing algorithm adopts a model-based closed-loop control approach to determine the optimal electric light and venetian blind settings. It is formulated into an optimization problem with minimizing lighting-related energy consumption as the objective and delivering adequate task light and preventing daylight glare as the constraints. The thermostat control algorithm is based on a well-established thermal comfort model and formulated as a root-finding problem to dynamically determine the optimal thermostat setpoint for both energy savings and improved thermal comfort. To address building-wide scalability, a system architecture was developed for the zone-based control technology. Three levels of services are defined in the architecture: external services, facility level services and zone level services. The zone-level service includes the control algorithms described above as well as the corresponding interfaces, profiles, sensors and actuators to realize the zone controller. The facility level services connect to the zones through a backbone network, handle supervisory level information and controls, and thus facilitate building-wide scalability. The external services provide communication capability to entities outside of the building for grid interaction and remote access.
Various aspects of the developed control technology were evaluated and verified through both simulations and testbed implementations. Simulations coupling a DOE medium office reference building in EnergyPlus building simulation software and a prototype controller in Matlab were performed. During summer time in a mixed-humid climate zone, the simulations revealed reductions of 27% and 42% in electric lighting load and cooling load, respectively, when compared to an advanced base case with daylight dimming and blinds automatically tilted to block direct sun. Two single-room testbeds were established. The testbed at Philips Lighting business building (Rosemont, IL) was designed for quantifying energy performance of integrated controls. This particular implementation achieved 40% and 79% savings on lighting and HVAC energy, respectively, compared to a relatively simple base case operated on predefined schedules. While the resulting energy savings were very encouraging, several caveats should be noted. 1) The test was run during late spring and early summer, and the savings numbers should not be directly used to extrapolate the annual energy savings. 2) Due to the need for separate control and metering of the small-scale demonstrator within a large building, the HVAC system, hence the corresponding savings, did not represent a typical energy code-compliant design. 3) The light level in the control case was regulated at a particular setpoint, which was lower than the full-on light level in the base case; the savings resulting from tuning the light level down to the setpoint were not attributable to the developed technology. The testbed at the Lawrence Berkeley National Laboratory (Berkeley, CA) specifically focused on glare control integration, and has demonstrated the feasibility and capability of the glare detection and prevention technique.
The short one-month test in this testbed provided a functional indication of the developed technology, but it would require at least a full solstice-to-solstice cycle to rigorously quantify the performance, which was not possible within the project timeframe. There are certain limitations inherent in the operational assumptions, which could potentially affect the effectiveness and applicability of the developed control technologies. The system takes a typical ceiling-mounting approach for the photosensor locations, and therefore the control performance relies on proper commissioning or the built-in intelligence of the photosensor for pertinent task light level estimations. For spaces where daylight penetration diminishes significantly with depth into the zone, certain modifications to the control algorithms are required to accommodate multiple lighting control subzones and the corresponding sensors for providing a more uniform light level across the entire zone. Integrated control of visual and thermal comfort requires the lighting control zone and the thermal control zone to coincide with each other. In other words, the area illuminated by a lighting circuit needs to be the same area served by the thermostat. Thus, the original zoning will potentially constrain the applicability of this technology in retrofit projects. This project demonstrated the technical feasibility of a zone-based integrated control technology. From the simulation results and testbed implementations, up to 60% lighting energy savings in daylit areas relative to a “no-controls” case can easily be achieved. A 20% reduction of whole-building energy consumption is also attainable. In the aspect of occupant comfort, the testbed demonstrated the ability to maintain the specified light level on the workplane while promptly mitigating daylight glare 90% of the time. The control system also managed to maintain the thermal environment at a comfortable level 90% of the time.
System scalability was ensured by the system architecture design, based on which the testbeds were instantiated. Economic analysis yielded a payback time of about six years for a medium-sized building, including the installation of all hardware and software, such as motorized blinds and LED luminaires. The payback time can be significantly reduced if part of the hardware is already in place, as in retrofit projects. It should be noted that since the payback analysis was partly based on the testbed performance results, it is constrained by the caveats associated with the testbed implementations. The main uncertainty lies in the contribution from the space-conditioning energy savings, as it was non-trivial to realistically configure a room-size HVAC system for directly extrapolating whole-building HVAC energy savings. It is recommended to further evaluate the developed technology at a larger scale, where the lighting and HVAC energy consumption can be realistically measured at the building level, to more rigorously quantify the performance potential.
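The thermostat algorithm above is described only at the level of "root-finding on a comfort model." As a sketch of that formulation, the code below solves for the setpoint at which a simplified linear comfort index crosses zero by bisection; the PMV-style coefficients are invented placeholders, not the project's model.

```python
def comfort_index(t_set, t_out=30.0):
    """Invented linear stand-in for a PMV-style thermal comfort model.

    Returns ~0 when neutral, > 0 when too warm, < 0 when too cool.
    Coefficients are illustrative placeholders only.
    """
    return 0.3 * (t_set - 24.0) + 0.02 * (t_out - 25.0)

def solve_setpoint(f, lo=18.0, hi=30.0, tol=1e-6):
    """Bisection root-finder: the setpoint where the comfort index is zero."""
    flo = f(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (f(mid) > 0) == (flo > 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return 0.5 * (lo + hi)

setpoint = solve_setpoint(comfort_index)
```

In the real system the comfort model would depend on measured zone conditions, and the root would be recomputed as those measurements change.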
A Statistical Framework for Microbial Source Attribution
Velsko, S P; Allen, J E; Cunningham, C T
2009-04-28
This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them possess certain compelling general properties in the context of microbial forensics. 
These include the ability to quantify the significance of a sequence 'match' or 'mismatch' between two isolates; the ability to capture non-intuitive effects of network structure on inferential power, including the 'small world' effect; the insensitivity of inferences to uncertainties in the underlying distributions; and the concept of rescaling, i.e., the ability to collapse sub-networks into single nodes and examine transmission inferences on the rescaled network.
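The second distribution described above (the probability that two randomly sampled nodes are separated by M transmission events) can be tabulated directly when an outbreak graph is known. A minimal sketch using breadth-first search on a toy transmission tree (the tree itself is invented for illustration):

```python
from collections import deque, Counter
from itertools import combinations

def bfs_distances(adj, start):
    """Hop counts from start to every reachable node in an undirected graph."""
    dist = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def separation_distribution(adj):
    """Empirical distribution of pairwise separations M over all node pairs."""
    counts = Counter()
    for a, b in combinations(adj, 2):
        counts[bfs_distances(adj, a)[b]] += 1
    total = sum(counts.values())
    return {m: c / total for m, c in sorted(counts.items())}

# toy transmission tree: node 0 infected 1 and 2; 1 infected 3; 2 infected 4
adj = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
p_m = separation_distribution(adj)
```

In practice this distribution would be estimated from contact-tracing reconstructions of past outbreaks, as the report describes, rather than from a fully known tree.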
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
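The simplest of the interval statistics discussed in the report, bounds on the sample mean, can be computed directly (unlike, e.g., variance bounds, whose computability the report shows depends on the overlap structure of the intervals). A hedged sketch with invented measurements:

```python
def interval_mean(data):
    """Bounds on the sample mean of interval-valued data [(lo, hi), ...].

    The mean is monotone in each endpoint, so its envelope is attained at
    the all-lower and all-upper endpoint configurations.
    """
    n = len(data)
    lows = sum(lo for lo, hi in data)
    highs = sum(hi for lo, hi in data)
    return lows / n, highs / n

# illustrative interval measurements (invented values)
measurements = [(9.8, 10.2), (10.1, 10.5), (9.7, 10.3)]
mean_lo, mean_hi = interval_mean(measurements)
```

The width of the resulting interval [mean_lo, mean_hi] exposes the precision-versus-sample-size tradeoff the report analyzes: more samples narrow sampling uncertainty but cannot shrink the epistemic width contributed by each interval.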
Nuclear Forensic Inferences Using Iterative Multidimensional Statistics
Robel, M; Kristo, M J; Heller, M A
2009-06-09
Nuclear forensics involves the analysis of interdicted nuclear material for specific material characteristics (referred to as 'signatures') that imply specific geographical locations, production processes, culprit intentions, etc. Predictive signatures rely on expert knowledge of physics, chemistry, and engineering to develop inferences from these material characteristics. Comparative signatures, on the other hand, rely on comparison of the material characteristics of the interdicted sample (the 'questioned sample' in FBI parlance) with those of a set of known samples. In the ideal case, the set of known samples would be a comprehensive nuclear forensics database, a database which does not currently exist. In fact, our ability to analyze interdicted samples and produce an extensive list of precise materials characteristics far exceeds our ability to interpret the results. Therefore, as we seek to develop the extensive databases necessary for nuclear forensics, we must also develop the methods necessary to produce the necessary inferences from comparison of our analytical results with these large, multidimensional sets of data. In the work reported here, we used a large, multidimensional dataset of results from quality control analyses of uranium ore concentrate (UOC, sometimes called 'yellowcake'). We have found that traditional multidimensional techniques, such as principal components analysis (PCA), are especially useful for understanding such datasets and drawing relevant conclusions. In particular, we have developed an iterative partial least squares-discriminant analysis (PLS-DA) procedure that has proven especially adept at identifying the production location of unknown UOC samples. By removing classes which fell far outside the initial decision boundary, and then rebuilding the PLS-DA model, we have consistently produced better and more definitive attributions than with a single pass classification approach. 
Performance of the iterative PLS-DA method compared favorably to that of classification and regression tree (CART) and k nearest neighbor (KNN) algorithms, with the best combination of accuracy and robustness, as tested by classifying samples measured independently in our laboratories against the vendor QC based reference set.
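The iterative pruning idea above (drop classes far outside the decision boundary, rebuild the model, reclassify) can be sketched with a nearest-centroid classifier standing in for PLS-DA; the classifier, threshold logic, and toy data below are ours, purely to show the loop structure:

```python
import numpy as np

def iterative_attribute(X_by_class, x_q, keep=2):
    """Iteratively prune distant classes, then attribute x_q to the nearest.

    Nearest-centroid is an illustrative stand-in for the PLS-DA models in
    the report; 'keep' retains the closest classes before the final fit.
    """
    classes = list(X_by_class)
    while len(classes) > keep:
        # re-standardize using only the currently retained classes ("rebuild")
        pooled = np.vstack([X_by_class[c] for c in classes])
        mu, sd = pooled.mean(0), pooled.std(0) + 1e-12
        cent = {c: ((X_by_class[c] - mu) / sd).mean(0) for c in classes}
        zq = (x_q - mu) / sd
        d = {c: np.linalg.norm(zq - cent[c]) for c in classes}
        classes.remove(max(d, key=d.get))   # drop the farthest class
    # final attribution among the retained classes
    pooled = np.vstack([X_by_class[c] for c in classes])
    mu, sd = pooled.mean(0), pooled.std(0) + 1e-12
    cent = {c: ((X_by_class[c] - mu) / sd).mean(0) for c in classes}
    zq = (x_q - mu) / sd
    return min(classes, key=lambda c: np.linalg.norm(zq - cent[c]))
```

The point of the rebuild step, as in the report, is that removing obviously wrong classes changes the fitted model itself, not merely the candidate list.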
Doppler Lidar Vertical Velocity Statistics Value-Added Product (Technical Report)
Office of Scientific and Technical Information (OSTI)
Accurate height-resolved measurements of higher-order statistical moments of vertical velocity fluctuations are crucial for improved understanding of turbulent mixing and diffusion, convective initiation, and cloud life cycles. The Atmospheric Radiation Measurement (ARM) Climate Research Facility operates coherent Doppler lidar systems at several sites around the globe.
Nonlinearity sensing via photon-statistics excitation spectroscopy
Assmann, Marc; Bayer, Manfred
2011-11-15
We propose photon-statistics excitation spectroscopy as a suitable tool to describe the optical response of a nonlinear system. To this end, we suggest using optical excitation with varying photon statistics as another spectroscopic degree of freedom to gather information about the system in question. The responses of several simple model systems to excitation beams with different photon statistics are discussed. Possible spectroscopic applications in terms of identifying lasing operation are pointed out.
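A toy numerical illustration of the lasing-identification idea: coherent (laser) light has Poissonian photon-count statistics (Mandel Q near 0), while thermal light is super-Poissonian (Q near the mean occupation). The simulation below is ours, not from the paper; it only demonstrates the statistical signature.

```python
import numpy as np

def mandel_q(counts):
    """Mandel Q parameter: (variance - mean) / mean of photon counts."""
    m = counts.mean()
    return counts.var() / m - 1.0

rng = np.random.default_rng(3)
nbar = 5.0
# coherent light: Poissonian photon-number distribution
laser = rng.poisson(nbar, 200_000)
# thermal light: Bose-Einstein distribution (geometric on 0, 1, 2, ...)
thermal = rng.geometric(1.0 / (1.0 + nbar), 200_000) - 1
q_laser, q_thermal = mandel_q(laser), mandel_q(thermal)
```

For the Bose-Einstein case the variance is nbar + nbar**2, so Q is approximately nbar, clearly distinguishing the two sources even at equal mean intensity.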
User Statistics | U.S. DOE Office of Science (SC)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
User Statistics: the map and source data contain information regarding FY 2014 user projects at the Office of Science user facilities. Contact: Office of Science, U.S. Department of Energy, 1000 Independence Ave., SW, Washington, DC 20585, (202) 586-5430.
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS
Office of Scientific and Technical Information (OSTI)
The research under this project focused on theoretical and computational modeling of the dislocation dynamics of mesoscale deformation in metal single crystals. Specifically, the work aimed to implement a continuum statistical theory of dislocations to understand ...
Statistical surrogate models for prediction of high-consequence climate change
Office of Scientific and Technical Information (OSTI)
Physics-based statistical learning approach to mesoscopic model...
Office of Scientific and Technical Information (OSTI)
Type: Publisher's Accepted Manuscript; Journal Name: Physical ...; Country of Publication: United States; Language: English
BP Statistical Review of World Energy | Open Energy Information
OpenEI The BP Statistical Review of World Energy is an Excel spreadsheet which contains consumption and production data for Coal, Natural Gas, Nuclear, Oil, and Hydroelectric...
International Monetary Fund-Data and Statistics | Open Energy...
"The IMF publishes a range of time series data on IMF lending, exchange rates and other economic and financial indicators. Manuals, guides, and other material on statistical...
IRF-World Road Statistics | Open Energy Information
Agency/Company/Organization: International Road Statistics Focus Area: Transportation, Economic Development Resource Type: Dataset Website: www.irfnet.org/statistics.php Cost:...
An overview of component qualification using Bayesian statistics...
Office of Scientific and Technical Information (OSTI)
Example problems with solutions have been supplied as a learning aid. Bold letters are ...
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS...
Office of Scientific and Technical Information (OSTI)
STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS Anter El-Azab 36 MATERIALS SCIENCE dislocation dynamics; mesoscale deformation of metals; crystal mechanics...
TITLE V-CONFIDENTIAL INFORMATION PROTECTION AND STATISTICAL EFFI...
Gasoline and Diesel Fuel Update (EIA)
or maintain the systems for handling or storage of data received under this title; and ... data anomalies, produce statistical samples that are consistently adjusted for the ...
UNECE-Annual Bulletin of Transport Statistics for Europe and...
This annual publication presents statistics and brief studies... Data covers Europe, Canada, and the United States. This is a trilingual publication in English, French, and Russian.
WHO Statistical Information System (WHOSIS) | Open Energy Information
Classification of Diseases (ICD-10), International Classification of Impairments, Disabilities and Handicaps (ICIDH) Links to other sources of health-related statistical...
Experimental and Statistical Comparison of Engine Response as a Function of Fuel Chemistry and Properties in CI and HCCI Engines
Office of Scientific and Technical Information (OSTI)
Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique
Office of Scientific and Technical Information (OSTI)
UN-Glossary for Transportation Statistics | Open Energy Information
Publications Website: www.internationaltransportforum.org/Pub/pdf/GloStat3e.pdf Cost: Free
Autocorrelation Function Statistics and Implication to Decay Ratio Estimation
March-Leuba, Jose A.
2016-01-01
This document summarizes the results of a series of computer simulations to attempt to identify the statistics of the autocorrelation function, and implications for decay ratio estimation.
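The decay-ratio estimate referenced above is commonly taken as the ratio of successive positive peaks of the autocorrelation function of a damped oscillation. A hedged illustration on synthetic data (damping, frequency, and noise level are invented for the example):

```python
import numpy as np

# damped oscillation plus a little measurement noise (parameters invented)
dt, lam, om = 0.01, 0.1, 2 * np.pi     # sample step, damping rate, angular freq
t = np.arange(0.0, 20.0, dt)
rng = np.random.default_rng(4)
y = np.exp(-lam * t) * np.cos(om * t) + rng.normal(0.0, 0.001, t.size)

# biased sample autocorrelation function, normalized at lag zero
y0 = y - y.mean()
acf = np.correlate(y0, y0, mode="full")[y0.size - 1:]
acf /= acf[0]

# local maxima of the ACF (excluding lag zero), in increasing-lag order
peaks = [i for i in range(1, acf.size - 1)
         if acf[i] > acf[i - 1] and acf[i] > acf[i + 1] and acf[i] > 0]
decay_ratio = acf[peaks[1]] / acf[peaks[0]]   # ratio of successive ACF peaks
```

For a pure damped cosine the ratio of successive ACF maxima is exp(-lam * 2 * pi / om), so the estimate can be checked against the simulated ground truth; the report's simulations probe how such estimates behave statistically under realistic noise.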
MISR-Derived Statistics of Cumulus Geometry at TWP Site
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
derived from satellite (Figure 4a) and surface (Figure 5a) observations. Summary: To test the potential for deriving the basic statistics (mean, standard deviation, and...
Statistical study of reconnection exhausts in the solar wind
Enžl, J.; Přech, L.; Šafránková, J.; Němeček, Z.
2014-11-20
Magnetic reconnection is a fundamental process that changes magnetic field configuration and converts magnetic energy into flow energy and plasma heating. This paper presents a survey of the plasma and magnetic field parameters inside 418 reconnection exhausts identified in the WIND data from 1995-2012. The statistical analysis focuses on the redistribution of the magnetic energy released by reconnection between plasma acceleration and plasma heating. The results show that both the portion of the energy deposited into heat and the energy spent on the acceleration of the exhaust plasma rise with the magnetic shear angle, in accord with the increase of the magnetic flux available for reconnection. The decrease of the normalized exhaust speed with increasing magnetic shear suggests a decreasing efficiency of acceleration and/or an increasing efficiency of heating in high-shear events. However, we have found that the previously suggested relation between the exhaust speed and temperature enhancement should rather be considered an upper limit of the plasma heating during reconnection, regardless of the shear angle.
Development of a statistically based access delay timeline methodology.
Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt
2013-02-01
The purpose of adversarial delay is to hinder access to critical resources through physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This methodology incorporates Bayesian statistical theory and methods, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
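The report's Bayesian machinery aside, the core idea of replacing a worst-case sum of task times with a timeline distribution can be sketched with plain Monte Carlo over uncertain task times. The task names, the lognormal form, and every parameter value below are hypothetical, chosen only to illustrate the aggregation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical adversary path: (task, median delay in seconds, log-space sigma).
tasks = [
    ("breach outer fence", 60.0, 0.4),
    ("cross open area",    90.0, 0.3),
    ("defeat door",       180.0, 0.6),
    ("access target",     120.0, 0.5),
]

# Sample each task's delay and accumulate along the path.
n = 100_000
total = np.zeros(n)
for _, median, sigma in tasks:
    total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

# A distribution of total delay, rather than a single worst-case number.
p10, p50, p90 = np.percentile(total, [10, 50, 90])
```

The percentiles give the analyst a defensible range (e.g. "90% of runs exceed p10 seconds of delay") instead of one conservative point estimate.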
Statistical techniques for characterizing residual waste in single-shell and double-shell tanks
Jensen, L., Fluor Daniel Hanford
1997-02-13
A primary objective of the Hanford Tank Initiative (HTI) project is to develop methods to estimate the inventory of residual waste in single-shell and double-shell tanks. A second objective is to develop methods to determine the boundaries of waste that may be in the waste plume in the vadose zone. This document presents statistical sampling plans that can be used to estimate the inventory of analytes within the residual waste in a tank. Sampling plans for estimating the inventory of analytes within the waste plume in the vadose zone are also presented. Inventory estimates can be used to classify the residual waste with respect to chemical and radiological hazards. Based on these estimates, it will be possible to make decisions regarding the final disposition of the residual waste. Four sampling plans for the residual waste in a tank are presented. The first plan assumes that, based on some physical characteristic, the residual waste can be divided into disjoint strata, with waste samples obtained from randomly selected locations within each stratum. Under the second plan, waste samples are obtained from randomly selected locations within the waste as a whole. The third and fourth plans are similar to the first two, except that composite samples are formed from multiple samples. Common to all four plans is that replicate analytical measurements are obtained in the laboratory from homogenized waste samples. The statistical sampling plans for the residual waste are similar to those developed for the tank waste characterization program, which required multiple core samples of waste and replicate analytical measurements from homogenized core segments.
A statistical analysis of the analytical data, obtained using the sampling plans developed for the characterization program or for the HTI project, provides estimates of mean analyte concentrations and confidence intervals on those means. In addition, the statistical analysis provides estimates of spatial and measurement variabilities. The magnitudes of these sources of variability are used to determine how well the inventory of analytes in the waste has been estimated. This document provides statistical sampling plans that can be used to estimate the inventory of analytes in the residual waste in single-shell and double-shell tanks and in the waste plume in the vadose zone.
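The first (stratified) sampling plan leads to a standard stratified estimator: weight each stratum's sample mean by its share of the waste volume and propagate the within-stratum variances into a confidence interval. A minimal sketch; the strata, weights, and concentrations are invented for illustration:

```python
import math

def stratified_estimate(strata, z=1.96):
    """Overall mean and approximate 95% confidence interval from
    per-stratum samples.  Each stratum is (weight, [measurements]);
    weights are volume fractions summing to 1."""
    mean = sum(w * (sum(xs) / len(xs)) for w, xs in strata)
    var = 0.0
    for w, xs in strata:
        n = len(xs)
        m = sum(xs) / n
        s2 = sum((x - m) ** 2 for x in xs) / (n - 1)  # within-stratum variance
        var += w ** 2 * s2 / n
    half = z * math.sqrt(var)
    return mean, (mean - half, mean + half)

# Hypothetical analyte concentrations (g/L) in two residual-waste strata.
strata = [
    (0.7, [1.2, 1.4, 1.1, 1.3]),   # sludge layer, 70% of residual volume
    (0.3, [2.0, 2.2, 1.9]),        # salt cake layer, 30% of volume
]
mean, ci = stratified_estimate(strata)
```

Multiplying the concentration estimate and its interval by the total residual volume would give the inventory estimate the plans are after.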
Infinite statistics condensate as a model of dark matter
Ebadi, Zahra; Mirza, Behrouz; Mohammadzadeh, Hosein
2013-11-01
In some models, dark matter is considered to be a condensed bosonic system. In this paper, we prove that condensation is also possible for particles that obey infinite statistics and derive the critical condensation temperature. We argue that a condensed state of a gas of very weakly interacting particles obeying infinite statistics could be considered a consistent model of dark matter.
Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields
John A. Krommes
2001-02-16
A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. 
Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.
Sub-Poissonian statistics in order-to-chaos transition
Kryuchkyan, Gagik Yu. [Yerevan State University, Manookyan 1, Yerevan 375049, (Armenia); Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia); Manvelyan, Suren B. [Institute for Physical Research, National Academy of Sciences, Ashtarak-2 378410, (Armenia)
2003-07-01
We study phenomena at the overlap of quantum chaos and nonclassical statistics for a time-dependent model of a nonlinear oscillator. Using the Mandel Q parameter and the Wigner function, it is shown that the statistics of oscillatory excitation numbers change drastically in the order-to-chaos transition. An essential improvement of sub-Poissonian statistics, in comparison with that of the standard driven anharmonic oscillator, is observed in the regular operational regime. In the chaotic regime, the system exhibits ranges of sub-Poissonian and super-Poissonian statistics that alternate with each other depending on the time interval. An unusual dependence of the variance of the oscillatory number on the external noise level is observed for the chaotic dynamics. The scaling invariance of the quantum statistics is demonstrated, and its relation to dissipation and decoherence is studied.
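The Mandel Q parameter used here is Q = Var(n)/⟨n⟩ − 1: negative for sub-Poissonian, zero for Poissonian, positive for super-Poissonian statistics. A quick numerical illustration on synthetic count data (not the paper's oscillator model):

```python
import numpy as np

def mandel_q(counts):
    """Mandel Q = Var(n)/<n> - 1.  Q < 0: sub-Poissonian,
    Q = 0: Poissonian, Q > 0: super-Poissonian."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean() - 1.0

rng = np.random.default_rng(1)
q_poisson = mandel_q(rng.poisson(10.0, 200_000))        # variance == mean
q_sub = mandel_q(rng.binomial(20, 0.5, 200_000))        # variance < mean, Q ~ -0.5
q_super = mandel_q(rng.poisson(rng.exponential(10.0, 200_000)))  # mixed, Q ~ 10
```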
Impact of high-order moments on the statistical modeling of transition arrays
Gilleron, Franck; Pain, Jean-Christophe; Bauche, Jacques; Bauche-Arnoult, Claire
2008-02-15
The impact of high-order moments on the statistical modeling of transition arrays in complex spectra is studied. It is shown that a departure from the Gaussian, which is usually employed in such an approach, may be observed even in the shape of unresolved spectra, due to the large value of the kurtosis coefficient. A Gaussian shape may also overestimate the width of the spectra in some cases. It is therefore proposed to simulate the statistical shape of transition arrays with the more flexible generalized Gaussian distribution, which introduces an additional parameter, the power of the argument in the exponential, that can be constrained by the kurtosis value. The relevance of the statistical line distribution is checked by comparison with smoothed spectra obtained from detailed line-by-line calculations. The departure from the Gaussian is also confirmed through the analysis of 2p-3d transitions in recent absorption measurements. A numerical fit is proposed for easy implementation of the statistical profile in atomic-structure codes.
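The generalized Gaussian has density proportional to exp(−|x/α|^β), and its excess kurtosis depends only on the exponent β, so β can be fixed by matching the kurtosis computed from the moments. A sketch of that inversion; the bisection bracket [0.2, 20] is an illustrative assumption:

```python
import math

def gg_excess_kurtosis(beta):
    """Excess kurtosis of the generalized Gaussian with shape exponent beta
    (beta = 2 recovers the Gaussian, beta = 1 the Laplacian)."""
    g = math.gamma
    return g(5.0 / beta) * g(1.0 / beta) / g(3.0 / beta) ** 2 - 3.0

def shape_from_kurtosis(kappa):
    """Invert the monotonically decreasing kurtosis-shape relation by bisection."""
    lo, hi = 0.2, 20.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gg_excess_kurtosis(mid) > kappa:
            lo = mid   # distribution too peaked: raise the exponent
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A zero excess kurtosis recovers β = 2 (pure Gaussian), while the Laplacian value of 3 gives β = 1, so measured kurtosis values between these map to intermediate profiles.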
Statistical mechanics based on fractional classical and quantum mechanics
Korichi, Z.; Meftah, M. T.
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, and we study Bose-Einstein statistics, with the related problem of condensation, and Fermi-Dirac statistics.
Techniques in teaching statistics : linking research production and research use.
Martinez-Moyano, I.; Smith, A.
2012-01-01
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Statistical anisotropies in gravitational waves in solid inflation
Akhshik, Mohammad; Emami, Razieh; Firouzjahi, Hassan; Wang, Yi
2014-09-01
Solid inflation can support a long period of anisotropic inflation. We calculate the statistical anisotropies in the scalar and tensor power spectra and their cross-correlation in anisotropic solid inflation. The tensor-scalar cross-correlation can be either positive or negative, and it impacts the statistical anisotropies of the TT and TB spectra in the CMB map more significantly than the tensor self-correlation does. The tensor power spectrum contains potentially comparable contributions from quadrupole and octopole angular patterns, in contrast to the scalar power spectrum, the cross-correlation, and the scalar bispectrum, where the quadrupole-type statistical anisotropy dominates over the octopole.
The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release
Office of Scientific and Technical Information (OSTI)
UN-Energy Statistics Database | Open Energy Information
Sectors: PV, Wind. Resource Type: Dataset. Website: data.un.org. Cost: Free. Language: English.
Doppler Lidar Vertical Velocity Statistics Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Newsom, RK; Sivaraman, C; Shippert, TR; Riihimaki, LD
2015-07
Physics-based statistical learning approach to mesoscopic model selection
Office of Scientific and Technical Information (OSTI)
Taverniers, Søren; Haut, Terry S.; Barros, Kipton; Alexander, Francis J.; Lookman, Turab
2015-11-09
STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014
Office of Scientific and Technical Information (OSTI)
Harris, S.; Gross, R.; Watson, H.
2015-02-04
An overview of component qualification using Bayesian statistics and energy methods
Office of Scientific and Technical Information (OSTI)
The overview is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation and a basic understanding of some of the mathematical tools used to evaluate the quality of an estimation.
Evaluation of cirrus statistics produced by general circulation models using ARM data
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Hartsock, Daniel; Mace, Gerald; Benson, Sally (University of Utah)
Our goal is to evaluate the skill of various general circulation models for producing climatological cloud statistics by comparing them to the cirrus climatology compiled over the Southern Great Plains (SGP) ARM site. This evaluation includes quantifying similar cloud properties and ...
Final Report on Statistical Debugging for Petascale Environments
Office of Scientific and Technical Information (OSTI)
Liblit, B
2013-01-18
Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report
Office of Scientific and Technical Information (OSTI)
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific ...
Mathematical and Statistical Opportunities in Cyber Security
Office of Scientific and Technical Information (OSTI)
The role of mathematics in a complex system such as the Internet has yet to be deeply explored. In this paper, we summarize some of the important and pressing problems in cyber security from the viewpoint of open science environments. We start by posing the question "What fundamental problems exist ..."
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change
Office of Scientific and Technical Information (OSTI)
Field, Richard V.; Boslough, Mark B. E.; Constantine, Paul
2011-10-01
Statistical characteristics of cloud variability. Part 2: Implication for parameterizations of microphysical and radiative transfer processes in climate models
Office of Scientific and Technical Information (OSTI)
Statistical surrogate models for prediction of high-consequence climate change
Office of Scientific and Technical Information (OSTI)
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on ...
A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; Spence, H. E.; Lin, R. -L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.
2015-07-23
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters is required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
A.G. Crook Company
1993-04-01
This report was prepared by the A.G. Crook Company, under contract to Bonneville Power Administration, and provides statistics of seasonal volumes and streamflow for 28 selected sites in the Columbia River Basin.
Mishra, Srikanta; Schuetter, Jared
2014-11-01
We compare two approaches for building a statistical proxy model (metamodel) for CO₂ geologic sequestration from the results of full-physics compositional simulations. The first approach involves a classical Box-Behnken or Augmented Pairs experimental design with a quadratic polynomial response surface. The second approach uses a space-filling maximin Latin Hypercube sampling or maximum entropy design with a choice of five different metamodeling techniques: quadratic polynomial, kriging with constant and quadratic trend terms, multivariate adaptive regression splines (MARS), and additivity and variance stabilization (AVAS). Simulation results for CO₂ injection into a reservoir-caprock system with 9 design variables (and 97 samples) were used to generate the data for developing the proxy models. The fitted models were validated using an independent data set and a cross-validation approach for three different performance metrics: total storage efficiency, CO₂ plume radius, and average reservoir pressure. The Box-Behnken quadratic polynomial metamodel performed best, followed closely by the maximin LHS kriging metamodel.
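A minimal sketch of the simplest variant of the second approach, Latin Hypercube sampling plus a full quadratic response surface, on a toy stand-in for the simulator. The toy function, dimensions, and sample sizes are invented; the study's actual designs and metamodels are richer:

```python
import numpy as np

rng = np.random.default_rng(7)

def latin_hypercube(n, d):
    """Basic Latin hypercube sample in [0, 1]^d: one point per row-stratum,
    with each column independently shuffled."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

def quadratic_features(X):
    """Full quadratic basis: 1, x_i, and x_i * x_j for i <= j."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def simulator(X):
    """Toy stand-in for the full-physics compositional simulator."""
    return 1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]

X = latin_hypercube(40, 2)
y = simulator(X)
coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

# Validate the proxy on an independent random sample.
Xv = rng.random((200, 2))
resid = simulator(Xv) - quadratic_features(Xv) @ coef
```

Because the toy simulator is itself quadratic, the proxy reproduces it almost exactly; with a real simulator the validation residuals quantify the metamodel error the paper's cross-validation measures.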
Non-gaussian mode coupling and the statistical cosmological principle
LoVerde, Marilena; Nelson, Elliot; Shandera, Sarah
2013-06-01
Local-type primordial non-Gaussianity couples statistics of the curvature perturbation ζ on vastly different physical scales. Because of this coupling, statistics (i.e., the polyspectra) of ζ in our Hubble volume may not be representative of those in the larger universe; that is, they may be biased. The bias depends on the local background value of ζ, which includes contributions from all modes with wavelength k ...
Ensemble Data Analysis ENvironment (EDEN)
Steed, Chad Allen
2012-08-01
The EDEN toolkit facilitates exploratory data analysis and visualization of global climate model simulation datasets. EDEN provides an interactive graphical user interface (GUI) that helps the user visually construct dynamic queries of the characteristically large climate datasets using temporal ranges, variable selections, and geographic areas of interest. EDEN reads the selected data into a multivariate visualization panel which features an extended implementation of parallel coordinates plots as well as interactive scatterplots. The user can query data in the visualization panel using mouse gestures to analyze different ranges of data. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots.
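An EDEN-style dynamic query over temporal, variable, and spatial ranges reduces, at its core, to conjunctive range filtering of a column-oriented table. A sketch; the column names and data are hypothetical, not EDEN's actual schema or API:

```python
import numpy as np

def range_query(data, bounds):
    """Boolean mask of rows satisfying every per-variable [lo, hi]
    constraint, in the spirit of a dynamic query panel."""
    n = len(next(iter(data.values())))
    mask = np.ones(n, dtype=bool)
    for col, (lo, hi) in bounds.items():
        mask &= (data[col] >= lo) & (data[col] <= hi)
    return mask

# Hypothetical climate-model columns.
rng = np.random.default_rng(9)
data = {
    "temperature": rng.uniform(-40.0, 40.0, 10_000),
    "precip": rng.exponential(2.0, 10_000),
}
mask = range_query(data, {"temperature": (0.0, 30.0), "precip": (0.0, 5.0)})
subset = data["temperature"][mask]   # rows the linked plots would highlight
```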
Quality control and statistical process control for nuclear analytical measurements
Seymour, R.; Sergent, F.; Clark, W.H.C.; Gleason, G.
1993-12-31
The same driving forces that are making businesses examine quality control of manufacturing processes are making laboratories reevaluate their quality control programs. Increased regulation (accountability), global competitiveness (profitability), and potential for litigation (defensibility) are the principal driving forces behind the development and implementation of QA/QC programs in the nuclear analytical laboratory. Both manufacturing and scientific quality control can use identical statistical methods, albeit with some differences in the treatment of the measured data. Today, the approaches to QC programs are quite different for most analytical laboratories as compared with manufacturing sciences. This is unfortunate because the statistical process control methods are directly applicable to measurement processes. It is shown that statistical process control methods can provide many benefits for laboratory QC data treatment.
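A concrete example of applying manufacturing-style statistical process control to a measurement process: a Shewhart individuals chart with 3-sigma limits built from an in-control baseline. The daily check-source count rates below are invented for illustration:

```python
import numpy as np

def control_limits(baseline):
    """Shewhart individuals-chart limits from an in-control baseline:
    center line plus/minus 3 sigma."""
    m, s = np.mean(baseline), np.std(baseline, ddof=1)
    return m - 3 * s, m, m + 3 * s

def out_of_control(values, lcl, ucl):
    """Indices of measurements falling outside the control limits."""
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Hypothetical daily check-source count rates (counts/s) for a detector.
baseline = [1002, 998, 1001, 997, 1003, 999, 1000, 1001, 998, 1002,
            1000, 999, 1003, 997, 1001, 1000, 998, 1002, 999, 1001]
lcl, center, ucl = control_limits(baseline)

new_runs = [1001, 999, 1010, 1000]          # third run has drifted high
flags = out_of_control(new_runs, lcl, ucl)  # flags the drifted run
```

A flagged run would trigger the same investigate-and-correct response a manufacturing line applies, which is the transfer the abstract argues for.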
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Natural Gas Monthly, April 2016. U.S. Department of Energy, Washington, DC 20585 (www.eia.gov). This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States ...
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-04
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
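The patented method infers models of normal behavior and flags improbable attribute values; a drastically simplified stand-in is a fleet-wide z-score test, where a component alarms when its reading is improbable relative to its statistically similar peers. The threshold and data below are illustrative only:

```python
import numpy as np

def fleet_alarms(readings, threshold=4.0):
    """Flag components whose reading is improbable relative to the fleet,
    using a simple Gaussian model: |z| > threshold raises an alarm.
    (A stand-in for the inferred per-attribute models in the abstract.)"""
    readings = np.asarray(readings, dtype=float)
    z = (readings - readings.mean()) / readings.std(ddof=1)
    return np.flatnonzero(np.abs(z) > threshold)

# Hypothetical fan-inlet temperatures (deg C) for 500 similar nodes.
rng = np.random.default_rng(3)
temps = 45.0 + 2.0 * rng.standard_normal(500)
temps[123] = 80.0              # one overheating node
alarms = fleet_alarms(temps)   # should single out node 123
```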
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2011-01-25
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.
2010-07-13
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
Rhapsody: I. Structural Properties and Formation History from a Statistical Sample of Re-simulated Cluster-size Halos
Office of Scientific and Technical Information (OSTI)
Wu, Hao-Yi; Hahn, Oliver; Wechsler, Risa H.; Mao, Yao-Yuan; Behroozi, Peter S.
Sensitivity and Uncertainty Analysis Shell
Energy Science and Technology Software Center (OSTI)
1999-04-20
SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: (1) a computer code is developed or acquired that models some process for which input is uncertain, and the user is interested in statistical analysis of the output of that code; (2) the statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which inputs to the process model are to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyzes the sample. The SUNS post-processor displays statistical results from any existing file that contains sampled input and output values.
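The loose-coupling workflow SUNS describes (generate a statistical sample, run the user's process model over it, post-process the output) can be sketched in a few lines; the process model and input distributions below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)

def process_model(inputs):
    """Stand-in for the user-supplied code SUNS would be coupled to."""
    k, t = inputs
    return k * t ** 0.5

# 1. Identify the uncertain inputs and assign distributions (illustrative).
n = 50_000
samples = np.column_stack([
    rng.normal(2.0, 0.1, n),     # coefficient k
    rng.uniform(1.0, 4.0, n),    # exposure time t
])

# 2. Run the model over the sample (SUNS would hand a sample file
#    to the external code instead of calling a Python function).
outputs = np.array([process_model(row) for row in samples])

# 3. Post-process: summary statistics of the output distribution.
stats = {"mean": outputs.mean(),
         "std": outputs.std(),
         "p95": np.percentile(outputs, 95)}
```

For these inputs the analytic mean is E[k] * E[sqrt(t)] = 2 * 14/9, which the Monte Carlo estimate should approach.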
Statistical comparison of ICRF and NBI heating performance in JET-ILW L-mode plasmas
Lerche, E.; Van Eester, D.; Jacquet, Ph.; Mayoral, M.-L.; Graham, M.; Matthews, G.; Monakhov, I.; Rimini, F.; Colas, L.; Czarnecka, A.; Vries, P. de; Collaboration: JET-EFDA Contributors
2014-02-12
After the changeover from the C-wall to the ITER-like Be/W wall (ILW) in JET, the radiation losses during ICRF heating have increased and are now substantially larger than those observed with NBI at the same power levels, in spite of the similar global plasma energies reached with the two heating systems. A comparison of the NBI and ICRF performances in the JET-ILW experiments, based on a statistical analysis of ~3000 L-mode discharges, will be presented.
Financial statistics of major publicly owned electric utilities, 1991
Not Available
1993-03-31
The Financial Statistics of Major Publicly Owned Electric Utilities publication presents summary and detailed financial accounting data on the publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with data that can be used for policymaking and decisionmaking purposes relating to publicly owned electric utility issues.
Li, Guangjun; Wu, Kui; Peng, Guang; Zhang, Yingjie; Bai, Sen
2014-01-01
Volumetric-modulated arc therapy (VMAT) is now widely used clinically, as it is capable of delivering a highly conformal dose distribution in a short time interval. We retrospectively analyzed patient-specific quality assurance (QA) of VMAT and examined the relationships between the planning parameters and the QA results. A total of 118 clinical VMAT cases underwent pretreatment QA. All plans had 3-dimensional diode array measurements, and 69 also had ion chamber measurements. Dose distribution and isocenter point dose were evaluated by comparing the measurements and the treatment planning system (TPS) calculations. In addition, the relationship between QA results and several planning parameters, such as dose level, control points (CPs), monitor units (MUs), average field width, and average leaf travel, was also analyzed. For delivered dose distribution, a gamma analysis passing rate greater than 90% was obtained for all plans, and greater than 95% for 100 of 118 plans, with the 3%/3-mm criteria. The difference (mean ± standard deviation) between the point doses measured by the ion chamber and those calculated by the TPS was 0.9% ± 2.0% for all plans. Among all cancer sites, nasopharyngeal carcinoma and gastric cancer had the lowest and highest average passing rates, respectively. From multivariate linear regression analysis, the dose level (p = 0.001) and the average leaf travel (p < 0.001) showed negative correlations with the passing rate, and the average field width (p = 0.003) showed a positive correlation with the passing rate, all indicating a correlation between the passing rate and plan complexity. No statistically significant correlation was found between MU or CP and the passing rate. Analysis of the results of dosimetric pretreatment measurements as a function of VMAT plan parameters can provide important information to guide plan parameter setting and optimization in the TPS.
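As an illustration of the multivariate linear regression step, the sketch below fits passing rate against plan parameters with ordinary least squares. The data are synthetic, generated only to mimic the sign structure reported above; none of the numbers come from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 118  # same sample size as the study; the data here are synthetic

# Hypothetical plan parameters (not the study's data)
leaf_travel = rng.uniform(100, 600, n)   # mm
field_width = rng.uniform(2, 15, n)      # cm
dose_level = rng.uniform(180, 300, n)    # cGy per fraction

# Synthetic passing rate with the reported sign structure plus noise
passing = (99.0 - 0.004 * leaf_travel + 0.10 * field_width
           - 0.005 * dose_level + rng.normal(0, 0.3, n))

# OLS fit: passing ~ 1 + leaf_travel + field_width + dose_level
X = np.column_stack([np.ones(n), leaf_travel, field_width, dose_level])
coef, *_ = np.linalg.lstsq(X, passing, rcond=None)
```

The recovered coefficient signs (negative for leaf travel and dose level, positive for field width) reproduce the correlations the abstract describes.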
Statistical scaling of geometric characteristics in stochastically generated pore microstructures
Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee
2015-05-21
In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables, which were recently observed in real millimeter-scale rock samples [13], as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (a linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
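The order-q structure functions and the extended-self-similarity check described here are straightforward to compute. The sketch below uses a synthetic Brownian-motion signal (Hurst coefficient H = 0.5) in place of a porosity field, so the expected log-log slope of S_2 against lag is 2H = 1 and the ESS slope of log S_2 against log S_1 is 2:

```python
import numpy as np

def structure_function(field, q, lags):
    """Order-q sample structure function: mean |increment|^q at each lag."""
    return np.array([np.mean(np.abs(field[lag:] - field[:-lag]) ** q)
                     for lag in lags])

# Synthetic 1-D signal: cumulative sum of Gaussian noise (H = 0.5)
rng = np.random.default_rng(1)
phi = np.cumsum(rng.normal(size=20000))

lags = np.arange(1, 40)
s1 = structure_function(phi, 1, lags)
s2 = structure_function(phi, 2, lags)

# Power-law scaling: slope of log S_q vs log lag estimates q * H
slope2 = np.polyfit(np.log(lags), np.log(s2), 1)[0]

# Extended self-similarity: log S_2 is linear in log S_1 with slope 2
ess_slope = np.polyfit(np.log(s1), np.log(s2), 1)[0]
```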
High Statistics Study of Nearby Type 1a Supernovae. QUEST Camera Short Term Maintenance: Final Technical Report
Office of Scientific and Technical Information (OSTI)
Starkov, V. N.; Semenov, A. A.; Gomonay, H. V.
2009-07-15
We demonstrate the practical possibility of compensating losses in measured photocounting statistics in the presence of dark counts and background radiation noise. It is shown that satisfactory results are obtained even in the case of low detection efficiency and large experimental errors.
Davis, A; Suhr, N H; Spackman, W; Painter, P C; Walker, P L; Given, P H
1981-04-01
The basic objectives of this program are, first, to understand the systematic relationships between the properties of coals, and, second, to determine the nature of the lateral and vertical variability in the properties of a single seam. Multivariate statistical analyses applied to the Coal Data Base confirm a number of known trends for coal properties. In addition, nitrogen and some components of the ash analysis bear interesting relationships to rank. The macroscopic petrography of column samples of the Lower Kittanning seam reveals a significant difference between the sample from a marine-influenced environment and those from the margins of the basin, where conditions were non-marine. The various methods of determining the amount and mineralogy of the inorganic fraction of coals are reviewed. General trends in seam thickness, ash, sulfur, volatile matter yield, and vitrinite reflectance of the Lower Kittanning seam of western Pennsylvania are presented. Controls of sedimentation are discussed in relation to the areal variability which has been observed. Differential subsidence and paleotopography appear to have played a major role during the deposition of the coal. The same controls may have maintained some influence upon the coalification process after deposition, especially along the eastern margin of the Lower Kittanning basin.
Mexico City air quality research initiative: An overview and some statistical aspects
Waller, R.A.; Streit, G.E.; Guzman, F.
1991-01-01
The Mexican Petroleum Institute (Instituto Mexicano del Petroleo, IMP) and Los Alamos National Laboratory (LANL) are in the first year of a three-year jointly funded project to examine the air quality in Mexico City and to provide techniques to evaluate the impact of proposed mitigation options. The technical tasks include modeling and simulation; monitoring and characterization; and strategic evaluation. Extensive measurements of the atmosphere, climate, and meteorology are being made as part of the study. This presentation provides an overview of the total project plan, reports on the current status of the technical tasks, describes the data collection methods, presents examples of the data analysis and graphics, and suggests roles for statistical analysis in this and similar environmental studies. 8 figs., 4 tabs.
Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
Duran, A.; Walkowicz, K.
2013-10-01
In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetry systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.
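A duty-cycle statistical analysis of this kind starts from per-vehicle metrics computed on recorded speed traces. The sketch below computes a few such metrics (distance, average driving speed, stops per km) on a toy 1 Hz trace; the metric definitions and the 0.1 m/s moving threshold are illustrative assumptions, not NREL's:

```python
import numpy as np

def duty_cycle_metrics(speed_mps, dt=1.0):
    """Basic duty-cycle metrics from a 1 Hz speed trace in m/s."""
    distance_km = np.sum(speed_mps) * dt / 1000.0
    moving = speed_mps > 0.1                     # assumed moving threshold
    avg_speed = np.mean(speed_mps[moving]) if moving.any() else 0.0
    stops = np.sum(moving[:-1] & ~moving[1:])    # moving -> stopped transitions
    return {"distance_km": distance_km,
            "avg_driving_speed_mps": avg_speed,
            "stops_per_km": stops / distance_km if distance_km else 0.0}

# Toy trace: accelerate, cruise, decelerate, dwell; repeated five times
trace = np.tile(np.concatenate([np.linspace(0, 10, 20),
                                np.full(40, 10.0),
                                np.linspace(10, 0, 20),
                                np.zeros(20)]), 5)
metrics = duty_cycle_metrics(trace)
```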
U.S. Department of Commerce Economics and Statistics Administration
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Commerce Economics and Statistics Administration. [Bar chart: women hold 48% of all jobs but only 24% of STEM jobs; men hold 52% and 76%, respectively.] By David Beede, Tiffany Julian, David Langdon, George McKittrick, Beethika Khan, and Mark Doms, Office of the Chief Economist. Women in STEM: A Gender Gap to Innovation. August 2011. Executive Summary, ESA Issue Brief #04-11. Our science, technology, engineering and math (STEM) workforce is crucial to America's innovative capacity and global competitiveness. Yet women are vastly
Financial statistics of selected investor-owned electric utilities, 1989
Not Available
1991-01-01
The Financial Statistics of Selected Investor-Owned Electric Utilities publication presents summary and detailed financial accounting data on the investor-owned electric utilities. The objective of the publication is to provide the Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to investor-owned electric utility issues.
Spatial statistics for predicting flow through a rock fracture
Coakley, K.J.
1989-03-01
Fluid flow through a single rock fracture depends on the shape of the space between the upper and lower pieces of rock which define the fracture. In this thesis, the normalized flow through a fracture, i.e. the equivalent permeability of a fracture, is predicted in terms of spatial statistics computed from the arrangement of voids, i.e. open spaces, and contact areas within the fracture. Patterns of voids and contact areas, with complexity typical of experimental data, are simulated by clipping a correlated Gaussian process defined on an N by N pixel square region. The voids have constant aperture; the distance between the upper and lower surfaces which define the fracture is either zero or a constant. Local flow is assumed to be proportional to local aperture cubed times the local pressure gradient. The flow through a pattern of voids and contact areas is solved using a finite-difference method. After solving for the flow through simulated 10 by 10 by 30 pixel patterns of voids and contact areas, a model to predict equivalent permeability is developed. The first model is for patterns with 80% voids where all voids have the same aperture. The equivalent permeability of a pattern is predicted in terms of spatial statistics computed from the arrangement of voids and contact areas within the pattern. Four spatial statistics are examined. The change point statistic measures how often adjacent pixels alternate from void to contact area (or vice versa) in the rows of the patterns which are parallel to the overall flow direction. 37 refs., 66 figs., 41 tabs.
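The change point statistic is simple to compute on a binary void/contact pattern. Below is a minimal sketch: a correlated Gaussian field is clipped at its 20% quantile to give roughly 80% voids, and the statistic counts how often adjacent pixels along rows alternate between void and contact. The smoothing kernel and grid size are illustrative choices, not those of the thesis:

```python
import numpy as np

def change_point_statistic(pattern):
    """Fraction of adjacent pixel pairs along rows (the flow direction)
    that switch between void (1) and contact (0)."""
    switches = pattern[:, 1:] != pattern[:, :-1]
    return switches.mean()

rng = np.random.default_rng(2)
N = 64
g = rng.normal(size=(N, N))

# Moving-average smoothing in both directions to introduce correlation
kernel = np.ones(5) / 5.0
g = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, g)
g = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, g)

threshold = np.quantile(g, 0.2)        # clip so ~80% of pixels are voids
voids = (g > threshold).astype(int)
cps = change_point_statistic(voids)
```

For an uncorrelated pattern with 80% voids the statistic would be about 2(0.8)(0.2) = 0.32; spatial correlation pushes it lower.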
ARSCL Cloud Statistics - A Value-Added Product
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
ARSCL Cloud Statistics - A Value-Added Product Y. Shi Pacific Northwest National Laboratory Richland, Washington M. A. Miller Brookhaven National Laboratory Upton, New York Introduction The active remote sensing of cloud layers (ARSCL) value-added product (VAP) combines data from active remote sensors to produce an objective determination of cloud location, radar reflectivity, vertical velocity, and Doppler spectral width. Information about the liquid water path (LWP) in these clouds and the
Spectral statistics in noninteracting many-particle systems
Munoz, L.; Relano, A.; Retamosa, J. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Faleiro, E. [Departamento de Fisica Aplicada, E.U.I.T. Industrial, Universidad Politecnica de Madrid, E-28012 Madrid (Spain); Molina, R.A. [Max-Planck-Institut fuer Physik Komplexer Systeme, Noethnitzer Strasse 38, D-01187 Dresden (Germany)
2006-03-15
It is widely accepted that the statistical properties of energy level spectra provide an essential characterization of quantum chaos. Indeed, the spectral fluctuations of many different systems like quantum billiards, atoms, or atomic nuclei have been studied. However, noninteracting many-body systems have received little attention, since it is assumed that they must exhibit Poisson-like fluctuations. Apart from a heuristic argument of Bloch, there are neither systematic numerical calculations nor a rigorous derivation of this fact. Here we present a rigorous study of the spectral fluctuations of noninteracting identical particles moving freely in a mean field, emphasizing the evolution with the number of particles N as well as with the energy. Our results are conclusive. For N ≥ 2 the spectra of these systems exhibit Poisson fluctuations provided that we consider sufficiently high excitation energies. Nevertheless, when the mean field is chaotic there exists a critical energy scale L_c; beyond this scale, the fluctuations deviate from Poisson statistics as a reminiscence of the statistical properties of the mean field.
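The Poisson baseline discussed here is easy to check numerically: a spectrum whose levels are thrown down independently has nearest-neighbor spacings distributed as P(s) = e^(-s), with unit mean and unit variance after rescaling, whereas a perfectly rigid picket-fence spectrum has zero spacing variance. A minimal sketch of that contrast (generic toy spectra, not the paper's mean-field models):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50000

# Poisson-like spectrum: uncorrelated levels placed uniformly at random
levels = np.sort(rng.uniform(0.0, float(n), n))
s = np.diff(levels)
s /= s.mean()                 # rescale to unit mean spacing
var_poisson = s.var()         # exponential P(s) = exp(-s) has variance 1

# Contrast: a rigid picket-fence spectrum has zero spacing variance
picket = np.arange(n, dtype=float)
sp = np.diff(picket)
var_picket = (sp / sp.mean()).var()
```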
Electron transfer statistics and thermal fluctuations in molecular junctions
Goswami, Himangshu Prabal; Harbola, Upendra
2015-02-28
We derive analytical expressions for the probability distribution function (PDF) of electron transport in a simple model of a quantum junction in the presence of thermal fluctuations. Our approach is based on large deviation theory combined with the generating function method. For a large number of transferred electrons, the PDF is found to decay exponentially in the tails, with different rates due to the applied bias. This asymmetry in the PDF is related to the fluctuation theorem. Statistics of fluctuations are analyzed in terms of the Fano factor. Thermal fluctuations play a quantitative role in determining the statistics of electron transfer; they tend to suppress the average current while enhancing the fluctuations in particle transfer. This gives rise to both bunching and antibunching phenomena, as determined by the Fano factor. The thermal fluctuations and shot noise compete with each other and determine the net (effective) statistics of particle transfer. An exact analytical expression is obtained for the delay time distribution. The optimal values of the delay time between successive electron transfers can be lowered below the corresponding shot-noise values by tuning the thermal effects.
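The Fano factor F = var(n)/mean(n) used here to diagnose bunching (F > 1) versus antibunching (F < 1) can be illustrated with two toy counting models; the Poisson and binomial processes below are generic examples, not the junction model of the paper:

```python
import numpy as np

def fano_factor(counts):
    """Fano factor F = var(n) / mean(n) of transferred-particle counts.
    F < 1 indicates antibunching (sub-Poissonian); F > 1 bunching."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

rng = np.random.default_rng(4)

# Poissonian transfer statistics: F = 1
poisson_counts = rng.poisson(lam=50.0, size=100000)

# Sub-Poissonian toy model: binomial partitioning of N attempts, F = 1 - p
binom_counts = rng.binomial(n=100, p=0.5, size=100000)

f_poisson = fano_factor(poisson_counts)
f_binom = fano_factor(binom_counts)
```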
Farjam, R; Pramanik, P; Srinivasan, A; Chapman, C; Tsien, C; Lawrence, T; Cao, Y
2014-06-15
Purpose: Vascular injury could be a cause of hippocampal dysfunction leading to late neurocognitive decline in patients receiving brain radiotherapy (RT). Hence, our aim was to develop a multivariate interaction model for characterization of hippocampal vascular dose-response and early prediction of radiation-induced late neurocognitive impairments. Methods: 27 patients (17 males and 10 females, age 31–80 years) were enrolled in an IRB-approved prospective longitudinal study. All patients were diagnosed with a low-grade glioma or benign tumor and treated by 3-D conformal or intensity-modulated RT with a median dose of 54 Gy (50.4–59.4 Gy in 1.8-Gy fractions). Six DCE-MRI scans were performed from pre-RT to 18 months post-RT. DCE data were fitted to the modified Toft model to obtain the transfer constant of gadolinium influx from the intravascular space into the extravascular extracellular space, Ktrans, and the fraction of blood plasma volume, Vp. The hippocampal vascular property alterations after starting RT were characterized by changes in the hippocampal mean values μh(Ktrans)τ and μh(Vp)τ. The dose-response, Δμh(Ktrans/Vp)pre->τ, was modeled using multivariate linear regression considering interactions of dose with age, sex, hippocampal laterality, and the presence of tumor/edema near a hippocampus. Finally, the early vascular dose-response in the hippocampus was correlated with neurocognitive decline 6 and 18 months post-RT. Results: The μh(Ktrans) increased significantly from pre-RT to 1 month post-RT (p<0.0004). The multivariate model showed that the dose effect on Δμh(Ktrans)pre->1M post-RT interacted with sex (p<0.0007) and age (p<0.00004), with the dose-response more pronounced in older females. Also, the vascular dose-response in the left hippocampus of females was significantly correlated with memory function decline at 6 (r = −0.95, p<0.0006) and 18 (r = −0.88, p<0.02) months post-RT.
Conclusion: The hippocampal vascular response to radiation could be sex and age dependent. The early hippocampal vascular dose-response could predict late neurocognitive dysfunction. (Support: NIH-RO1NS064973)
Edwards, Lloyd A.; Parresol, Bernard
2014-09-01
This report presents the geostatistical analysis results for the fire-fuels response variables custom reaction intensity and total dead fuels; it is part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).
EIA - Advice from Meetings of the ASA Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
Advice from Meetings of the ASA Committee on Energy Statistics Transcripts and Summaries from the American Statistical Association Committee on Energy Statistics The U.S. Energy Information Administration seeks technical advice semi-annually from the American Statistical Association Committee on Energy Statistics. The meetings are held in the spring and fall in Washington, D.C., and are announced in the Federal Register. These meetings are open to the public and are typically held on Thursday
User Statistics Collection Practices Archives | U.S. DOE Office of Science
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
Johannesson, G
2010-03-17
Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. Answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emission of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate?; and (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in simulating future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS).
Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but the topic has been extensively studied and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study a methodology, a statistical framework, for propagating and accounting for GCM uncertainty in regional statistical downscaling assessments. In particular, we will explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while the analysis tools were developed in the statistical programming language R; see Figure 1.
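A minimal statistical-downscaling sketch in the spirit described above: fit an empirical linear relation between a synoptic GCM-scale predictor and an observed local variable, then apply it to an ensemble of future GCM projections so the ensemble spread carries GCM uncertainty to the local scale. All data below are synthetic, invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic training data: coarse GCM-scale predictor vs. observed local
# temperature (deg C); SDS fits this empirical relationship
gcm_hist = rng.normal(15.0, 3.0, 300)
local_obs = 0.8 * gcm_hist + 2.0 + rng.normal(0, 0.5, 300)

# Fit the downscaling relation: local = a * gcm + b
a, b = np.polyfit(gcm_hist, local_obs, 1)

# Apply to an ensemble of future GCM projections; the ensemble spread
# propagates GCM uncertainty through to the local scale
future_ensemble = np.array([17.0, 18.5, 16.2, 19.1, 17.8])
local_future = a * future_ensemble + b
```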
Askari, Sima [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of); Halladj, Rouein, E-mail: halladj@aut.ac.ir [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of); Nazari, Mahdi [Faculty of Chemical Engineering, Amirkabir University of Technology (Tehran Polytechnic), P.O. Box 15875-4413, Hafez Ave., Tehran (Iran, Islamic Republic of)
2013-05-15
Highlights: • Sonochemical synthesis of SAPO-34 nanocrystals. • Use of the Taguchi experimental design (L9) for optimizing the experimental procedure. • Significant effects of all the ultrasonic parameters on the response. - Abstract: SAPO-34 nanocrystals with high crystallinity were synthesized by means of a sonochemical method. An L9 orthogonal array of the Taguchi method was implemented to investigate the effects of sonication conditions on the preparation of SAPO-34 with respect to the crystallinity of the final product phase. The experimental data show that phase crystallinity improves with increasing ultrasonic power and sonication temperature. In the case of ultrasonic irradiation time, however, an initial increase in crystallinity from 5 min to 15 min is followed by a decrease in crystallinity for longer sonication times.
Broader source: Energy.gov [DOE]
The Voluntary Protection Program (VPP) was originally developed by Occupational Safety and Health Administration (OSHA) in 1982 to foster greater ownership of safety and health in the workplace. The Department of Energy (DOE) adopted VPP in 1992; currently 23 sites across the DOE complex participate in the program. As its name implies, it is a voluntary program; i.e. not required by laws or regulations.
Davis, Adam Christopher
2015-08-25
The Worker Safety and Security Team (WSST) at Los Alamos National Laboratory holds an annual festival, WSST-fest, to engage workers and inform them about safety- and security-related matters. As part of the 2015 WSST-fest, workers were given the opportunity to participate in a survey assessing their engagement in their organizations and work environments. A total of 789 workers participated in the 23-question survey, in which they were also invited, optionally, to identify themselves and their organizations and to give open-ended feedback. The survey consisted of 23 positive statements (e.g., "My organization is a good place to work.") with which the respondent could express a level of agreement. The text of these statements is provided in Table 1. The level of agreement corresponds to a 5-level Likert scale ranging from "Strongly Disagree" to "Strongly Agree." In addition to assessing the overall positivity or negativity of the scores, the results were partitioned into several cohorts based on the response meta-data (self-identification, comments, etc.) to explore trends. Survey respondents were presented with the options to identify themselves and their organizations and to provide comments. These options suggested the following questions about the data set.
EMERGING MODALITIES FOR SOIL CARBON ANALYSIS: SAMPLING STATISTICS AND ECONOMICS WORKSHOP.
WIELOPOLSKI, L.
2006-04-01
The workshop's main objectives are (1) to present the emerging modalities for analyzing carbon in soil, (2) to assess their error propagation, (3) to recommend new protocols and sampling strategies for the new instrumentation, and (4) to compare the costs of the new methods with traditional chemical ones.
Statistical analysis of radiochemical measurements of TRU radionuclides in REDC waste
Beauchamp, J.; Downing, D.; Chapman, J.; Fedorov, V.; Nguyen, L.; Parks, C.; Schultz, F.; Yong, L.
1996-10-01
This report summarizes results of a study on the isotopic ratios of transuranium elements in waste from the Radiochemical Engineering Development Center actinide-processing streams. Knowledge of the isotopic ratios, when combined with results of nondestructive assays, in particular with results of Active-Passive Neutron Examination Assay and Gamma Active Segmented Passive Assay, may lead to a significant increase in the precision of the determination of TRU elements contained in ORNL-generated waste streams.
Statistical Analysis of the Factors Influencing Consumer Use of E85
Bromiley, P.; Gerlach, T.; Marczak, K.; Taylor, M.; Dobrovolny, L.
2008-07-01
Evaluating the sales patterns of E85 retail outlets can provide important information about consumer behavior regarding E85, locating future E85 fueling infrastructure, and developing future alternative fuel policies and programs.
Analysis of sampling plan options for tank 16H from the perspective of statistical uncertainty
Shine, E. P.
2013-02-28
This report develops a concentration variability model for Tank 16H in order to compare candidate sampling plans for assessing the concentrations of analytes in the residual material in the annulus and on the floor of the primary vessel. A concentration variability model is used to compare candidate sampling plans based on the expected upper 95% confidence limit (UCL95) for the mean. The result is expressed as a rank order of candidate sampling plans from lowest to highest expected UCL95, with the lowest being the most desirable from an uncertainty perspective.
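The UCL95 used to rank the candidate sampling plans is, for approximately normal data, the one-sided t-based upper confidence limit on the mean. A minimal sketch (the sample values and the table of critical values are illustrative; a statistics library would normally supply the t quantiles):

```python
import math
import statistics

def ucl95_t(sample):
    """One-sided upper 95% confidence limit on the mean (t-based),
    assuming approximately normal data."""
    n = len(sample)
    mean = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(n)
    # One-sided t critical values (alpha = 0.05, df = n - 1) for a few small n
    t95 = {4: 2.353, 5: 2.132, 6: 2.015, 8: 1.895, 10: 1.833}[n]
    return mean + t95 * se

# Hypothetical analyte concentrations from four residual-material samples
ucl = ucl95_t([10.2, 9.8, 10.5, 10.1])
```

Plans with fewer or more variable samples yield a larger expected UCL95, which is the basis for the ranking the abstract describes.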
Statistical Characterization of Medium-Duty Electric Vehicle Drive Cycles: Preprint
Prohaska, R.; Duran, A.; Ragatz, A.; Kelly, K.
2015-05-01
In an effort to help commercialize technologies for electric vehicles (EVs) through deployment and demonstration projects, the U.S. Department of Energy’s (DOE's) American Recovery and Reinvestment Act (ARRA) provided funding to participating U.S. companies to cover part of the cost of purchasing new EVs. Within the medium- and heavy-duty commercial vehicle segment, both Smith Electric Newton and and Navistar eStar vehicles qualified for such funding opportunities. In an effort to evaluate the performance characteristics of the new technologies deployed in these vehicles operating under real world conditions, data from Smith Electric and Navistar medium-duty EVs were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team over a period of 3 years. More than 430 Smith Newton EVs have provided data representing more than 150,000 days of operation. Similarly, data have been collected from more than 100 Navistar eStar EVs, resulting in a comparative total of more than 16,000 operating days. Combined, NREL has analyzed more than 6 million kilometers of driving and 4 million hours of charging data collected from commercially operating medium-duty electric vehicles in various configurations. In this paper, extensive duty-cycle statistical analyses are performed to examine and characterize common vehicle dynamics trends and relationships based on in-use field data. The results of these analyses statistically define the vehicle dynamic and kinematic requirements for each vehicle, aiding in the selection of representative chassis dynamometer test cycles and the development of custom drive cycles that emulate daily operation. In this paper, the methodology and accompanying results of the duty-cycle statistical analysis are presented and discussed. 
Results are presented in both graphical and tabular formats, illustrating a number of key relationships between parameters observed within the data set that relate to medium-duty EVs.
Quantum Statistical Testing of a Quantum Random Number Generator
Humble, Travis S
2014-01-01
The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based systems engineering, we present methods for verifying the operation of a prototypical quantum random number generator (QRNG). We begin with the algorithmic design of the QRNG, followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.
Poincaré recurrence statistics as an indicator of chaos synchronization
Boev, Yaroslav I.; Vadivasova, Tatiana E.; Anishchenko, Vadim S.
2014-06-15
The dynamics of the autonomous and non-autonomous Rössler system is studied using Poincaré recurrence time statistics. It is shown that the probability distribution density of Poincaré recurrences is a set of equidistant peaks whose spacing equals the oscillation period and whose envelope obeys an exponential distribution. The dimension of the spatially uniform Rössler attractor is estimated using Poincaré recurrence times. The mean Poincaré recurrence time in the non-autonomous Rössler system is locked by the external frequency, which enables us to detect the effect of phase-frequency synchronization.
Statistical thermodynamics of strain hardening in polycrystalline solids
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Langer, James S.
2015-01-01
This paper starts with a systematic rederivation of the statistical thermodynamic equations of motion for dislocation-mediated plasticity proposed in 2010 by Langer, Bouchbinder, and Lookman. The paper then uses that theory to explain the anomalous rate-hardening behavior reported in 1988 by Follansbee and Kocks and to explore the relation between hardening rate and grain size reported in 1995 by Meyers et al. A central theme is the need for physics-based, nonequilibrium analyses in developing predictive theories of the strength of polycrystalline materials.
Financial statistics of major US publicly owned electric utilities 1993
Not Available
1995-02-01
The 1993 edition of the Financial Statistics of Major U.S. Publicly Owned Electric Utilities publication presents five years (1989 to 1993) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. The primary source of publicly owned financial data is the Form EIA-412, the Annual Report of Public Electric Utilities, filed on a fiscal basis.
The Fall Meeting of the Committee on Energy Statistics
U.S. Energy Information Administration (EIA) Indexed Site
FRIDAY, NOVEMBER 5, 1999. The Fall Meeting of the Committee on Energy Statistics commenced at 8:30 a.m. at the Department of Energy, 1000 Independence Avenue, S.W., Room 8E089, Washington, D.C., Daniel Relles presiding. PRESENT: Daniel Relles (Chairman), Jay Breidt, Lynda Carlson, Thomas Cowing, Carol Gotway Crawford, Jay Hakes, James Hammitt, Philip Hanser, Calvin Kent, W. David Montgomery, Larry Pettis, Seymour Sudman, Bill Weinig, Roy Whitmore.
Statistical anisotropy of the curvature perturbation from vector field perturbations
Dimopoulos, Konstantinos; Karciauskas, Mindaugas; Lyth, David H.; Rodriguez, Yeinzon E-mail: m.karciauskas@lancaster.ac.uk E-mail: yeinzon.rodriguez@uan.edu.co
2009-05-15
The δN formula for the primordial curvature perturbation ζ is extended to include vector as well as scalar fields. Formulas for the tree-level contributions to the spectrum and bispectrum of ζ are given, exhibiting statistical anisotropy. The one-loop contribution to the spectrum of ζ is also worked out. We then consider the generation of vector field perturbations from the vacuum, including the longitudinal component that will be present if there is no gauge invariance. Finally, the δN formula is applied to the vector curvaton and vector inflation models, with the tensor perturbation also evaluated in the latter case.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
U.S. Department of Energy, Washington, DC 20585, 2013. U.S. Energy Information Administration | Natural Gas Monthly. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the U.S. Department of Energy or other federal agencies.
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
U.S. Department of Energy, Washington, DC 20585, 2014. U.S. Energy Information Administration | Natural Gas Monthly. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should not be construed as representing those of the U.S. Department of Energy or other federal agencies.
Advances on statistical/thermodynamical models for unpolarized structure functions
Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro
2013-03-25
During the eighties and nineties, many statistical/thermodynamical models were proposed to describe the nucleon structure functions and the distribution of quarks in hadrons. Most of these models describe the quarks and gluons inside the nucleon as a Fermi gas and a Bose gas, respectively, confined in an MIT bag with continuous energy levels; other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, such as the sea asymmetries d̄/ū and d̄ − ū.
Statistics at work in heavy-ion reactions
Moretto, L.G.
1982-07-01
In the first part special aspects of the compound nucleus decay are considered. The evaporation of particles intermediate between nucleons and fission fragments is explored both theoretically and experimentally. The limitations of the fission decay width expression obtained with the transition state method are discussed, and a more general approach is proposed. In the second part the process of angular momentum transfer in deep inelastic reactions is considered. The limit of statistical equilibrium is studied and specifically applied to the estimation of the degree of alignment of the fragment spins. The magnitude and alignment of the transferred angular momentum is experimentally determined from sequentially emitted alpha, gamma, and fission fragments.
Statistical Inference for Big Data Problems in Molecular Biophysics
Ramanathan, Arvind; Savol, Andrej; Burger, Virginia; Quinn, Shannon; Agarwal, Pratul K; Chennubhotla, Chakra
2012-01-01
We highlight the role of statistical inference techniques in providing biological insights from the analysis of long time-scale molecular simulation data. Technological and algorithmic improvements in computation have brought molecular simulations to the forefront of techniques applied to investigating the basis of living systems. While these longer, increasingly complex simulations, presently reaching petabyte scales, promise a detailed view into microscopic behavior, teasing out the important information has become a true challenge in its own right. Mining these data for important patterns is critical to automating the discovery of therapeutic interventions, improving protein design, and fundamentally understanding the mechanistic basis of cellular homeostasis.
Younger Dryas Boundary (YDB) impact: physical and statistical impossibility (Conference)
Office of Scientific and Technical Information (OSTI)
The YDB impact hypothesis of Firestone et al. (2007) is so extremely improbable that it can be considered statistically impossible, in addition to being physically impossible. Comets make up only about 1% of the population of Earth-crossing objects. Broken comets are a vanishingly small fraction, and only exist as Earth-sized clusters for a very short period of time. Only a small fraction of impacts occur at angles as shallow as proposed by the YDB hypothesis.
Statistical Methods Handbook for Advanced Gas Reactor Fuel Materials
J. J. Einerson
2005-05-01
Fuel materials such as kernels, coated particles, and compacts are being manufactured for experiments simulating service in the next generation of high temperature gas reactors. These must meet predefined acceptance specifications. Many tests are performed for quality assurance, and many of these correspond to criteria that must be met with specified confidence, based on random samples. This report describes the statistical methods to be used. The properties of the tests are discussed, including the risk of false acceptance, the risk of false rejection, and the assumption of normality. Methods for calculating sample sizes are also described.
Hybrid Statistical Testing for Nuclear Material Accounting Data and/or Process Monitoring Data
Ticknor, Lawrence O.; Hamada, Michael Scott; Sprinkle, James K.; Burr, Thomas Lee
2015-04-14
The two tests employed in the hybrid testing scheme are Page’s cumulative sums for all streams within a Balance Period (maximum of the maximums and average of the maximums) and Crosier’s multivariate cumulative sum applied to incremental cumulative sums across Balance Periods. The role of residuals for both kinds of data is discussed.
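Page's cumulative sum, the per-stream test named above, can be sketched in a few lines. The reference value k and decision threshold h below are illustrative defaults, not the paper's tuned values, and the residual series is synthetic.

```python
# One-sided Page's CUSUM on a stream of material-balance residuals:
# S_t = max(0, S_{t-1} + x_t - k); alarm when S_t exceeds h.
def page_cusum(residuals, k=0.5, h=4.0):
    """Return (list of CUSUM statistics, index of first alarm or None)."""
    s, stats, alarm = 0.0, [], None
    for t, x in enumerate(residuals):
        s = max(0.0, s + x - k)
        stats.append(s)
        if alarm is None and s > h:
            alarm = t
    return stats, alarm

# In-control noise stays near zero; a sustained shift accumulates and alarms.
data = [0.1, -0.2, 0.0, 0.3] + [1.5] * 6
_, alarm = page_cusum(data)
print("alarm at index:", alarm)
```

The max-with-zero reset is what distinguishes CUSUM from a plain running sum: it forgets in-control history, so a later sustained shift is detected quickly.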
Statistical Review of Data from DWPF's Process Samples for Batches 19 Through 30
Edwards, T.B.
1999-04-06
The measurements derived from samples taken during the processing of batches 19 through 30 at the Defense Waste Processing Facility (DWPF) afford an opportunity for review and comparison. This report looks at some of the statistics from these data. Only the data reported by the DWPF lab (that is, the data provided by the lab as representative of the samples taken) are available for this analysis. In some cases, the sample results reported may be a subset of the sample results generated by the analytical procedures. A thorough assessment of the DWPF lab's analytical procedures would require the complete set of data. Thus, the statistics reported here, specifically as they relate to analytical uncertainties, are limited to the reported data for these samples. A feel for the consistency of the incoming slurry is given by the estimation of the components of variation for the Sludge Receipt and Adjustment Tank (SRAT) receipts. In general, for all of the vessels, the data from batches after 21 show smaller batch-to-batch variation than the data from all the batches. The relative contributions of batch-to-batch versus residual variation, which includes analytical uncertainty, are presented in these analyses.
NREL: Energy Analysis - Ella Zhou
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Ella Zhou Photo of Ella Zhou Ella Zhou is a member of the Market and Policy Impact Group in the Strategic Energy Analysis Center. International Energy Analyst On staff since December 2014 Phone number: 303-275-3293 E-mail: Ella.Zhou@nrel.gov Areas of expertise Power system modeling Integrating water and cooling system, energy storage, and demand response into grid simulation and optimization models International energy policy and market analysis Data analysis and statistical modeling Primary
NREL: Energy Analysis - Galen Maclaurin
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Galen Maclaurin Photo of Galen Maclaurin Galen Maclaurin is a member of the Data Analysis and Visualization Group in the Strategic Energy Analysis Center. Scientist III On staff since June 2015 Phone number: 303-275-3846 E-mail: Galen.Maclaurin@nrel.gov Areas of expertise Spatial and temporal modeling Remote sensing image analysis Spatial statistics & machine learning Scientific programming Primary research interests Modeling energy use at multiple spatial scales Information extraction from
Image segmentation by hierarchical agglomeration of polygons using ecological statistics
Prasad, Lakshman; Swaminarayan, Sriram
2013-04-23
A method for rapid hierarchical image segmentation based on perceptually driven contour completion and scene statistics is disclosed. The method begins with an initial fine-scale segmentation of an image, such as obtained by perceptual completion of partial contours into polygonal regions using region-contour correspondences established by Delaunay triangulation of edge pixels as implemented in VISTA. The resulting polygons are analyzed with respect to their size and color/intensity distributions and the structural properties of their boundaries. Statistical estimates of granularity of size, similarity of color, texture, and saliency of intervening boundaries are computed and formulated into logical (Boolean) predicates. The combined satisfiability of these Boolean predicates by a pair of adjacent polygons at a given segmentation level qualifies them for merging into a larger polygon representing a coarser, larger-scale feature of the pixel image and collectively obtains the next level of polygonal segments in a hierarchy of fine-to-coarse segmentations. The iterative application of this process precipitates textured regions as polygons with highly convolved boundaries and helps distinguish them from objects which typically have more regular boundaries. The method yields a multiscale decomposition of an image into constituent features that enjoy a hierarchical relationship with features at finer and coarser scales. This provides a traversable graph structure from which feature content and context in terms of other features can be derived, aiding in automated image understanding tasks. The method disclosed is highly efficient and can be used to decompose and analyze large images.
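The merge decision described above can be caricatured as a conjunction of Boolean predicates over adjacent regions. The thresholds and region attributes below (area, mean intensity, boundary strength) are hypothetical stand-ins for the statistical estimates of granularity, color similarity, and boundary saliency computed in the method.

```python
# Merge two adjacent polygons only if the combined Boolean predicates hold:
# the pair is fine-grained OR similar in intensity, AND the shared boundary
# is weak (non-salient). Thresholds are illustrative, not the patent's values.
def should_merge(a, b, boundary_strength,
                 size_thresh=50, color_thresh=20.0, saliency_thresh=0.6):
    small = min(a["area"], b["area"]) < size_thresh                    # granularity predicate
    similar = abs(a["mean_intensity"] - b["mean_intensity"]) < color_thresh  # color predicate
    weak_edge = boundary_strength < saliency_thresh                    # saliency predicate
    return (small or similar) and weak_edge

r1 = {"area": 30, "mean_intensity": 100.0}
r2 = {"area": 400, "mean_intensity": 140.0}
print(should_merge(r1, r2, boundary_strength=0.3))
```

Iterating such merges over all adjacent pairs, then recomputing statistics on the merged polygons, yields one level of the fine-to-coarse hierarchy.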
Statistics of anisotropies in inflation with spectator vector fields
Thorsrud, Mikjel; Mota, David F.; Urban, Federico R. E-mail: furban@ulb.ac.be
2014-04-01
We study the statistics of the primordial power spectrum in models where massless gauge vectors are coupled to the inflaton, paying special attention to observational implications of having fundamental or effective horizons embedded in a bath of infrared fluctuations. As quantum infrared modes cross the horizon, they classicalize and build a background vector field. We find that the vector experiences a statistical precession phenomenon. Implications for primordial correlators and the interpretation thereof are considered. Firstly, we show how in general two additional observables, not only one, a quadrupole amplitude and an intrinsic shape parameter, are necessary to fully describe the correction to the curvature power spectrum, and develop a unique parametrization for them. Secondly, we show that the observed anisotropic amplitude and the associated preferred direction depend on the volume of the patch being probed. We calculate non-zero priors for the expected deviations between detections based on microwave background data (which probes the entire Hubble patch) and large-scale structure (which only probes a fraction of it).
View discovery in OLAP databases through statistical combinatorial optimization
Hengartner, Nick W; Burke, John; Critchlow, Terence; Joslyn, Cliff; Hogan, Emilie
2009-01-01
OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical information-theoretic measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
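A toy illustration of ranking candidate two-dimensional views by an information-theoretic measure: score every column pair of a small relational table by the mutual information between its columns. The table and column contents are invented, and the paper's actual measures and hop-chaining search are considerably richer.

```python
# Score all 2-D projections (column pairs) of a tiny table by mutual
# information; the pair with the most shared structure ranks first.
from collections import Counter
from itertools import combinations
from math import log2

def mutual_information(rows, i, j):
    n = len(rows)
    pij = Counter((r[i], r[j]) for r in rows)
    pi = Counter(r[i] for r in rows)
    pj = Counter(r[j] for r in rows)
    return sum(c / n * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

table = [("east", "gamma", "high"), ("east", "gamma", "high"),
         ("west", "neutron", "low"), ("west", "neutron", "high")]
views = {(i, j): mutual_information(table, i, j) for i, j in combinations(range(3), 2)}
best = max(views, key=views.get)
print("most structured 2-D view:", best)
```

Columns 0 and 1 are perfectly correlated here, so view (0, 1) carries one full bit of mutual information and ranks above the other projections.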
Office of Oil, Gas, and Coal Supply Statistics
U.S. Energy Information Administration (EIA) Indexed Site
published in the NGM, is developed by the National Weather Service Climate Analysis Center, Camp Springs, Maryland. The data are available weekly with monthly summaries...
Analysis of Hybrid Hydrogen Systems: Final Report
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Heat Rate Improvement Potential at Coal-Fired Power Plants, May 2015. Independent Statistics & Analysis, www.eia.gov, U.S. Department of Energy, Washington, DC 20585. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government.
ORISE: Workforce Analysis and Program Evaluation
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Workforce Analysis and Program Evaluation Workforce Analysis and Program Evaluation By accurately capturing statistics and conducting data analysis, the Oak Ridge Institute for Science and Education (ORISE) assesses the needs of the nation's science, technology, engineering and mathematics workforce. Our comprehensive workforce trends assessment and recruitment strategies enable us to match highly qualified participants with laboratories and agencies needing research assistance. Beyond research
Statistical modeling support for calibration of a multiphysics model of subcooled boiling flows
Bui, A. V.; Dinh, N. T.; Nourgaliev, R. R.; Williams, B. J.
2013-07-01
Nuclear reactor system analyses rely on multiple complex models which describe the physics of reactor neutronics, thermal hydraulics, structural mechanics, coolant physico-chemistry, etc. Such coupled multiphysics models require extensive calibration and validation before they can be used in practical system safety study and/or design/technology optimization. This paper presents an application of statistical modeling and Bayesian inference in calibrating an example multiphysics model of subcooled boiling flows which is widely used in reactor thermal hydraulic analysis. The presence of complex coupling of physics in such a model together with the large number of model inputs, parameters and multidimensional outputs poses significant challenge to the model calibration method. However, the method proposed in this work is shown to be able to overcome these difficulties while allowing data (observation) uncertainty and model inadequacy to be taken into consideration. (authors)
Hegerfeldt, G.C.; Henneberg, R. (Institut fuer Theoretische Physik, University of Goettingen, D-37073 Goettingen (Germany))
1994-05-01
The statistical analysis of energy levels, a powerful tool in the study of quantum systems, is applicable to discrete spectra. Here we propose an approach to carry level statistics over to continuous energy spectra, paradoxical as this may sound at first. The approach proceeds in three steps: first a discretization of the spectrum by cutoffs, then a statistical analysis of the resulting discrete spectra, and finally a determination of the limit distributions as the cutoffs are removed. In this way the notions of Wigner and Poisson distributions for nearest-neighbor spacing (NNS), usually associated with quantum chaos and regularity, can be carried over to systems with a purely continuous energy spectrum. The approach is demonstrated for the hydrogen atom in perpendicular electric and magnetic fields. This system has a purely continuous energy spectrum from −∞ to ∞. Depending on the field parameters, we find for the NNS a Poisson or a Wigner distribution, or a transitional behavior. We also outline how to determine physically relevant resonances in our approach by a stabilization method.
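The NNS step of such an analysis can be sketched on synthetic levels (not the hydrogen-atom spectrum): sort the levels, form nearest-neighbor spacings, and normalize them to unit mean before comparing the histogram with the Poisson form P(s) = exp(−s) or the Wigner surmise P(s) = (π/2) s exp(−πs²/4).

```python
# Compute unit-mean nearest-neighbor spacings for a finite set of levels.
# Uniformly random levels mimic an uncorrelated (Poisson-like) spectrum.
import random

def unit_mean_spacings(levels):
    levels = sorted(levels)
    gaps = [b - a for a, b in zip(levels, levels[1:])]
    mean = sum(gaps) / len(gaps)
    return [g / mean for g in gaps]

random.seed(0)
levels = [random.uniform(0.0, 1.0) for _ in range(5000)]
s = unit_mean_spacings(levels)
print("mean spacing:", round(sum(s) / len(s), 3))  # 1.0 by construction
```

For a real spectrum the levels would first be unfolded (rescaled to unit local density); here the uniform density makes the global rescaling sufficient.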
Development and testing of improved statistical wind power forecasting methods.
Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.
2011-12-06
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. 
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden, large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
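As a point of contrast with the QR and KDF methods studied in the report, the basic idea of estimating uncertainty from observed forecasting errors can be sketched with a plain empirical-quantile baseline. The error values, point forecast, and 90% band below are synthetic illustrations, not results from the report.

```python
# Build a 90% uncertainty band around a point forecast from the empirical
# 5% and 95% quantiles of historical forecasting errors (observed - forecast).
def empirical_quantile(xs, q):
    xs = sorted(xs)
    idx = min(int(q * len(xs)), len(xs) - 1)
    return xs[idx]

errors = [-3.0, -1.5, -0.5, 0.0, 0.2, 0.8, 1.1, 2.0, 2.5, 4.0]  # MW
point_forecast = 50.0  # MW
lo = point_forecast + empirical_quantile(errors, 0.05)
hi = point_forecast + empirical_quantile(errors, 0.95)
print(f"90% band: [{lo}, {hi}] MW")
```

Quantile regression improves on this by letting the quantiles depend on covariates (forecast level, wind speed, lead time) rather than pooling all errors.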
Structure Learning and Statistical Estimation in Distribution Networks - Part II
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), learning the operational radial structure is coupled with the problems of estimating nodal consumption statistics and inferring the line parameters of the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.
Statistical properties of Charney-Hasegawa-Mima zonal flows
Anderson, Johan; Botha, G. J. J.
2015-05-15
A theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent plasma transport events in unforced zonal flows is provided within the Charney-Hasegawa-Mima (CHM) model. The governing equation is solved numerically with various prescribed density gradients that are designed to produce different configurations of parallel and anti-parallel streams. Long-lasting vortices form whose flow is governed by the zonal streams. It is found that the numerically generated PDFs can be matched with analytical predictions of PDFs based on the instanton method by removing the autocorrelations from the time series. In many instances, the statistics generated by the CHM dynamics relaxes to Gaussian distributions for both the electrostatic and vorticity perturbations, whereas in areas with strong nonlinear interactions it is found that the PDFs are exponentially distributed.
Statistical Stability and Time-Reversal Imaging in Random Media
Berryman, J; Borcea, L; Papanicolaou, G; Tsogka, C
2002-02-05
Localization of targets imbedded in a heterogeneous background medium is a common problem in seismic, ultrasonic, and electromagnetic imaging problems. The best imaging techniques make direct use of the eigenfunctions and eigenvalues of the array response matrix, as recent work on time-reversal acoustics has shown. Of the various imaging functionals studied, one that is representative of a preferred class is a time-domain generalization of MUSIC (MUltiple Signal Classification), which is a well-known linear subspace method normally applied only in the frequency domain. Since statistical stability is not characteristic of the frequency domain, a transform back to the time domain after first diagonalizing the array data in the frequency domain takes optimum advantage of both the time-domain stability and the frequency-domain orthogonality of the relevant eigenfunctions.
Statistical tools for prognostics and health management of complex systems
Collins, David H; Huzurbazar, Aparna V; Anderson - Cook, Christine M
2010-01-01
Prognostics and Health Management (PHM) is increasingly important for understanding and managing today's complex systems. These systems are typically mission- or safety-critical, expensive to replace, and operate in environments where reliability and cost-effectiveness are a priority. We present background on PHM and a suite of applicable statistical tools and methods. Our primary focus is on predicting future states of the system (e.g., the probability of being operational at a future time, or the expected remaining system life) using heterogeneous data from a variety of sources. We discuss component reliability models incorporating physical understanding, condition measurements from sensors, and environmental covariates; system reliability models that allow prediction of system failure time distributions from component failure models; and the use of Bayesian techniques to incorporate expert judgments into component and system models.
Statistical measures of Planck scale signal correlations in interferometers
Hogan, Craig J.; Kwon, Ohkyung
2015-06-22
A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. As a result, simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.
Predicting weak lensing statistics from halo mass reconstructions - Final Paper
Everett, Spencer
2015-08-20
As dark matter does not absorb or emit light, its distribution in the universe must be inferred through indirect effects such as the gravitational lensing of distant galaxies. While most sources are only weakly lensed, the systematic alignment of background galaxies around a foreground lens can constrain the mass of the lens, which is largely in the form of dark matter. In this paper, I have implemented a framework to reconstruct all of the mass along lines of sight using a best-case dark matter halo model in which the halo mass is known. This framework is then used to make predictions of the weak lensing of 3,240 generated source galaxies through a 324 arcmin field of the Millennium Simulation. The lensed source ellipticities are characterized by the ellipticity-ellipticity and galaxy-mass correlation functions and compared to the same statistics for the intrinsic and ray-traced ellipticities. In the ellipticity-ellipticity correlation function, I find that the framework systematically underpredicts the shear power by an average factor of 2.2 and fails to capture correlation from dark matter structure at scales larger than 1 arcminute. The model-predicted galaxy-mass correlation function is in agreement with the ray-traced statistic from scales 0.2 to 0.7 arcminutes, but systematically underpredicts shear power at scales larger than 0.7 arcminutes by an average factor of 1.2. Optimization of the framework code has reduced the mean CPU time per lensing prediction by 70% to 24 5 ms. Physical and computational shortcomings of the framework are discussed, as well as potential improvements for upcoming work.
Table B1. Summary statistics for natural gas in the United States...
U.S. Energy Information Administration (EIA) Indexed Site
Super-Poissonian Statistics of Photon Emission from Single CdSe...
Office of Scientific and Technical Information (OSTI)
Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations
Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.
2010-12-22
In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining internal structure of a 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of the bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and get insight into the nature of bonding. (2) Projecting to a lower dimensional subspace (~4-5) using PCA or Kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to the existing descriptors of electronic structure of molecules.
Also, these results can be immediately used to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to indicate that one needs 4 or 5 components to account for most of the variance in the data, hence this 5D dataset does not necessarily lie on a well-defined, low dimensional manifold. In terms of specific clustering techniques, K-means was generally useful in exploring the dataset. The partition around medoids (PAM) technique produced the most definitive results for our data, showing distinctive patterns for both a sample of the complete data and time-series. The gap statistic with the Tibshirani criterion did not provide any distinction across the two datasets. The gap statistic with the DandF criteria, model-based clustering, and hierarchical modeling simply failed to run on our datasets. Thankfully, the vanilla PCA technique was successful in handling our entire dataset. PCA revealed some interesting patterns for the scalar value distribution. Kernel PCA techniques (vanilladot, RBF, polynomial) and MDS failed to run on the entire dataset, or even a significant fraction of the dataset, and we resorted to creating an explicit feature map followed by conventional PCA. Clustering using K-means and PAM in the new basis set seems to produce promising results. Understanding the new basis set in the scientific context of the problem is challenging, and we are currently working to further examine and interpret the results.
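The PCA step described above, projecting trajectory data onto the few components that carry most of the variance, can be sketched with a plain SVD. This is a generic implementation, not the authors' analysis code:

```python
import numpy as np

def pca_explained(X, k=None):
    """PCA via SVD of mean-centered data.

    Returns the component scores and the fraction of total variance
    carried by each principal component (a common way to judge how
    many components, e.g. 4-5, are needed)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_ratio = s**2 / np.sum(s**2)      # explained-variance fractions
    scores = Xc @ Vt.T                   # projections onto components
    if k is not None:
        return scores[:, :k], var_ratio
    return scores, var_ratio
```

On a dataset with one dominant direction, the first entry of `var_ratio` should be close to 1; clustering (K-means, PAM) would then run on the leading columns of `scores`.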
Automated eXpert Spectral Image Analysis
Energy Science and Technology Software Center (OSTI)
2003-11-25
AXSIA performs automated factor analysis of hyperspectral images. In such images, a complete spectrum is collected at each point in a 1-, 2- or 3-dimensional spatial array. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful information. Multivariate factor analysis techniques have proven effective for extracting the essential information from high dimensional data sets into a limited number of factors that describe the spectral characteristics and spatial distributions of the pure components comprising the sample. AXSIA provides tools to estimate different types of factor models including Singular Value Decomposition (SVD), Principal Component Analysis (PCA), PCA with factor rotation, and Alternating Least Squares-based Multivariate Curve Resolution (MCR-ALS). As part of the analysis process, AXSIA can automatically estimate the number of pure components that comprise the data and can scale the data to account for Poisson noise. The data analysis methods are fundamentally based on eigenanalysis of the data crossproduct matrix coupled with orthogonal eigenvector rotation and constrained alternating least squares refinement. A novel method for automatically determining the number of significant components, which is based on the eigenvalues of the crossproduct matrix, has also been devised and implemented. The data can be compressed spectrally via PCA and spatially through wavelet transforms, and algorithms have been developed that perform factor analysis in the transform domain while retaining full spatial and spectral resolution in the final result. These latter innovations enable the analysis of larger-than-core-memory spectrum-images. AXSIA was designed to perform automated chemical phase analysis of spectrum-images acquired by a variety of chemical imaging techniques.
Successful applications include Energy Dispersive X-ray Spectroscopy, X-ray Fluorescence Spectroscopy, Laser-Induced Fluorescence Spectroscopy and Time-of-Flight Secondary Ion Mass Spectroscopy.
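The eigenvalue-based component counting described above can be sketched as follows. The sqrt-of-mean Poisson weighting used here is one common choice and not necessarily AXSIA's exact scheme, and the significance threshold is illustrative:

```python
import numpy as np

def estimate_rank(D, tol=1e-8):
    """Estimate the number of significant components of a spectrum-image
    matrix D (pixels x channels) from the eigenvalues of its crossproduct
    matrix, after a simple Poisson weighting (scale each element by the
    square root of its mean row and column counts)."""
    g = D.mean(axis=1, keepdims=True)          # mean count per pixel
    h = D.mean(axis=0, keepdims=True)          # mean count per channel
    W = D / np.sqrt(np.clip(g, 1e-12, None) * np.clip(h, 1e-12, None))
    # Eigenvalues of W^T W are the squared singular values of W.
    evals = np.linalg.svd(W, compute_uv=False) ** 2
    # Count eigenvalues standing above a floor relative to the largest.
    return int(np.sum(evals > tol * evals[0]))
```

In practice the threshold would be tied to the expected noise eigenvalue distribution rather than a fixed tolerance; the sketch only shows the eigenanalysis-of-crossproduct idea.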
User Statistics Collection Practices | U.S. DOE Office of Science (SC)
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
2012_0112_BerylliumStatistics_Attachment7.pdf
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
2012-2014 Offshore Wind Market and Economic Analysis Reports. These reports, authored by the Navigant Consortium, provide a comprehensive annual assessment of the U.S. offshore wind market from 2012 to 2014 and give stakeholders a reliable and consistent data source addressing entry barriers and U.S. competitiveness in the offshore wind market. The 2012 edition contains significant policy and economic
Fleming, P. A.; van Wingerden, J. W.; Wright, A. D.
2011-12-01
This paper presents the structure of an ongoing controller comparison experiment at NREL's National Wind Technology Center (NWTC), the design process for the two controllers compared in this phase of the experiment, and initial comparison results obtained in field-testing. The intention of the study is to demonstrate the advantage of using modern multivariable methods for designing control systems for wind turbines versus conventional approaches. We will demonstrate the advantages through field-test results from experimental turbines located at the NWTC. At least two controllers are being developed side-by-side to meet an incrementally increasing number of turbine load-reduction objectives. The first, a multiple single-input, single-output (m-SISO) approach, uses separately developed, decoupled, and classically tuned controllers, which is, to the best of our knowledge, common practice in the wind industry. The remaining controllers are developed using state-space multiple-input, multiple-output (MIMO) techniques to explicitly account for coupling between loops and to optimize given known frequency structures of the turbine and disturbance.
Ensemble Data Analysis ENvironment (EDEN)
Energy Science and Technology Software Center (OSTI)
2012-08-01
The EDEN toolkit facilitates exploratory data analysis and visualization of global climate model simulation datasets. EDEN provides an interactive graphical user interface (GUI) that helps the user visually construct dynamic queries of the characteristically large climate datasets using temporal ranges, variable selections, and geographic areas of interest. EDEN reads the selected data into a multivariate visualization panel which features an extended implementation of parallel coordinates plots as well as interactive scatterplots. The user can query data in the visualization panel using mouse gestures to analyze different ranges of data. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots.
Signatures of initial state modifications on bispectrum statistics
Meerburg, P Daniel; Schaar, Jan Pieter van der; Corasaniti, Pier Stefano E-mail: j.p.vanderschaar@uva.nl
2009-05-15
Modifications of the initial state of the inflaton field can induce a departure from Gaussianity and leave a testable imprint on the higher order correlations of the CMB and large scale structures in the Universe. We focus on the bispectrum statistics of the primordial curvature perturbation and its projection on the CMB. For a canonical single-field action the three-point correlator enhancement is localized, maximizing in the collinear limit, corresponding to enfolded or squashed triangles in comoving momentum space. We show that the available local and equilateral templates are very insensitive to this localized enhancement and do not generate noteworthy constraints on initial-state modifications. On the other hand, when considering the addition of a dimension 8 higher order derivative term, we find a dominant rapidly oscillating contribution, which had previously been overlooked and whose significantly enhanced amplitude is independent of the triangle under consideration. Nevertheless, the oscillatory nature of (the sign of) the correlation function implies the signal is nearly orthogonal to currently available observational templates, strongly reducing the sensitivity to the enhancement. Constraints on departures from the standard Bunch-Davies vacuum state can be derived, but also depend on the next-to-leading terms. We emphasize that the construction and application of especially adapted templates could lead to CMB bispectrum constraints on modified initial states already competing with those derived from the power spectrum.
A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE
Hillier, A.; Morton, R. J.; Erdélyi, R.
2013-12-20
The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to study the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s⁻¹. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.
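A velocity power spectrum of the kind derived above can be estimated with a plain periodogram. This is a generic sketch assuming an evenly sampled velocity series; the paper's exact estimator is not specified here:

```python
import numpy as np

def velocity_power_spectrum(v, dt):
    """One-sided power spectral density of an evenly sampled velocity
    series v(t) with sample spacing dt (a plain periodogram)."""
    n = len(v)
    V = np.fft.rfft(v - v.mean())          # remove the mean first
    freq = np.fft.rfftfreq(n, dt)          # frequencies in 1/[dt units]
    power = np.abs(V) ** 2 * 2.0 * dt / n  # one-sided PSD normalization
    return freq, power
```

For a 5 mHz oscillation sampled every 10 s, the spectrum peaks at the bin nearest 0.005 Hz; real thread data would additionally need windowing and averaging over many threads to beat down the periodogram variance.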
International petroleum statistics report, January 1992. [Contains Glossary]
1992-01-01
The International Petroleum Statistics Report presents data on international oil production, consumption, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil consumption and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1980, and monthly data for the most recent two years. Section 2 presents an oil supply/consumption balance for the market economies (i.e., non-communist countries). This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, consumption, and trade in OECD countries. World oil production and OECD consumption data are for the years 1970 through 1990; OECD stocks from 1973 through 1990; and OECD trade from 1982 through 1990.
A statistical approach to designing mitigation for induced ac voltages
Dabkowski, J. [Electro Sciences, Inc., Crystal Lake, IL (United States)
1996-08-01
Induced voltage levels on buried pipelines collocated with overhead electric power transmission lines are usually mitigated by means of grounding the pipeline. Maximum effectiveness is obtained when grounds are placed at discrete locations along the pipeline where the peak induced voltages occur. The degree of mitigation achieved is dependent upon the local soil resistivity at these locations. On occasion it may be necessary to employ an extensive distributed grounding system, for example, a parallel buried wire connected to the pipe at periodic intervals. In this situation the a priori calculation of mitigated voltage levels is sometimes made assuming an average value for the soil resistivity. Over long distances, however, the soil resistivity generally varies as a log-normally distributed random variable. The effect of this variability upon the predicted mitigated voltage levels is examined. It is found that the predicted levels exhibit a statistical variability which precludes a precise determination of the mitigated voltage levels. Thus, post-commissioning testing of the emplaced mitigation system is advisable.
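The effect of log-normally distributed soil resistivity on a predicted mitigated voltage can be illustrated with a toy Monte Carlo. The voltage-divider model and every numeric value below are assumptions for illustration only, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration values (NOT from the paper):
V_induced = 60.0   # unmitigated induced voltage, volts
Z_pipe = 3.0       # pipeline source impedance, ohms (assumed)
k_ground = 0.05    # ground-bed geometry factor: R_g = k * rho (assumed)

# Log-normal soil resistivity: median 100 ohm-m, sigma(ln rho) = 0.8
rho = rng.lognormal(mean=np.log(100.0), sigma=0.8, size=100_000)
R_g = k_ground * rho

# Simple voltage-divider model of the grounded pipeline:
V_mit = V_induced * R_g / (R_g + Z_pipe)

# The spread of mitigated voltage reflects the resistivity variability.
print(np.percentile(V_mit, [5, 50, 95]))
```

Even this crude model shows the paper's point: with a log-normal resistivity, the "mitigated" voltage is itself a broad distribution rather than a single predictable number.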
Statistical mechanics of self-driven Carnot cycles
Smith, E.
1999-10-01
The spontaneous generation and finite-amplitude saturation of sound, in a traveling-wave thermoacoustic engine, are derived as properties of a second-order phase transition. It has previously been argued that this dynamical phase transition, called "onset," has an equivalent equilibrium representation, but the saturation mechanism and scaling were not computed. In this work, the sound modes implementing the engine cycle are coarse-grained and statistically averaged, in a partition function derived from microscopic dynamics on criteria of scale invariance. Self-amplification performed by the engine cycle is introduced through higher-order modal interactions. Stationary points and fluctuations of the resulting phenomenological Lagrangian are analyzed and related to background dynamical currents. The scaling of the stable sound amplitude near the critical point is derived and shown to arise universally from the interaction of finite-temperature disorder, with the order induced by self-amplification. © 1999 The American Physical Society
Structure Learning and Statistical Estimation in Distribution Networks - Part I
Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael
2015-02-13
Traditionally, power distribution networks are either not observable or only partially observable. This complicates development and implementation of new smart grid technologies, such as those related to demand response, outage detection and management, and improved load-monitoring. In this two-part paper, inspired by the proliferation of metering technology, we discuss estimation problems in structurally loopy but operationally radial distribution grids from measurements, e.g. voltage data, which are either already available or can be made available with a relatively minor investment. In Part I, the objective is to learn the operational layout of the grid. Part II of this paper presents algorithms that estimate load statistics or line parameters in addition to learning the grid structure. Further, Part II discusses the problem of structure estimation for systems with incomplete measurement sets. Our newly suggested algorithms apply to a wide range of realistic scenarios. The algorithms are also computationally efficient (polynomial in time), which is proven theoretically and illustrated computationally on a number of test cases. The technique developed can be applied to detect line failures in real time as well as to understand the scope of possible adversarial attacks on the grid.
Glueballs and statistical mechanics of the gluon plasma
Brau, Fabian; Buisseret, Fabien
2009-06-01
We study a pure gluon plasma in the context of quasiparticle models, where the plasma is considered as an ideal gas of massive bosons. In order to reproduce SU(3) gauge field lattice data within such a framework, we review briefly the necessity to use a temperature-dependent gluon mass which accounts for color interactions between the gluons near T_c and agrees with perturbative QCD at large temperatures. Consequently, we discuss the thermodynamics of systems with temperature-dependent Hamiltonians and clarify the situation about the possible solutions proposed in the literature to treat those systems consistently. We then focus our attention on two possible formulations which are thermodynamically consistent, and we extract the gluon mass from the equation of state obtained in SU(3) lattice QCD. We find that the thermal gluon mass is similar in both statistical formalisms. Finally, an interpretation of the gluon plasma as an ideal gas made of glueballs and gluons is also presented. The glueball mass is consistently computed within a relativistic formalism using a potential obtained from lattice QCD. We find that the gluon plasma might be a glueball-rich medium for T ≲ 1.13 T_c and suggest that glueballs could be detected in future experiments dedicated to quark-gluon plasma.
Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes
Uhl, Jonathan T.; Pathak, Shivesh; Schorlemmer, Danijel; Liu, Xin; Swindeman, Ryan; Brinkman, Braden A. W.; LeBlanc, Michael; Tsekenis, Georgios; Friedman, Nir; Behringer, Robert; Denisov, Dmitry; Schall, Peter; Gu, Xiaojun; Wright, Wendelin J.; Hufnagel, Todd; Jennings, Andrew; Greer, Julia R.; Liaw, P. K.; Becker, Thorsten; Dresen, Georg; Dahmen, Karin A.
2015-11-17
Slowly-compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or “quakes”. We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied with the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tuneability of the cutoff with stress reflects “tuned critical” behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. In conclusion, the results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
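The scaling form described above, a power law multiplied by an exponential cutoff, implies that the cutoff alone controls the size of the largest events. A small numerical sketch (mean-field exponent tau = 1.5; the grid and normalization choices are illustrative):

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral (avoids version-dependent numpy helpers)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def mean_slip(S_star, tau=1.5, S_min=1.0):
    """Mean slip size for the scaling form P(S) ~ S**(-tau) * exp(-S/S_star),
    integrated numerically on a logarithmic grid."""
    S = np.logspace(np.log10(S_min), np.log10(S_star) + 3.0, 4000)
    p = S ** (-tau) * np.exp(-S / S_star)
    p = p / _trapz(p, S)        # normalize the distribution
    return _trapz(S * p, S)     # first moment
```

Because the power-law part is common to all systems, raising the cutoff S_star (as applied stress increases) raises the mean and maximum slip in the same way at every length scale, which is the basis of the cross-scale extrapolation claimed in the abstract.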
Financial statistics major US publicly owned electric utilities 1996
1998-03-01
The 1996 edition of Financial Statistics of Major US Publicly Owned Electric Utilities presents 5 years (1992 through 1996) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decision-making purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Five years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, operating revenue, and electric energy account data. 2 figs., 32 tabs.
Statistical Performance Evaluation Of Soft Seat Pressure Relief Valves
Harris, Stephen P.; Gross, Robert E.
2013-03-26
Risk-based inspection methods enable estimation of the probability of failure on demand for spring-operated pressure relief valves at the United States Department of Energy's Savannah River Site in Aiken, South Carolina. This paper presents a statistical performance evaluation of soft seat spring-operated pressure relief valves. These pressure relief valves are typically smaller and of lower cost than hard seat (metal to metal) pressure relief valves and can provide substantial cost savings in fluid service applications (air, gas, liquid, and steam) provided that the probability of failure on demand (the probability that the pressure relief valve fails to perform its intended safety function during a potentially dangerous overpressurization) is at least as good as that for hard seat valves. The research in this paper shows that the proportion of soft seat spring-operated pressure relief valves failing is the same or less than that of hard seat valves, and that for failed valves, soft seat valves typically have failure ratios of proof test pressure to set pressure less than that of hard seat valves.
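A comparison of failure proportions of the kind reported above can be sketched with a standard two-proportion z-test. The counts in the usage example are hypothetical, not the paper's data:

```python
from math import sqrt, erfc

def two_proportion_p(x1, n1, x2, n2):
    """One-sided two-proportion z-test p-value for H1: p1 < p2
    (e.g. soft-seat failure proportion less than hard-seat).

    x1/n1 and x2/n2 are failures over trials for the two groups."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled standard error
    z = (p1 - p2) / se
    # Normal CDF via erfc: Phi(z) = 0.5 * erfc(-z / sqrt(2))
    return 0.5 * erfc(-z / sqrt(2))
```

For example, 5 failures in 100 soft-seat tests versus 15 in 100 hard-seat tests (made-up numbers) gives a p-value below 0.05, supporting "same or less"; a proper risk-based analysis would instead use demand-based failure rates and their uncertainties.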
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation, followed by stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
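The sequential probability ratio test mentioned above can be sketched in its classic Wald form for a mean shift in Gaussian model residuals. The thresholds and parameters here are illustrative, not those of the patented system:

```python
from math import log

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test on Gaussian residuals.

    Tests H0: mean = mu0 against H1: mean = mu1, with error rates
    alpha (false alarm) and beta (missed detection).
    Returns ('H0' | 'H1' | 'continue', samples consumed)."""
    A = log((1 - beta) / alpha)   # decide-H1 boundary
    B = log(beta / (1 - alpha))   # decide-H0 boundary
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Log-likelihood ratio increment for a Gaussian mean shift.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= A:
            return "H1", n
        if llr <= B:
            return "H0", n
    return "continue", len(samples)
```

In a monitoring setting, H1 firing on the residuals between measured and model-predicted output signals a process change; the test keeps sampling while the evidence is inconclusive.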
Statistical properties of super-hot solar flares
Caspi, Amir; Krucker, Säm; Lin, R. P.
2014-01-20
We use Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) high-resolution imaging and spectroscopy observations from ≈6 to 100 keV to determine the statistical relationships between measured parameters (temperature, emission measure, etc.) of hot, thermal plasma in 37 intense (GOES M- and X-class) solar flares. The RHESSI data, most sensitive to the hottest flare plasmas, reveal a strong correlation between the maximum achieved temperature and the flare GOES class, such that 'super-hot' temperatures >30 MK are achieved almost exclusively by X-class events; the observed correlation differs significantly from that of GOES-derived temperatures, and from previous studies. A nearly ubiquitous association with high emission measures, electron densities, and instantaneous thermal energies suggests that super-hot plasmas are physically distinct from cooler, ≈10-20 MK GOES plasmas, and that they require substantially greater energy input during the flare. High thermal energy densities suggest that super-hot flares require strong coronal magnetic fields, exceeding ≈100 G, and that both the plasma β and volume filling factor f cannot be much less than unity in the super-hot region.
Data analysis and radionuclide scaling factor for the B-Cell waste stream
HILL, R.L.
2000-04-25
This report documents a statistical data analysis of radiological data obtained to characterize the 324 Facility B-Cell decontamination and decommissioning waste stream.
A statistical study of the macroepidemiology of air pollution and total mortality
Lipfert, F.W.; Malone, R.G.; Daum, M.L.; Mendell, N.R.; Yang, Chin-Chun
1988-04-01
A statistical analysis of spatial patterns of 1980 US urban total mortality (all causes) was performed, evaluating demographic, socioeconomic and air pollution factors as predictors. Specific mortality predictors included cigarette smoking, drinking water hardness, heating fuel use, and 1978-1982 annual concentrations of the following air pollutants: ozone, carbon monoxide, sulfate aerosol, particulate concentrations of lead, iron, cadmium, manganese, vanadium, as well as total and fine particle mass concentrations from the inhalable particulate network (dichotomous samplers). In addition, estimates of sulfur dioxide, oxides of nitrogen, and sulfate aerosol were made for each city using the ASTRAP long-range transport diffusion model, and entered into the analysis as independent variables. Because the number of cities with valid air quality and water hardness data varied considerably by pollutant, it was necessary to consider several different data sets, ranging from 48 to 952 cities. The relatively strong associations (ca. 5-10%) shown for 1980 pollution with 1980 total mortality are generally not confirmed by independent studies, for example, in Europe. In addition, the US studies did not find those pollutants with known adverse health effects at the concentrations in question (such as ozone or CO) to be associated with mortality. The question of causality vs. circumstantial association must therefore be regarded as still unresolved. 59 refs., 20 figs., 40 tabs.
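A cross-city regression of mortality on demographic and pollution predictors, as described above, reduces at its core to ordinary least squares. This sketch uses synthetic stand-in data; no real city values or effect sizes from the study are used:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for city-level predictors (standardized units):
n_cities = 200
smoking = rng.normal(0, 1, n_cities)
sulfate = rng.normal(0, 1, n_cities)
# Assumed "true" coefficients, chosen only so the fit can be checked:
mortality = 9.0 + 0.6 * smoking + 0.08 * sulfate + rng.normal(0, 0.3, n_cities)

# OLS with intercept: mortality ~ smoking + sulfate
X = np.column_stack([np.ones(n_cities), smoking, sulfate])
beta, *_ = np.linalg.lstsq(X, mortality, rcond=None)
print(beta)  # [intercept, smoking coefficient, sulfate coefficient]
```

The study's difficulty is visible even in this toy: with correlated predictors and varying city subsets, the pollution coefficient is sensitive to which confounders are included, which is one reason the causality question remains open.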
Baumgartner, S.; Bieli, R.; Bergmann, U. C.
2012-07-01
An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set up for the analyses of Anticipated Operational Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)
Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties
Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald; Lee, Andrew K.; Sahoo, Narayan; Tucker, Susan L.; Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E.; Dong, Lei
2013-08-01
Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distribution and dose-volume histogram using statistical parameters; and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate, and 1 brain cancer scanning-beam proton plan(s) were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of dose-volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to target coverage or overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with a mean difference of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose doses were most sensitive to uncertainties were the esophagus and spinal cord for lung, rectum for prostate, and brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expectation value and standard deviation are useful to evaluate the impact of uncertainties. The existing proton beam planning method used in this institution seems to be adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
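The simulation strategy described above (recomputing the dose many times under random and systematic setup errors, then taking the expectation and standard deviation) can be sketched with a toy 1-D coverage model. The geometry and error magnitudes below are assumptions for illustration, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(42)

def coverage(shift):
    """Toy 1-D 'target coverage' model: fraction of a 10 mm target
    ([-5, 5] mm) covered by a 12 mm dose plateau centered at `shift`.
    This is an illustrative stand-in for a full dose recalculation."""
    lo = max(-5.0, shift - 6.0)
    hi = min(5.0, shift + 6.0)
    return max(0.0, hi - lo) / 10.0

n_scenarios = 600  # matches the per-plan simulation count in the abstract
sys_shift = rng.normal(0.0, 2.0, n_scenarios)   # systematic error, mm (assumed SD)
rnd_shift = rng.normal(0.0, 1.5, n_scenarios)   # random error, mm (assumed SD)
cov = np.array([coverage(s + r) for s, r in zip(sys_shift, rnd_shift)])

# Expected coverage and its spread under the combined uncertainties:
print(cov.mean(), cov.std())
```

As in the paper, the expected coverage under uncertainty falls below the nominal (zero-shift) value of 1.0, and the standard deviation quantifies how robust the "plan" is to setup error.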
Brady, Samuel L.; Shulkin, Barry L.
2015-02-15
Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10-35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed down to a 90% reduction in volume computed tomography dose index from baseline (0.39 vs 3.64 mGy). To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUV_bw) of various diameter targets (range 8-37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% of the non-dose-reduced CTAC image down to 90% dose reduction. No change in SUV_bw, background percent uniformity, or spatial resolution was found for PET images reconstructed with CTAC protocols down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3 and 0.9/6.2 mGy). Noise magnitude in dose-reduced patient images increased but was not statistically different from pre-dose-reduced patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.
Analysis of Energy Efficiency Program Impacts Based on Program Spending
U.S. Energy Information Administration (EIA) Indexed Site
Analysis of Energy Efficiency Program Impacts Based on Program Spending May 2015 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are
Behavioral Economics Applied to Energy Demand Analysis: A Foundation
U.S. Energy Information Administration (EIA) Indexed Site
Behavioral Economics Applied to Energy Demand Analysis: A Foundation October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of
Financial statistics of major US publicly owned electric utilities 1992
Not Available
1994-01-01
The 1992 edition of the Financial Statistics of Major US Publicly Owned Electric Utilities publication presents 4 years (1989 through 1992) of summary financial data and current year detailed financial data on the major publicly owned electric utilities. The objective of the publication is to provide Federal and State governments, industry, and the general public with current and historical data that can be used for policymaking and decisionmaking purposes related to publicly owned electric utility issues. Generator and nongenerator summaries are presented in this publication. Four years of summary financial data are provided. Summaries of generators for fiscal years ending June 30 and December 31, nongenerators for fiscal years ending June 30 and December 31, and summaries of all respondents are provided. The composite tables present aggregates of income statement and balance sheet data, as well as financial indicators. Composite tables also display electric operation and maintenance expenses, electric utility plant, number of consumers, sales of electricity, operating revenue, and electric energy account data. The primary source of publicly owned financial data is the Form EIA-412, "Annual Report of Public Electric Utilities." Public electric utilities file this survey on a fiscal year, rather than a calendar year, basis, in conformance with their recordkeeping practices. In previous editions of this publication, data were aggregated by the two most commonly reported fiscal years, June 30 and December 31. This omitted approximately 20 percent of the respondents, who operate on fiscal years ending in other months. Accordingly, the EIA undertook a review of the Form EIA-412 submissions to determine if alternative classifications of publicly owned electric utilities would permit the inclusion of all respondents.
Rosa, B.; Parishani, H.; Ayala, O.; Wang, L.-P.
2015-01-15
In this paper, we study systematically the effects of forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [An examination of forcing in direct numerical simulations of turbulence, Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynolds number of 30 and less. We then study the effects of forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with the regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with the region of high energy dissipation rate.
DOE real property: A yearly statistical handbook, Fiscal year 1992
Not Available
1992-12-31
To assist in the tracking, reporting, and management of the real property, DOE has an automated inventory information system. This handbook was prepared as a resource for DOE officials and others who have a need to reference real property data as part of their day-to-day functions. It is intended as an internal working document, and the information therein is a compilation or analysis of data in the information system. It is divided into: land, buildings, and other structures and facilities.
Statistical behavior in deterministic quantum systems with few degrees of freedom
Jensen, R.V.; Shankar, R.
1985-04-29
Numerical studies of the dynamics of finite quantum spin chains are presented which show that quantum systems with few degrees of freedom (N = 7) can be described by equilibrium statistical mechanics. The success of the statistical description is seen to depend on the interplay between the initial state, the observable, and the Hamiltonian. This work clarifies the impact of integrability and conservation laws on statistical behavior. The relation to quantum chaos is also discussed.
Photon-number statistics of twin beams: Self-consistent measurement, reconstruction, and properties
Peřina, Jan Jr.; Haderka, Ondřej; Michálek, Václav
2014-12-04
A method for the determination of photon-number statistics of twin beams using the joint signal-idler photocount statistics obtained by an iCCD camera is described. It also provides the absolute quantum detection efficiency of the camera. Using the measured photocount statistics, quasi-distributions of integrated intensities are obtained. They attain negative values occurring in characteristic strips as a consequence of pairing of photons in twin beams.
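The absolute detection efficiency mentioned above follows from the pairwise correlation of the two beams. A minimal toy model (not the paper's iCCD reconstruction), assuming Poissonian pair numbers and binomial detection with invented efficiencies, shows how the signal-arm efficiency can be recovered from the joint photocount statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy twin-beam model: each photon pair is detected independently in
# the signal and idler arms with unknown efficiencies (assumed values).
eta_signal, eta_idler = 0.23, 0.20
pairs = rng.poisson(8.0, 200_000)        # photon-pair number per pulse
signal = rng.binomial(pairs, eta_signal)
idler = rng.binomial(pairs, eta_idler)

# For Poissonian pair statistics, cov(s, i) = eta_s * eta_i * var(n) and
# <i> = eta_i * <n>, so cov(s, i) / <i> estimates the signal efficiency.
est = np.cov(signal, idler)[0, 1] / idler.mean()
print(f"estimated signal-arm efficiency: {est:.3f}")
```

The same idea underlies absolute (calibration-free) efficiency measurements with photon pairs: the correlation supplies the reference that a calibrated source would otherwise provide.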
Statistical Mechanics of Prion Diseases
Office of Scientific and Technical Information (OSTI)
We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale
User Statistics Collection Practices | U.S. DOE Office of Science...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Facility News Contact Information Office of Science U.S. Department of Energy 1000 ... The Office of Science upholds a set of core principles regarding user statistics ...
High Statistics Study of Nearby Type 1a Supernovae. QUEST Camera...
Office of Scientific and Technical Information (OSTI)
Statistics Study of Nearby Type 1a Supernovae. QUEST Camera Short Term Maintenance: Final Technical Report Baltay, Charles 79 ASTRONOMY AND ASTROPHYSICS Study of Type 1a Supernovae...
Statistical Design of Experiment for Li-ion Cell Formation Parameters...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Design of Experiment for Li-ion Cell Formation Parameters using Gen3 Electrode Materials: Final Summary Statistical Design of Experiment for Li-ion Cell Formation Parameters ...
Investigation of statistical iterative reconstruction for dedicated breast CT
Makeev, Andrey; Glick, Stephen J.
2013-08-15
Purpose: Dedicated breast CT has great potential for improving the detection and diagnosis of breast cancer. Statistical iterative reconstruction (SIR) in dedicated breast CT is a promising alternative to traditional filtered backprojection (FBP). One of the difficulties in using SIR is the presence of free parameters in the algorithm that control the appearance of the resulting image. These parameters require tuning in order to achieve high quality reconstructions. In this study, the authors investigated the penalized maximum likelihood (PML) method with two commonly used types of roughness penalty functions: the hyperbolic potential and the anisotropic total variation (TV) norm. Reconstructed images were compared with images obtained using standard FBP. Optimal parameters for PML with the hyperbolic prior are reported for the task of detecting microcalcifications embedded in breast tissue. Methods: Computer simulations were used to acquire projections in a half-cone beam geometry. The modeled setup describes a realistic breast CT benchtop system, with an x-ray spectrum produced by a point source and an a-Si, CsI:Tl flat-panel detector. A voxelized anthropomorphic breast phantom with 280 μm microcalcification spheres embedded in it was used to model the attenuation properties of the uncompressed breast in a pendant position. The reconstruction of 3D images was performed using the separable paraboloidal surrogates algorithm with ordered subsets. Task performance was assessed with the ideal observer detectability index to determine optimal PML parameters. Results: The authors' findings suggest that there is a preferred range of values of the roughness penalty weight and the edge preservation threshold in the penalized objective function with the hyperbolic potential, which resulted in low noise images with high contrast microcalcifications preserved.
In terms of the numerical observer detectability index, the PML method with optimal parameters yielded substantially improved performance (by a factor greater than 10) compared to FBP. The hyperbolic prior was also observed to be superior to the TV norm. A few of the best-performing parameter pairs for the PML method also demonstrated superior performance across various radiation doses. In fact, PML with certain parameter values produced better images at a 2 mGy dose than FBP-reconstructed images acquired at a 6 mGy dose. Conclusions: A range of optimal free parameters for the PML algorithm with hyperbolic and TV norm-based potentials is presented for the microcalcification detection task in dedicated breast CT. The reported values can be used as starting values of the free parameters when SIR techniques are used for image reconstruction. Significant improvement in image quality can be achieved by using PML with an optimal combination of parameters, as compared to FBP. Importantly, these results suggest improved detection of microcalcifications can be obtained by using PML with a lower radiation dose to the patient than using FBP with a higher dose.
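The hyperbolic roughness penalty referred to in this entry has a standard closed form: quadratic for small neighboring-pixel differences and approximately linear (TV-like) for large ones, which is what makes it edge-preserving. A minimal sketch, with `penalized_objective` as a hypothetical 1D simplification (the paper's reconstruction operates on 3D neighborhoods):

```python
import numpy as np

def hyperbolic_potential(t, delta):
    """Edge-preserving hyperbolic penalty: ~t^2/2 for |t| << delta,
    approximately delta*|t| (TV-like) for |t| >> delta."""
    return delta**2 * (np.sqrt(1.0 + (t / delta)**2) - 1.0)

def penalized_objective(neg_log_likelihood, image, beta, delta):
    """Hypothetical 1D PML objective: data-fit term plus weighted
    roughness of neighboring-pixel differences."""
    diffs = np.diff(image)
    return neg_log_likelihood + beta * hyperbolic_potential(diffs, delta).sum()

# beta (roughness weight) and delta (edge threshold) are the two free
# parameters whose optimal ranges the study reports.
print(penalized_objective(0.0, np.array([0.0, 1.0, 5.0]), beta=0.1, delta=1.0))
```

Small `delta` preserves edges (microcalcification contrast) at the cost of noise; large `beta` smooths noise at the cost of contrast, which is the trade-off the detectability index is used to optimize.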
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.
2015-08-19
Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidate inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. As a result, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
Mortality in Appalachian coal mining regions: the value of statistical life lost
Hendryx, M.; Ahern, M.M.
2009-07-15
We examined elevated mortality rates in Appalachian coal mining areas for 1979-2005, and estimated the corresponding value of statistical life (VSL) lost relative to the economic benefits of the coal mining industry. We compared age-adjusted mortality rates and socioeconomic conditions across four county groups: Appalachia with high levels of coal mining, Appalachia with lower mining levels, Appalachia without coal mining, and other counties in the nation. We converted mortality estimates to VSL estimates and compared the results with the economic contribution of coal mining. We also conducted a discount analysis to estimate current benefits relative to future mortality costs. The heaviest coal mining areas of Appalachia had the poorest socioeconomic conditions. Before adjusting for covariates, the number of excess annual age-adjusted deaths in coal mining areas ranged from 3,975 to 10,923, depending on years studied and comparison group. Corresponding VSL estimates ranged from $18.563 billion to $84.544 billion, with a point estimate of $50.010 billion, greater than the $8.088 billion economic contribution of coal mining. After adjusting for covariates, the number of excess annual deaths in mining areas ranged from 1,736 to 2,889, and VSL costs continued to exceed the benefits of mining. Discounting VSL costs into the future resulted in excess costs relative to benefits in seven of eight conditions, with a point estimate of $41.846 billion.
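The headline comparison reduces to simple arithmetic on the abstract's point estimates; the per-death VSL below is merely the value implied by the reported totals, not a figure taken from the study:

```python
# Cost-benefit arithmetic from the abstract's unadjusted upper estimates.
excess_deaths = 10_923                 # excess annual deaths (upper estimate)
total_cost_billion = 84.544            # corresponding VSL total, $billions
mining_benefit_billion = 8.088         # reported economic contribution

# VSL per death implied by the reported totals (illustrative back-calculation)
vsl_per_death = total_cost_billion * 1e9 / excess_deaths
ratio = total_cost_billion / mining_benefit_billion

print(f"implied VSL per death: ${vsl_per_death:,.0f}")
print(f"mortality cost vs mining benefit: {ratio:.1f}x")
```

The implied per-death value (roughly $7.7 million) is in the range commonly used in VSL studies, and the cost exceeds the reported mining benefit by about an order of magnitude at this estimate.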
Statistically designed study of the variables and parameters of carbon dioxide equations of state
Donohue, M.D.; Naiman, D.Q.; Jin, Gang; Loehe, J.R.
1991-05-01
Carbon dioxide is used widely in enhanced oil recovery (EOR) processes to maximize the production of crude oil from aging and nearly depleted oil wells. Carbon dioxide also is encountered in many processes related to oil recovery. Accurate representations of the properties of carbon dioxide, and its mixtures with hydrocarbons, play a critical role in a number of enhanced oil recovery operations. One of the first tasks of this project was to select an equation of state to calculate the properties of carbon dioxide and its mixtures. The equation's simplicity, accuracy, and reliability in representing phase behavior and thermodynamic properties of mixtures containing carbon dioxide with hydrocarbons at conditions relevant to enhanced oil recovery were taken into account. We also have determined the thermodynamic properties that are important to enhanced oil recovery and the ranges of temperature, pressure, and composition that are important. We chose twelve equations of state for preliminary studies to be evaluated against these criteria. All of these equations were tested for pure carbon dioxide, and eleven were tested for pure alkanes and their mixtures with carbon dioxide. Two equations, the ALS equation and the ESD equation, were selected for detailed statistical analysis. 54 refs., 41 figs., 36 tabs.
Effects of the vacuum state on the statistics of nonclassical states
Alioui, N.; Amroun-Frahi, A.; Bendjaballah, C.
2007-10-15
Based on the calculation of the Wigner function, some statistical properties of the superposition of two coherent states with a vacuum state are demonstrated. The distance variation difference function is calculated for these states. Application of homodyne statistics shows that the addition (subtraction) of the vacuum state can improve the classical channel capacity of a noiseless binary symmetric system.
Confirmation of standard error analysis techniques applied to...
Office of Scientific and Technical Information (OSTI)
us more about our data, and even about the power and limitations of the EXAFS technique. ... Stern's rule is balanced by the degrees of freedom obtained from a χ² statistical analysis. ...
Violations of the ceiling principle: Exact conditions and statistical evidence
Slimowitz, J.R.; Cohen, J.E.
1993-08-01
The National Research Council recommended the use of the ceiling principle in forensic applications of DNA testing on the grounds that the ceiling principle was believed to be "conservative," giving estimates greater than or equal to the actual genotype frequencies in the appropriate reference population. The authors show here that the ceiling principle can fail to be conservative in a population with two subpopulations and two loci, each with two alleles at Hardy-Weinberg equilibrium, if there is some linkage disequilibrium between loci. They also show that the ceiling principle can fail in a population with two subpopulations and a single locus with two alleles if Hardy-Weinberg equilibrium does not hold. They give explicit analytical formulas to describe when the ceiling principle fails. By showing that the ceiling principle is not always mathematically reliable, this analysis gives users of the ceiling principle the responsibility of demonstrating that it is conservative for the particular data with which it is used. Reanalysis of VNTR databases of the FBI provides compelling evidence of two-locus associations within three major ethnic groups (Caucasian, black, and Hispanic) in the United States, even though the loci tested are located on different chromosomes. Before the ceiling principle is implemented, more research should be done to determine whether it may be violated in practice. 19 refs., 5 tabs.
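A minimal numerical illustration (invented frequencies, not the authors' formulas) of the first failure mode: with linkage disequilibrium between two loci, the independence assumption behind per-locus multiplication underestimates a two-locus genotype frequency.

```python
# Two biallelic loci, each at Hardy-Weinberg equilibrium within the
# population, but with linkage disequilibrium (LD) between them.
p_a, p_b = 0.5, 0.5          # allele frequencies at loci A and B (assumed)
d = 0.25                     # LD coefficient (maximal for these frequencies)

hap_ab = p_a * p_b + d       # haplotype frequency with LD
actual_aabb = hap_ab**2      # true frequency of the AABB genotype

# Product-rule estimate that multiplies per-locus frequencies as if the
# loci were independent (the step the ceiling principle relies on).
estimate_aabb = (p_a**2) * (p_b**2)

print(actual_aabb, estimate_aabb)   # the "conservative" estimate is too low
```

Here the true genotype frequency (0.25) is four times the independence-based estimate (0.0625), so an estimate built this way is anti-conservative, which is the paper's point; the subpopulation maxima and frequency floors of the full ceiling procedure do not automatically repair this.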
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.
Tiling Microarray Analysis Tools
2005-05-04
TiMAT is a package of 23 command-line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); 2) post-processing of oligo intensity values (quantile normalization, median scaling, PM-MM transformation); 3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); 4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).
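Quantile normalization, listed in step 2, can be sketched in a few lines: every array is forced to share the same empirical distribution (the mean of the sorted arrays) while keeping its own internal ranking. This is a generic sketch of the technique, not TiMAT's Java implementation:

```python
import numpy as np

def quantile_normalize(arrays):
    """Quantile normalization: map each array's values onto the mean of
    the per-array sorted values, preserving each array's ranking."""
    data = np.vstack(arrays)
    order = np.argsort(data, axis=1)
    ranks = np.argsort(order, axis=1)          # rank of each element
    mean_sorted = np.sort(data, axis=1).mean(axis=0)
    return mean_sorted[ranks]

# Two toy intensity arrays; after normalization they share a distribution.
a = np.array([5.0, 2.0, 3.0, 4.0])
b = np.array([4.0, 1.0, 4.0, 2.0])
qn = quantile_normalize([a, b])
print(qn)
```

Note the usual caveat: ties (the two 4.0 values in `b`) are broken by sort order, so tied inputs can map to slightly different normalized values.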
NREL: Energy Analysis - Brian W Bush
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Brian W Bush Photo of Brian Bush Brian W Bush is a member of the Energy Forecasting and Modeling Group in the Strategic Energy Analysis Center. Principal Strategic Analyst On staff from January 2008 Phone number: (303) 384-7472 E-mail: brian.bush@nrel.gov Areas of expertise Energy and infrastructure modeling, simulation, and analysis High performance computing Software architecture, design, implementation, and testing Discrete-event and continuous simulation Statistical analysis Geographic
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng
2015-02-20
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions, and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP).
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach to the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes, and the influence of low-frequency variability on climate extremes, might vary under a changing climate. The research specifically addresses DOE focus area 2, simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of current regional climate models' ability to ascertain these influences, and (4) a detailed examination of how the distribution of extreme events is likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. Any and all of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.
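One ingredient of the proposed approach — an information theory metric evaluated on a single wavelet scale — can be sketched as follows. This is a schematic stand-in for the project's methodology, using a one-level Haar transform and Shannon entropy on a synthetic series (the actual work would use full multi-resolution decompositions of observed and simulated climate records):

```python
import numpy as np

def haar_details(x):
    """One level of a Haar wavelet transform: scaled pairwise differences
    capture fluctuations at the finest timescale."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def shannon_entropy(values, bins=16):
    """Shannon entropy (bits) of the empirical distribution of values."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(2)
series = rng.normal(size=4096)       # stand-in for a climate time series
details = haar_details(series)
print(f"finest-scale entropy: {shannon_entropy(details):.2f} bits")
```

Repeating the decomposition level by level yields an entropy per timescale, which is the kind of quantity that could then be compared across forcing scenarios or between model output and station observations.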
Analysis of Human Genetic Linkage
Boehnke, M.
1991-01-01
Linkage analysis continues in its golden age. The convergence of several factors - advances in molecular biology, advances in statistical models and algorithms, and advances in computing technology - have made possible remarkable successes in the mapping of human genetic diseases and in the construction of human genetic maps. The goals of mapping all the most important simple Mendelian disorders and constructing fine-structure genetic maps for each of the human chromosomes soon will be reached, and linkage methods promise to help us understand the etiologies of many common and complex familial diseases. With the continuing rapid advance of the field, the appearance of the revised edition of Dr. Ott's book is particularly welcome. As with the first edition, the goal of the revised edition is to provide a concise, easy-to-read introduction to human linkage analysis. The revised edition includes chapters on basic genetics and cytogenetics, genes and genetic polymorphisms, aspects of statistical inference, methods of linkage analysis, the informativeness of family data, multipoint linkage analysis, penetrance, numerical and computerized methods, the variability of the recombination fraction, inconsistencies, and linkage analysis with disease loci. The result is not an encyclopedia providing everything one could ever want to know about linkage analysis but, rather, a guide to the important methods, topics, and problems of linkage analysis today. Overall, the book achieves an excellent compromise between presenting important conclusions and working out the details.
Plasma analogy and non-Abelian statistics for Ising-type quantum Hall states
Office of Scientific and Technical Information (OSTI)
We study the non-Abelian statistics of quasiparticles in the Ising-type quantum Hall states, which are likely candidates to explain the observed Hall conductivity plateaus in the second Landau level, most notably the one at filling fraction ν = 5/2. We
Statistics of resonance fluorescence of a pair of atoms in a feedback loop
Tomilin, V. A.; Il'ichev, L. V.
2013-02-15
The statistics of photoemission events of a pair of closely spaced two-level atoms is calculated in a classical light field whose phase is changed by π after the detection of each spontaneous photon. These statistics are compared with the case in which the feedback is absent. In both cases, one can observe noticeable antibunching of photons in the range of parameters where no antibunching is observed in a single-atom system. The feedback substantially increases the antibunching. This effect manifests itself more strongly in relatively weak fields and for considerable frequency detunings.
RHIC POWER SUPPLIES-FAILURE STATISTICS FOR RUNS 4, 5, AND 6
BRUNO,D.; GANETIS, G.; SANDBERG, J.; LOUIE, W.; HEPPNER, G.; SCHULTHEISS, C.
2007-06-25
The two rings in the Relativistic Heavy Ion Collider (RHIC) require a total of 933 power supplies to deliver current to highly inductive superconducting magnets. Failure statistics for the RHIC power supplies, covering failures associated with the CEPS group's responsibilities, will be presented for the last three RHIC runs. The failures of the power supplies will be analyzed, and the statistics associated with the power supply failures will be presented. Comparisons of the failure statistics for the last three RHIC runs will be shown. Improvements that have increased power supply availability will be discussed.
Webb-Robertson, Bobbie-Jo M.; Bunn, Amoret L.; Bailey, Vanessa L.
2011-01-01
Phospholipid fatty acids (PLFA) have been widely used to characterize environmental microbial communities, generating community profiles that can distinguish phylogenetic or functional groups within the community. The poor specificity of organism groups with fatty acid biomarkers in the classic PLFA-microorganism associations is a confounding factor in many of the statistical classification/clustering approaches traditionally used to interpret PLFA profiles. In this paper we demonstrate that non-linear statistical learning methods, such as a support vector machine (SVM), can more accurately find patterns related to uranyl nitrate exposure in a freshwater periphyton community than linear methods, such as partial least squares discriminant analysis. In addition, probabilistic models of exposure can be derived from the identified lipid biomarkers to demonstrate the potential model-based approach that could be used in remediation. The SVM probability model separates dose groups at accuracies of ~87.0%, ~71.4%, ~87.5%, and 100% for the four groups: control (non-amended system), low dose (amended at 10 μg U L-1), medium dose (amended at 100 μg U L-1), and high dose (500 μg U L-1). The SVM model achieved an overall cross-validated classification accuracy of ~87%, in contrast to ~59% for the best linear classifier.
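The advantage of a non-linear classifier over a linear one, as reported above, is easy to reproduce on synthetic data. The sketch below is not the paper's SVM/PLS-DA pipeline: it uses invented XOR-style data and a kernel ridge classifier as an SVM-like non-linear stand-in, with a least-squares fit as the linear baseline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for the PLFA problem: a class structure no linear boundary
# separates (XOR-style), which a kernel method handles easily.
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sign(X[:, 0] * X[:, 1])          # labels in {-1, +1}
Xtr, ytr, Xte, yte = X[:300], y[:300], X[300:], y[300:]

# Linear baseline: least-squares fit of a separating hyperplane.
w = np.linalg.lstsq(np.c_[Xtr, np.ones(300)], ytr, rcond=None)[0]
lin_acc = np.mean(np.sign(np.c_[Xte, np.ones(100)] @ w) == yte)

# Non-linear classifier: RBF kernel ridge (an SVM-like decision function).
def rbf(A, B, gamma=5.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

alpha = np.linalg.solve(rbf(Xtr, Xtr) + 1e-3 * np.eye(300), ytr)
ker_acc = np.mean(np.sign(rbf(Xte, Xtr) @ alpha) == yte)

print(f"linear accuracy: {lin_acc:.2f}, kernel accuracy: {ker_acc:.2f}")
```

On data like this, the linear fit hovers near chance while the kernel classifier approaches perfect accuracy, mirroring the ~59% vs ~87% gap the abstract reports for its real-world profiles.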