National Library of Energy BETA

Sample records for multivariate statistical analysis

  1. Multivariate Statistical Analysis Strategies for Hyperspectral...

    Office of Scientific and Technical Information (OSTI)

    Related Information: Proposed for ... Union of Microbeam Analysis Societies, held May 26, 2011 in Seoul, South Korea

  2. Classification of Malaysia aromatic rice using multivariate statistical analysis

    SciTech Connect (OSTI)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-15

    Aromatic rice (Oryza sativa L.) is considered the best-quality premium rice. Consumers prefer these varieties for criteria such as shape, colour, distinctive aroma, and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it requires, for instance specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: training is lengthy, panellists are prone to fatigue as the number of samples increases, and results can be inconsistent. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house-developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify aromatic rice based on the samples' odour, with samples taken from each variety. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify the unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The low misclassification errors of the LDA and KNN results support these findings, and we conclude that the e-nose was successfully applied to the classification of the aromatic rice varieties.
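
    The processing chain described above maps directly onto standard tools. Below is a minimal sketch, in Python with scikit-learn, of PCA projection followed by LDA + KNN classification scored with leave-one-out validation; the synthetic sensor matrix, class counts, and all parameter values are illustrative assumptions, not data from the paper.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_varieties, n_per, n_sensors = 4, 15, 8        # hypothetical e-nose setup
    X = np.vstack([rng.normal(loc=i, scale=0.5, size=(n_per, n_sensors))
                   for i in range(n_varieties)])     # stand-in sensor responses
    y = np.repeat(np.arange(n_varieties), n_per)

    # Visual separation: project onto the first two principal components.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

    # Classification: supervised LDA projection, then KNN, scored with
    # leave-one-out cross-validation as in the abstract.
    clf = make_pipeline(StandardScaler(),
                        LinearDiscriminantAnalysis(n_components=2),
                        KNeighborsClassifier(n_neighbors=3))
    print("LOO accuracy:", cross_val_score(clf, X, y, cv=LeaveOneOut()).mean())
    ```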

  3. Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico

    Broader source: Energy.gov [DOE]

    Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico

  4. Characterization of Nuclear Fuel using Multivariate Statistical Analysis

    SciTech Connect (OSTI)

    Robel, M.; Kristo, M. J.

    2007-11-27

    Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
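
    As a worked illustration of the dimensionality reduction the abstract describes, the sketch below reduces 9-dimensional isotopic-concentration vectors to a 3-component PCA approximation via singular value decomposition. The data matrix is random stand-in data; real inputs would be U and Pu isotope concentrations tabulated as a function of burnup.

    ```python
    import numpy as np

    # 200 simulated fuel samples x 9 isotope concentrations (stand-in values)
    X = np.random.default_rng(1).random((200, 9))
    Xc = X - X.mean(axis=0)                          # mean-center each isotope
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :3] * s[:3]                        # 3-D coordinates for plotting
    print("variance captured by 3 PCs:", (s[:3]**2).sum() / (s**2).sum())
    ```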

  5. Multivariate statistical analysis of stream sediments for mineral resources from the Craig NTMS Quadrangle, Colorado

    SciTech Connect (OSTI)

    Beyth, M.; McInteer, C.; Broxton, D.E.; Bolivar, S.L.; Luke, M.E.

    1980-06-01

    Multivariate statistical analyses were carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Craig quadrangle, Colorado, to support the National Uranium Resource Evaluation and to evaluate strategic or other important commercial mineral resources. A few areas favorable for uranium mineralization are suggested in parts of the Wyoming Basin, Park Range, and Gore Range. Six potential source rocks for uranium are postulated based on factor score mapping. Vanadium in stream sediments is suggested as a pathfinder for carnotite-type mineralization. A probable northwest trend of lead-zinc-copper mineralization associated with Tertiary intrusions is suggested. A few locations are mapped where copper is associated with cobalt. Concentrations of placer sands containing rare earth elements, probably of commercial value, are indicated for parts of the Sand Wash Basin.

  6. An Application of Multivariate Statistical Analysis for Query-Driven Visualization

    SciTech Connect (OSTI)

    Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.

    2010-03-01

    Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.

  7. Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico

    SciTech Connect (OSTI)

    2012-12-31

    This report evaluates the chemistry of seep water occurring in three desert drainages near Shiprock, New Mexico: Many Devils Wash, Salt Creek Wash, and Eagle Nest Arroyo. Through the use of geochemical plotting tools and multivariate statistical analysis techniques, analytical results of samples collected from the three drainages are compared with the groundwater chemistry at a former uranium mill in the Shiprock area (the Shiprock site), managed by the U.S. Department of Energy Office of Legacy Management. The objective of this study was to determine, based on the water chemistry of the samples, if statistically significant patterns or groupings are apparent between the sample populations and, if so, whether there are any reasonable explanations for those groupings.

  8. Method of multivariate spectral analysis

    DOE Patents [OSTI]

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
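
    A minimal sketch of the factorization step this patent describes, under simplifying assumptions: the weighting is a trivial per-channel diagonal weighting, the component count is assumed, and the constrained alternating least squares is implemented as projected (nonnegativity-clipped) ALS. This illustrates the D = CS^T decomposition on synthetic data, not the patented procedure itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.random((500, 128))             # 500 pixels x 128 spectral channels
    w = np.ones(A.shape[1])                # stand-in per-channel weights
    D = A * w                              # weighted data matrix

    k = 3                                  # assumed number of chemical components
    C = rng.random((D.shape[0], k))        # concentration intensity matrix
    for _ in range(100):                   # alternating least squares, clipped >= 0
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

    S_unweighted = S / w[:, None]          # undo the per-channel weighting
    print("residual norm:", np.linalg.norm(D - C @ S.T))
    ```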

  9. Application of multivariate statistical analysis methods for improved time-of-flight secondary ion mass spectrometry depth profiling of buried interfaces and particulate

    SciTech Connect (OSTI)

    Lloyd, K. G.

    2007-07-15

    Buried irregular interfaces and particulate present special challenges in terms of chemical analysis and identification, and are critical issues in the manufacture of electronic materials and devices. Cross sectioning at the right location is often difficult, and, while dual-beam scanning electron microscopy/focused ion beam instruments can often provide excellent visualization of buried defects, matching chemical analysis may be absent or problematic. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) depth profiling, with its ability to acquire spatially resolved depth profiles while collecting an entire mass spectrum at every 'voxel,' offers a way to revisit the problem of buried defects. Multivariate analysis of the overwhelming amount of data can reduce the output from essentially a depth profile at every mass to a small set of chemically meaningful factors. Data scaling is an important consideration in the application of these methods, and a comparison of scaling procedures is shown. Examples of ToF-SIMS depth profiles of relatively homogeneous layers, severely inhomogeneous layers, and buried particulate are discussed.

  10. Advanced Multivariate Analysis Tools Applied to Surface Analysis...

    Office of Scientific and Technical Information (OSTI)

    Resource Relation: Conference: Proposed for presentation at the Microscopy and ... Subjects: SUPERCONDUCTIVITY AND SUPERFLUIDITY; MICROANALYSIS; MICROSCOPY; MULTIVARIATE ANALYSIS

  11. Data base for the analysis of compositional characteristics of coal seams and macerals. Final report - Part 10. Variability in the inorganic content of United States' coals: a multivariate statistical study

    SciTech Connect (OSTI)

    Glick, D.C.; Davis, A.

    1984-07-01

    The multivariate statistical techniques of correlation coefficients, factor analysis, and cluster analysis, implemented by computer programs, can be used to process a large data set and produce a summary of relationships between variables and between samples. These techniques were used to find relationships for data on the inorganic constituents of US coals. Three hundred thirty-five whole-seam channel samples from six US coal provinces were analyzed for inorganic variables. After consideration of the attributes of data expressed on ash basis and whole-coal basis, it was decided to perform complete statistical analyses on both data sets. Thirty variables expressed on whole-coal basis and twenty-six variables expressed on ash basis were used. For each inorganic variable, a frequency distribution histogram and a set of summary statistics were produced. These were subdivided to reveal the manner in which concentrations of inorganic constituents vary between coal provinces and between coal regions. Data collected on 124 samples from three stratigraphic groups (Pottsville, Monongahela, Allegheny) in the Appalachian region were studied using analysis of variance to determine the degree of variability between stratigraphic levels. Most variables showed differences in mean values between the three groups. 193 references, 71 figures, 54 tables.

  12. Classical least squares multivariate spectral analysis

    DOE Patents [OSTI]

    Haaland, David M.

    2002-01-01

    An improved classical least squares multivariate spectral analysis method, termed prediction-augmented classical least squares (PACLS), that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions to the CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of PACLS includes the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.

  13. Hybrid least squares multivariate spectral analysis methods

    DOE Patents [OSTI]

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  14. Hybrid least squares multivariate spectral analysis methods

    DOE Patents [OSTI]

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  15. Independent Statistics & Analysis

    U.S. Energy Information Administration (EIA) Indexed Site

    October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report April - June 2014 This report was...

  16. Augmented classical least squares multivariate spectral analysis

    DOE Patents [OSTI]

    Haaland, David M.; Melgaard, David K.

    2004-02-03

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
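
    The augmentation idea is easy to demonstrate numerically. The sketch below is a hedged illustration rather than the patented method: it calibrates pure-component spectra by classical least squares, appends one extra spectral shape standing in for an unmodeled source of variation (e.g., drift), and shows that prediction with the augmented matrix still recovers the original component concentrations. All shapes and dimensions are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_cal, n_comp, n_chan = 40, 2, 100
    C_cal = rng.random((n_cal, n_comp))          # known calibration concentrations
    K_true = rng.random((n_comp, n_chan))        # pure-component spectra (made up)
    drift = rng.random(n_chan)                   # unmodeled spectral shape
    A_cal = C_cal @ K_true                       # calibration spectra, drift-free

    K_hat = np.linalg.pinv(C_cal) @ A_cal        # classical least squares calibration
    K_aug = np.vstack([K_hat, drift])            # augment with the extra shape

    C_new = rng.random((10, n_comp))             # unknown samples containing drift
    A_new = C_new @ K_true + 0.3 * drift
    C_pred = A_new @ np.linalg.pinv(K_aug)       # extra row absorbs the drift
    print("max concentration error:", np.abs(C_pred[:, :n_comp] - C_new).max())
    ```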

  17. Augmented Classical Least Squares Multivariate Spectral Analysis

    DOE Patents [OSTI]

    Haaland, David M.; Melgaard, David K.

    2005-07-26

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.

  18. Augmented Classical Least Squares Multivariate Spectral Analysis

    DOE Patents [OSTI]

    Haaland, David M.; Melgaard, David K.

    2005-01-11

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.

  19. Apparatus and system for multivariate spectral analysis

    DOE Patents [OSTI]

    Keenan, Michael R.; Kotula, Paul G.

    2003-06-24

    An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.

  20. Independent Statistics & Analysis

    U.S. Energy Information Administration (EIA) Indexed Site

    Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Report (Abbreviated) January - March 2016 June 2016 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore should

  1. Independent Statistics & Analysis

    U.S. Energy Information Administration (EIA) Indexed Site

    August 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 Quarterly Coal Distribution Report October - December 2015 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government. The views in this report therefore

  2. Statistical Analysis.indd

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    LM Stakeholder Interaction and External Communications June 2014 OVERVIEW The U.S. Department of Energy (DOE) Office of Legacy Management (LM) makes every effort to communicate with its stakeholders through public and small group meetings, conferences, briefings, news releases, telephone, e-mail, informational materials, and through the LM website. To assess the effectiveness of LM's communication with stakeholders across the nation, an analysis of stakeholder interaction was performed.

  3. Feasibility Study on the Use of On-line Multivariate Statistical Process Control for Safeguards Applications in Natural Uranium Conversion Plants

    SciTech Connect (OSTI)

    Ladd-Lively, Jennifer L

    2014-01-01

    The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
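
    A minimal sketch of the PCA-based monitoring framework the abstract outlines: fit a base-case model to data representing normal operation via singular value decomposition, then alarm when a new observation's squared prediction error exceeds an empirical control limit. The dimensions, retained component count, control limit, and simulated fault are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(1000, 12))              # normal-operation plant measurements
    mu, sd = X.mean(axis=0), X.std(axis=0)

    def spe(x, P):
        """Squared prediction error of one observation against the PCA model."""
        xs = (x - mu) / sd
        resid = xs - P @ (P.T @ xs)
        return float(resid @ resid)

    _, _, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
    P = Vt[:3].T                                 # retain 3 principal components (assumed)

    limit = np.quantile([spe(row, P) for row in X], 0.99)   # empirical control limit
    fault = X[0].copy()
    fault[2] += 8 * sd[2]                        # simulated diversion-like shift
    print("alarm:", spe(fault, P) > limit)
    ```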

  4. Nonparametric Multivariate Anomaly Analysis in Support of HPC Resilience

    SciTech Connect (OSTI)

    Ostrouchov, George; Naughton, III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R; Scott, Stephen L

    2009-01-01

    Large-scale computing systems provide great potential for scientific exploration. However, the complexity that accompanies these enormous machines raises challenges for both users and operators. The effective use of such systems is often hampered by failures encountered when running applications on systems containing tens of thousands of nodes and hundreds of thousands of compute cores capable of yielding petaflops of performance. In systems of this size, failure detection is complicated and root-cause diagnosis difficult. This paper describes our recent work in the identification of anomalies in monitoring data and system logs to provide further insights into machine status, runtime behavior, failure modes, and failure root causes. It discusses the details of an initial prototype that gathers the data and uses statistical techniques for analysis.

  5. Systematic wavelength selection for improved multivariate spectral analysis

    DOE Patents [OSTI]

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the fitness of the model's determination of the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.

  6. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    SciTech Connect (OSTI)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.

  7. Multivariate calibration techniques applied to NIRA (near infrared reflectance analysis) and FTIR (Fourier transform infrared) data

    SciTech Connect (OSTI)

    Long, C.L.

    1991-02-01

    Multivariate calibration techniques can reduce the time required for routine testing and can provide new methods of analysis. Multivariate calibration is commonly used with near infrared reflectance analysis (NIRA) and Fourier transform infrared (FTIR) spectroscopy. Two feasibility studies were performed to determine the capability of NIRA, using multivariate calibration techniques, to perform analyses on the types of samples that are routinely analyzed at this laboratory. The first study performed included a variety of samples and indicated that NIRA would be well-suited to perform analyses on selected materials properties such as water content and hydroxyl number on polyol samples, epoxy content on epoxy resins, water content of desiccants, and the amine values of various amine cure agents. A second study was performed to assess the capability of NIRA to perform quantitative analysis of hydroxyl numbers and water contents of hydroxyl-containing materials. Hydroxyl number and water content were selected for determination because these tests are frequently run on polyol materials and the hydroxyl number determination is time consuming. This study pointed out the necessity of obtaining calibration standards identical to the samples being analyzed for each type of polyol or other material being analyzed. Multivariate calibration techniques are frequently used with FTIR data to determine the composition of a large variety of complex mixtures. A literature search indicated many applications of multivariate calibration to FTIR data. Areas identified where quantitation by FTIR would provide a new capability are quantitation of components in epoxy and silicone resins, polychlorinated biphenyls (PCBs) in oils, and additives to polymers. 19 refs., 15 figs., 6 tabs.

  8. Fluorescence measurements for evaluating the application of multivariate analysis techniques to optically thick environments.

    SciTech Connect (OSTI)

    Reichardt, Thomas A.; Timlin, Jerilyn Ann; Jones, Howland D. T.; Sickafoose, Shane M.; Schmitt, Randal L.

    2010-09-01

    Laser-induced fluorescence measurements of cuvette-contained laser dye mixtures are made for evaluation of multivariate analysis techniques to optically thick environments. Nine mixtures of Coumarin 500 and Rhodamine 610 are analyzed, as well as the pure dyes. For each sample, the cuvette is positioned on a two-axis translation stage to allow the interrogation at different spatial locations, allowing the examination of both primary (absorption of the laser light) and secondary (absorption of the fluorescence) inner filter effects. In addition to these expected inner filter effects, we find evidence that a portion of the absorbed fluorescence is re-emitted. A total of 688 spectra are acquired for the evaluation of multivariate analysis approaches to account for nonlinear effects.

  9. Independent Statistics & Analysis Drilling Productivity Report

    U.S. Energy Information Administration (EIA) Indexed Site

    Independent Statistics & Analysis Drilling Productivity Report The seven regions analyzed in this report accounted for 92% of domestic oil production growth and all domestic natural gas production growth during 2011-14. August 2016 For key tight oil and shale gas regions U.S. Energy Information Administration Contents Year-over-year summary 2 Bakken Region 3 Eagle Ford Region 4 Haynesville Region 5 Marcellus Region 6 Niobrara Region 7 Permian Region 8 Utica Region 9 Explanatory notes 10

  10. Spectral compression algorithms for the analysis of very large multivariate images

    DOE Patents [OSTI]

    Keenan, Michael R.

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
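
    The factored-representation idea can be illustrated in a few lines: keep truncated PCA scores and loadings in place of the full spectrum at every pixel. The image cube and rank below are synthetic assumptions, and real implementations add block processing for efficiency.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    nx, ny, n_chan, rank = 64, 64, 256, 5
    cube = rng.random((nx * ny, n_chan))         # unfolded image: pixels x channels

    U, s, Vt = np.linalg.svd(cube - cube.mean(axis=0), full_matrices=False)
    scores, loadings = U[:, :rank] * s[:rank], Vt[:rank]   # factored representation

    print("compression ratio:", cube.size / (scores.size + loadings.size))
    ```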

  11. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    SciTech Connect (OSTI)

    Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e

  12. Spatial compression algorithm for the analysis of very large multivariate images

    DOE Patents [OSTI]

    Keenan, Michael R.

    2008-07-15

    A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
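
    A minimal sketch of the wavelet step, using the PyWavelets package on one smooth synthetic image channel: decompose, keep only the largest coefficients, and reconstruct. The wavelet, decomposition level, and keep fraction are illustrative assumptions.

    ```python
    import numpy as np
    import pywt

    x = np.linspace(0, 4 * np.pi, 128)
    channel = np.sin(x)[:, None] * np.cos(x)[None, :]     # smooth synthetic channel

    coeffs = pywt.wavedec2(channel, 'db2', level=3)       # 2-D wavelet decomposition
    arr, slices = pywt.coeffs_to_array(coeffs)            # flatten all coefficients
    thresh = np.quantile(np.abs(arr), 0.95)               # keep ~5% largest coefficients
    arr[np.abs(arr) < thresh] = 0.0
    recon = pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format='wavedec2'), 'db2')
    print("relative error:",
          np.linalg.norm(recon - channel) / np.linalg.norm(channel))
    ```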

  13. Statistical Hot Channel Analysis for the NBSR

    SciTech Connect (OSTI)

    Cuadra, A.; Baek, J.

    2014-05-27

    A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
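
    As a hedged illustration of how such CDFs are used, the sketch below propagates made-up hot-channel factor uncertainties by Monte Carlo sampling and reads off the CHFR value exceeded with 95% probability. The distributions and nominal CHFR are illustrative assumptions, not values from the NBSR analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    # Three uncertain hot-channel factors, each lognormal (illustrative only).
    factors = rng.lognormal(mean=0.0, sigma=0.05, size=(100_000, 3))
    chfr_nominal = 2.0                            # made-up nominal CHFR
    chfr = chfr_nominal / factors.prod(axis=1)    # penalize by the combined factor
    print("CHFR exceeded with 95% probability:", np.quantile(chfr, 0.05))
    ```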

  14. The Multi-Isotope Process Monitor: Multivariate Analysis of Gamma Spectra

    SciTech Connect (OSTI)

    Orton, Christopher R.; Rutherford, Crystal E.; Fraga, Carlos G.; Schwantes, Jon M.

    2011-10-30

    The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of nuclear material are not diverted from these facilities. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). The time delay between sampling and subsequent DA provides a potential opportunity to divert the material out of the appropriate chemical stream. Leveraging new on-line nondestructive assay (NDA) techniques in conjunction with the traditional and highly precise DA methods may provide a more timely, cost-effective and resource efficient means for MC&A verification at such facilities. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including the Multi-Isotope Process (MIP) Monitor. The MIP Monitor uses gamma spectroscopy and pattern recognition software to identify off-normal conditions in process streams. Recent efforts have been made to explore the basic limits of using multivariate analysis techniques on gamma-ray spectra. This paper will provide an overview of the methods and report our on-going efforts to develop and demonstrate the technology.

  15. Statistical analysis of random duration times

    SciTech Connect (OSTI)

    Engelhardt, M.E.

    1996-04-01

    This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, reliability function, and cumulative hazard function. These results can be applied with either complete or censored data. Several models which are commonly used with time data are discussed, along with methods for model checking and goodness-of-fit testing. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations with repeated durations, such as repairable systems, are also discussed.
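
    The nonparametric estimates named above have compact closed forms. The sketch below computes the Kaplan-Meier reliability (survival) function and the Nelson-Aalen cumulative hazard from a made-up right-censored sample; the report's own notation and examples may differ.

    ```python
    import numpy as np

    times    = np.array([2.0, 3.5, 4.0, 4.0, 6.1, 7.3, 9.0])  # made-up durations
    observed = np.array([1,   1,   0,   1,   1,   0,   1])     # 1 = failure, 0 = censored

    surv, cumhaz = 1.0, 0.0
    for t in np.unique(times[observed == 1]):       # distinct failure times, ascending
        d = np.sum((times == t) & (observed == 1))  # failures at time t
        n = np.sum(times >= t)                      # number still at risk
        surv   *= 1.0 - d / n                       # Kaplan-Meier product term
        cumhaz += d / n                             # Nelson-Aalen increment
        print(f"t={t:4.1f}  R(t)={surv:.3f}  H(t)={cumhaz:.3f}")
    ```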

  16. Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring

    SciTech Connect (OSTI)

    Dayman, Kenneth J.; Coble, Jamie B.; Orton, Christopher R.; Schwantes, Jon M.

    2014-01-01

    The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively. The errors seen in this
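
    A minimal sketch of the two-stage scheme, with synthetic nuclide-activity features standing in for ORIGEN-ARP output and scikit-learn models standing in for the paper's specific choices: classify reactor type first, then apply a reactor-type-specific partial least squares model to predict burnup.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(7)
    n, p = 300, 20
    X = rng.random((n, p))                        # nuclide activities (stand-in)
    reactor = rng.integers(0, 2, size=n)          # 0 = PWR, 1 = BWR
    burnup = X @ rng.random(p) + 2.0 * reactor    # synthetic burnup response

    clf = KNeighborsClassifier(n_neighbors=5).fit(X, reactor)
    pls = {r: PLSRegression(n_components=5).fit(X[reactor == r], burnup[reactor == r])
           for r in (0, 1)}                       # one PLS model per reactor type

    x_new = X[:5]
    r_hat = clf.predict(x_new)                    # stage 1: reactor type
    b_hat = [pls[r].predict(x.reshape(1, -1)).item() for r, x in zip(r_hat, x_new)]
    print(b_hat)
    ```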

  17. Experimental control analysis of a fuel gas saturator. Final report. [Multivariable

    SciTech Connect (OSTI)

    Terwilliger, G.E.; Brower, A.S.; Baheti, R.S.; Smith, R.E.; Brown, D.H.

    1985-01-01

    The multivariable control of the clean fuel gas saturator of a coal gasification process has been demonstrated. First principle process models described the process dynamics from which linear models were generated and used for the actual control designs. The multivariable control was designed, its response to transients simulated and the controls were implemented in a computer controller for a fuel gas saturator. The test results obtained for the gas flow transients showed good correlation with the computer simulations, giving confidence in the ability of the simulation to predict the plant performance for other transients. In this study, both time and frequency domain multivariable design techniques were applied to provide the best possible design and to determine their relative effectiveness. No clear guidelines resulted; it appears that the selection may be made on the basis of personal preference, experience or the availability of computer-aided design tools, rather than inherent technical differences. This EPRI/GE fuel gas saturator control demonstration has shown that multivariable design techniques can be applied to a real process and that practical controls are developed. With suitable process models, presently available computer-aided control design software allows the control design, evaluation and implementation to be completed in a reasonable time period. The application of these techniques to power generation processes is recommended.

  18. Statistical Analysis of Transient Cycle Test Results in a 40...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Statistical Analysis of Transient Cycle Test Results in a 40 ... of calibration and measurement methods deer09shade.pdf ... Evaluation of a Partial Flow Dilution System for Transient ...

  19. A Divergence Statistics Extension to VTK for Performance Analysis.

    SciTech Connect (OSTI)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.

  20. Data analysis using the Gnu R system for statistical computation

    SciTech Connect (OSTI)

    Simone, James (Fermilab)

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  1. Statistical Software for spatial analysis of stratigraphic data sets

    Energy Science and Technology Software Center (OSTI)

    2003-04-08

    Stratistics is a tool for statistical analysis of spatially explicit data sets and model output, for description and for model-data comparisons. It is intended for the analysis of data sets commonly used in geology, such as gamma ray logs and lithologic sequences, as well as 2-D data such as maps. Stratistics incorporates a far wider range of spatial analysis methods, drawn from multiple disciplines, than is currently available in other packages. These include techniques from spatial and landscape ecology, fractal analysis, and mathematical geology. Its use should substantially reduce the risk associated with the use of predictive models.

  2. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect (OSTI)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  3. Lifetime statistics of quantum chaos studied by a multiscale analysis

    SciTech Connect (OSTI)

    Di Falco, A.; Krauss, T. F. [School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Fratalocchi, A. [PRIMALIGHT, Faculty of Electrical Engineering, Applied Mathematics and Computational Science, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900 (Saudi Arabia)

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  4. Statistical Analysis of Abnormal Electric Power Grid Behavior

    SciTech Connect (OSTI)

    Ferryman, Thomas A.; Amidan, Brett G.

    2010-10-30

    Pacific Northwest National Laboratory is developing a technique to analyze Phasor Measurement Unit data to identify typical patterns, atypical events, and precursors to a blackout or other undesirable event. The approach combines a data-driven multivariate analysis with an engineering-model approach. The method identifies atypical events, provides a plain-English description of the event, and offers drill-down graphics for detailed investigations. The tool can be applied to the entire grid, individual organizations (e.g., TVA, BPA), or specific substations (e.g., TVA_CUMB). The tool is envisioned for (1) event investigations, (2) overnight processing to generate a Morning Report that characterizes the previous day's activity with respect to activity over the previous 10-30 days, and (3) potentially near-real-time operation to support grid operators. This paper presents the current status of the tool and illustrations of its application to real-world PMU data collected in three 10-day periods in 2007.

  5. Multi-scale statistical analysis of coronal solar activity

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.

  6. Statistical analysis of cascading failures in power grids

    SciTech Connect (OSTI)

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  7. Multivariate Data EXplorer (MDX)

    Energy Science and Technology Software Center (OSTI)

    2012-08-01

    The MDX toolkit facilitates exploratory data analysis and visualization of multivariate datasets. MDX provides an interactive graphical user interface to load, explore, and modify multivariate datasets stored in tabular form. MDX uses an extended version of the parallel coordinates plot and scatterplots to represent the data. The user can perform rapid visual queries using mouse gestures in the visualization panels to select rows or columns of interest. The visualization panel provides coordinated multiple views whereby selections made in one plot are propagated to the other plots. Users can also export selected data or reconfigure the visualization panel to explore relationships between columns and rows in the data.

  8. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    SciTech Connect (OSTI)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally

  9. Statistical Analysis of Tank 5 Floor Sample Results

    SciTech Connect (OSTI)

    Shine, E. P.

    2013-01-31

    Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of measurement results above and below their MDCs. The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs.
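
    For an analyte with all measurements above its MDC and an assumed normal distribution, the one-sided UCL95 reduces to the usual Student-t form; a minimal sketch with hypothetical triplicate results (the EPA guidance cited above covers other distributional cases):

```python
import numpy as np
from scipy import stats

x = np.array([1.92, 2.05, 1.98])   # hypothetical triplicate concentrations
n = len(x)
mean = x.mean()

# One-sided 95% upper confidence limit on the mean.
ucl95 = mean + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)
print(mean, ucl95)
```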

  10. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    SciTech Connect (OSTI)

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that errors in the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
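
    The headline metric is straightforward to compute; a sketch of NRMSE, here normalized by plant capacity (the normalization choice used in the study is an assumption on our part, since several conventions exist):

```python
import numpy as np

def nrmse(forecast, observed, capacity_kw=51.0):
    """RMSE of power forecasts normalized by plant capacity, in percent."""
    forecast = np.asarray(forecast)
    observed = np.asarray(observed)
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    return 100.0 * rmse / capacity_kw

print(nrmse([30.0, 42.0, 18.0], [28.5, 45.0, 20.0]))  # hypothetical kW values
```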

  11. Plutonium metal exchange program : current status and statistical analysis

    SciTech Connect (OSTI)

    Tandon, L.; Eglin, J. L.; Michalak, S. E.; Picard, R. R.; Temer, D. J.

    2004-01-01

    The Rocky Flats Plutonium (Pu) Metal Sample Exchange program was conducted to ensure the quality and intercomparability of measurements such as Pu assay, Pu isotopics, and impurity analyses. The Rocky Flats program was discontinued in 1989 after more than 30 years. In 2001, Los Alamos National Laboratory (LANL) reestablished the Pu Metal Exchange program. In addition to the Atomic Weapons Establishment (AWE) at Aldermaston, six Department of Energy (DOE) facilities (Argonne East, Argonne West, Livermore, Los Alamos, New Brunswick Laboratory, and Savannah River) are currently participating in the program. Plutonium metal samples are prepared and distributed to the sites for destructive measurements to determine elemental concentration, isotopic abundance, and both metallic and nonmetallic impurity levels. The program provides independent verification of analytical measurement capabilities for each participating facility and allows problems in analytical methods to be identified. The current status of the program is discussed, with emphasis on the unique statistical analysis and modeling of the data developed for the program. The discussion includes the definition of the consensus values for each analyte (in the presence and absence of anomalous values and/or censored values), and interesting features of the data and the results.

  12. Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties

    SciTech Connect (OSTI)

    Chiang, K. Y.; Hu, L. W.; Forget, B.

    2012-07-01

    The MIT Research Reactor (MITR) is evaluating the conversion from highly enriched uranium (HEU) to low enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits, via the use of engineering hot channel factors (EHCFs), was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis by statistically combining engineering uncertainties, with an aim to eliminate unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined by the avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were treated as normal distributions using Oracle Crystal Ball to calculate ONB. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60 deg. C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs with the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using the onset of flow instability (OFI) as the criterion, to verify that adequate safety margin exists between the LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
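
    The statistical propagation itself is a plain Monte Carlo exercise; a generic sketch follows (the thermal model and parameter values below are placeholders, not MITR data or the study's actual correlations):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical uncertain inputs, sampled as normal distributions.
gap = rng.normal(2.0, 0.05, N)   # coolant channel gap [mm]
htc = rng.normal(1.0, 0.08, N)   # heat transfer coefficient multiplier

# Placeholder model: allowable power before ONB as a function of inputs.
allowable_power = 9.5 * htc * (gap / 2.0) ** 0.5

# A 99.7%-confidence limit: the 0.3rd percentile of allowable power,
# so that 99.7% of sampled input combinations remain above the setting.
lsss = np.percentile(allowable_power, 0.3)
print(lsss)
```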

  13. Deep Data Analysis of Conductive Phenomena on Complex Oxide Interfaces...

    Office of Scientific and Technical Information (OSTI)

    This approach conjoins multivariate statistical analysis with physics-based ... local transport and other functional phenomena in other spatially inhomogeneous systems. ...

  14. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect (OSTI)

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
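
    A minimal sketch of the underlying idea using scikit-learn's LDA implementation; the log lines and topic count are hypothetical, and the original work may differ substantially in preprocessing and scoring:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical network-log "documents" (one per session or host).
logs = [
    "dns query example.com udp 53",
    "http get /index.html 200",
    "smtp connect mail.example.org tls",
]

counts = CountVectorizer().fit_transform(logs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
mixtures = lda.fit_transform(counts)   # per-document topic mixtures

# Sessions whose topic mixture is improbable under the fitted model
# can be flagged as candidate exfiltration events for manual review.
print(mixtures)
```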

  15. NEPA litigation 1988-1995: A detailed statistical analysis

    SciTech Connect (OSTI)

    Reinke, D.C.; Robitaille, P.

    1997-08-01

    The intent of this study was to identify trends and lessons learned from litigated NEPA documents and to compare and contrast these trends among Federal agencies. More than 350 NEPA cases were collected, reviewed, and analyzed. Of the NEPA cases reviewed, more than 170 were appeals or Supreme Court cases, mostly from the late 1980s through 1995. For this time period, the sampled documents represent the majority of the appeals court cases and all of the Supreme Court cases. Additionally, over 170 district court cases were examined as a representative sample of district court decisions on NEPA. Cases on agency actions found to need NEPA documentation (but that had none) and cases on NEPA documents that were found to be inadequate were pooled and examined to determine the factors responsible for these findings. The inadequate documents were specifically examined to determine whether there were any general trends. The results are shown in detailed statistical terms. Generally, when a Federal agency has some type of NEPA documentation (e.g., CX, EA, or EIS) and at least covers the basic NEPA procedural requirements, the agency typically wins the litigation. NEPA documents that lose generally have serious errors of omission. An awareness and understanding of these errors of omission can help Federal agencies ensure that they produce winners a greater percentage of the time.

  16. Templates and Examples — Statistics and Search Log Analysis

    Office of Energy Efficiency and Renewable Energy (EERE)

    Here you will find custom templates and EERE-specific examples you can use to plan, conduct, and report on your usability and analysis activities. These templates are examples of forms you might use, but you are not required to use them for EERE products.

  17. An efficient parallel sampling technique for Multivariate Poisson-Lognormal model: Analysis with two crash count datasets

    SciTech Connect (OSTI)

    Zhan, Xianyuan; Aziz, H. M. Abdul; Ukkusuri, Satish V.

    2015-11-19

    Our study investigates the Multivariate Poisson-lognormal (MVPLN) model that jointly models crash frequency and severity, accounting for correlations. Ordinary univariate count models analyze crashes of different severity levels separately, ignoring the correlations among severity levels. The MVPLN model is capable of incorporating a general correlation structure and accounts for the over-dispersion in the data, which leads to a superior data fit. However, the traditional estimation approach for the MVPLN model is computationally expensive, which often limits its use in practice. In this work, a parallel sampling scheme is introduced to improve the original Markov Chain Monte Carlo (MCMC) estimation approach of the MVPLN model, which significantly reduces the model estimation time. Two MVPLN models are developed using pedestrian-vehicle crash data collected in New York City from 2002 to 2006, and highway-injury data from Washington State (5-year data from 1990 to 1994). The Deviance Information Criterion (DIC) is used to evaluate the model fit. The estimation results show that the MVPLN models provide a superior fit over univariate Poisson-lognormal (PLN), univariate Poisson, and Negative Binomial models. Moreover, the correlations among the latent effects of different severity levels are found to be significant in both datasets, which justifies the importance of jointly modeling crash frequency and severity while accounting for correlations.
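
    The data-generating process the MVPLN model assumes is easy to state; a sketch simulating correlated, over-dispersed counts for two severity levels (the covariance values and intercept are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites = 1000

# Correlated lognormal latent effects across two severity levels.
cov = np.array([[0.5, 0.3],
                [0.3, 0.5]])
eps = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_sites)

# Log-link with a hypothetical intercept; Poisson counts on top.
lam = np.exp(0.2 + eps)
counts = rng.poisson(lam)   # over-dispersed and correlated across severities

print(np.corrcoef(counts.T))  # nonzero off-diagonal: the joint structure
```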

  18. An efficient parallel sampling technique for Multivariate Poisson-Lognormal model: Analysis with two crash count datasets

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Zhan, Xianyuan; Aziz, H. M. Abdul; Ukkusuri, Satish V.

    2015-11-19

    Our study investigates the Multivariate Poisson-lognormal (MVPLN) model that jointly models crash frequency and severity, accounting for correlations. Ordinary univariate count models analyze crashes of different severity levels separately, ignoring the correlations among severity levels. The MVPLN model is capable of incorporating a general correlation structure and accounts for the over-dispersion in the data, which leads to a superior data fit. However, the traditional estimation approach for the MVPLN model is computationally expensive, which often limits its use in practice. In this work, a parallel sampling scheme is introduced to improve the original Markov Chain Monte Carlo (MCMC) estimation approach of the MVPLN model, which significantly reduces the model estimation time. Two MVPLN models are developed using pedestrian-vehicle crash data collected in New York City from 2002 to 2006, and highway-injury data from Washington State (5-year data from 1990 to 1994). The Deviance Information Criterion (DIC) is used to evaluate the model fit. The estimation results show that the MVPLN models provide a superior fit over univariate Poisson-lognormal (PLN), univariate Poisson, and Negative Binomial models. Moreover, the correlations among the latent effects of different severity levels are found to be significant in both datasets, which justifies the importance of jointly modeling crash frequency and severity while accounting for correlations.

  19. U.S. Energy Information Administration Independent Statistics & Analysis

    U.S. Energy Information Administration (EIA) Indexed Site

    EIA's crude-by-rail data. Presented at the EIA Energy Conference, June 16, 2015, Washington, DC, by Mindi Farber-DeAnda, Biofuels and Emerging Technologies Team, Office of Petroleum, Natural Gas, and Biofuels Analysis. Takeaways: at the end of March, EIA published monthly crude-by-rail (CBR) data along with its monthly petroleum supply balances; EIA monthly data provides credible and publicly available information on CBR movements, including historical monthly data starting in 2010; inter-regional CBR

  20. A Statistical Analysis Of Bottom-Hole Temperature Data In The...

    Open Energy Info (EERE)

    Journal Article: A Statistical Analysis Of Bottom-Hole Temperature Data In The Hinton Area Of West-Central Alberta. OpenEI Reference Library.

  1. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect (OSTI)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  2. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    SciTech Connect (OSTI)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; Hsu, Peter C.; Reynolds, John G.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment, as well as the use of different instruments of varying age. The results appear to be a good representation of the broader safety testing community, based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  3. Differentiation of Microbial Species and Strains in Coculture Biofilms by Multivariate Analysis of Laser Desorption Postionization Mass Spectra

    SciTech Connect (OSTI)

    University of Illinois at Chicago; Montana State University; Bhardwaj, Chhavi; Cui, Yang; Hofstetter, Theresa; Liu, Suet Yi; Bernstein, Hans C.; Carlson, Ross P.; Ahmed, Musahid; Hanley, Luke

    2013-04-01

    Vacuum ultraviolet (VUV) photon energies from 7.87 to 10.5 eV were used in laser desorption postionization mass spectrometry (LDPI-MS) to analyze biofilms comprised of binary cultures of interacting microorganisms. The effect of photon energy was examined using both tunable synchrotron and laser sources of VUV radiation. Principal components analysis (PCA) was applied to the MS data to differentiate species in Escherichia coli-Saccharomyces cerevisiae coculture biofilms. PCA of LDPI-MS also differentiated individual E. coli strains in a biofilm comprised of two interacting gene deletion strains, even though these strains differed from the wild type K-12 strain by no more than four gene deletions each out of approximately 2000 genes. PCA treatment of 7.87 eV LDPI-MS data separated the E. coli strains into three distinct groups: two "pure" groups and a mixed region. Furthermore, the "pure" regions of the E. coli cocultures showed greater variance by PCA when analyzed at 7.87 eV photon energy than at 10.5 eV. Comparison of the 7.87 and 10.5 eV data is consistent with the expectation that the lower photon energy selects a subset of low ionization energy analytes while 10.5 eV is more inclusive, detecting a wider range of analytes. These two VUV photon energies therefore give different spreads via PCA, and their respective use in LDPI-MS constitutes an additional experimental parameter to differentiate strains and species.
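
    The PCA step itself is standard; a minimal sketch with a random matrix standing in for the LDPI-MS spectra (rows: spectra, columns: m/z bins):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
spectra = rng.random((30, 500))   # hypothetical stand-in for measured spectra

scores = PCA(n_components=3).fit_transform(spectra)

# Plotting scores[:, 0] against scores[:, 1] is how the strain/species
# clusters described above would be visualized.
print(scores.shape)
```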

  4. Statistical Analysis of Transient Cycle Test Results in a 40 CFR Part 1065

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Effects of "new" engine testing procedures (40 CFR Part 1065) on the repeatability of transient engine dynamometer tests were examined, as well as the effects of calibration and measurement methods. (deer09_shade.pdf)

  5. STATISTICAL ANALYSIS OF CURRENT SHEETS IN THREE-DIMENSIONAL MAGNETOHYDRODYNAMIC TURBULENCE

    SciTech Connect (OSTI)

    Zhdankin, Vladimir; Boldyrev, Stanislav; Uzdensky, Dmitri A.; Perez, Jean C.

    2013-07-10

    We develop a framework for studying the statistical properties of current sheets in numerical simulations of magnetohydrodynamic (MHD) turbulence with a strong guide field, as modeled by reduced MHD. We describe an algorithm that identifies current sheets in a simulation snapshot and then determines their geometrical properties (including length, width, and thickness) and intensities (peak current density and total energy dissipation rate). We then apply this procedure to simulations of reduced MHD and perform a statistical analysis on the obtained population of current sheets. We evaluate the role of reconnection by separately studying the populations of current sheets which contain magnetic X-points and those which do not. We find that the statistical properties of the two populations are different in general. We compare the scaling of these properties to phenomenological predictions obtained for the inertial range of MHD turbulence. Finally, we test whether the reconnecting current sheets are consistent with the Sweet-Parker model.

  6. Statistical planning and analysis for treatments of tar sand wastewater. Final report

    SciTech Connect (OSTI)

    Pirie, W.R.

    1984-03-01

    The first part of this report discusses the overall statistical planning, coordination and design for several tar sand wastewater treatment projects contracted by the Laramie Energy Technology Center (LETC) of the Department of Energy. A general discussion of the benefits of consistent statistical design and analysis for data-oriented projects is included, with recommendations for implementation. A detailed outline of the principles of general linear models design is followed by an introduction to recent developments in general linear models by ranks (GLMR) analysis and a comparison to standard analysis using Gaussian or normal theory (GLMN). A listing of routines contained in the VPI Nonparametric Statistics Package (NPSP), installed on the Cyber computer system at the University of Wyoming is included. Part 2 describes in detail the design and analysis for treatments by Gas Flotation, Foam Separation, Coagulation, and Ozonation, with comparisons among the first three methods. Rank methods are used for most analyses, and several detailed examples are included. For optimization studies, the powerful tools of response surface analysis (RSA) are employed, and several sections contain discussion on the benefits of RSA. All four treatment methods proved to be effective for removal of TOC and suspended solids from the wastewater. Because the processes and equipment designs were new, optimum removals were not achieved by these initial studies and reasons for that are discussed. Pollutant levels were nevertheless reduced to levels appropriate for recycling within the process, and for such reuses as steam generation, according to the DOE/LETC project officer. 12 refs., 8 figs., 21 tabs.

  7. Statistical analysis of multipole components in the magnetic field of the RHIC arc regions

    SciTech Connect (OSTI)

    Beebe-Wang,J.; Jain, A.

    2009-05-04

    The existence of multipolar components in the dipole and quadrupole magnets is one of the factors limiting beam stability in RHIC operations. Therefore, the statistical properties of the non-linear fields are crucial for understanding the beam behavior and for achieving superior performance in RHIC. In an earlier work [1], the field quality analysis of the RHIC interaction regions (IR) was presented. Furthermore, a procedure for developing non-linear IR models constructed from measured multipolar data of RHIC IR magnets was described. However, the field quality in the regions outside of the RHIC IR had not yet been addressed. In this paper, we present the statistical analysis of multipolar components in the magnetic fields of the RHIC arc regions. The emphasis is on the lower order components, especially the sextupole in the arc dipole and the 12-pole in the quadrupole magnets, since they are shown to have the strongest effects on beam stability. Finally, the inclusion of the measured multipolar component data of the RHIC arc regions and their statistical properties into tracking models is discussed.

  8. HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Pettit, Frank K.; Bowie, James U. [DOE-Molecular Biology Institute]

    HotPatch finds unusual patches on the surface of proteins, and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temp. factors) are replaced by patch indices, with statistical scores given in the PDB file's header remarks, and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]

  9. Stochastic response analysis of tension leg platform: A statistical quadratization and cubicization approach

    SciTech Connect (OSTI)

    Kareem, A.; Zhao, J.

    1994-12-31

    The nonlinearities in the wind and wave loadings of compliant offshore platforms and in their structural characteristics result in response statistics that deviate from a Gaussian distribution. This paper focuses on the analysis of the response of these structures to random nonlinear wind and wave loads. As an improvement over the commonly used linearization approach, an equivalent statistical quadratization (ESQ) and cubicization (ESC) approach is presented. The nonlinear loading or structural characteristics can be expressed in terms of an equivalent polynomial that contains terms up to quadratic or cubic order, depending on the type of nonlinearity. The response statistics and cumulants are based on Volterra theory. A direct integration scheme is utilized to evaluate the response cumulants. The results compare well with simulation. It is noted that the ESQ provides an accurate description of systems with asymmetrical nonlinearities, whereas for symmetrical nonlinearities the ESC provides a good representation. Based on the information on higher-order cumulants, the response pdf, crossing rates and peak value distributions can be derived.

  10. An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.

    SciTech Connect (OSTI)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2015-11-01

    In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.

  11. In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units

    SciTech Connect (OSTI)

    Ranjan, Niloo; Sanyal, Jibonananda; New, Joshua Ryan

    2013-08-01

    Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is a part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data, covering monthly, daily, hourly, and 15-minute time steps for each annual simulation, are produced. A total of 270 TB+ of data has been produced. In this project, the simulation data is statistically analyzed in-situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation, leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
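
    Streaming moments of the kind described are usually accumulated with a one-pass update; Welford's update is the standard numerically stable form, sketched here in plain Python (whether the project's CUDA kernels use exactly this recurrence is an assumption):

```python
def welford_update(count, mean, m2, x):
    """One streaming update of (count, mean, sum of squared deviations)."""
    count += 1
    delta = x - mean
    mean += delta / count
    m2 += delta * (x - mean)
    return count, mean, m2

count, mean, m2 = 0, 0.0, 0.0
for x in [2.0, 4.0, 4.0, 5.0]:        # stand-in for simulation outputs
    count, mean, m2 = welford_update(count, mean, m2, x)

variance = m2 / (count - 1)            # sample variance; std is its sqrt
print(mean, variance)
```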

  12. Interactive statistical-distribution-analysis program utilizing numerical and graphical methods

    SciTech Connect (OSTI)

    Glandon, S. R.; Fields, D. E.

    1982-04-01

    The TERPED/P program is designed to facilitate the quantitative analysis of experimental data, determine the distribution function that best describes the data, and provide graphical representations of the data. This code differs from its predecessors, TEDPED and TERPED, in that a printer-plotter has been added for graphical output flexibility. The addition of the printer-plotter provides TERPED/P with a method of generating graphs that is not dependent on DISSPLA, Integrated Software Systems Corporation's confidential proprietary graphics package. This makes it possible to use TERPED/P on systems not equipped with DISSPLA. In addition, the printer plot is usually produced more rapidly than a high-resolution plot can be generated. Graphical and numerical tests are performed on the data in accordance with the user's assumption of normality or lognormality. Statistical analysis options include computation of the chi-squared statistic and its significance level and the Kolmogorov-Smirnov one-sample test confidence level for data sets of more than 80 points. Plots can be produced on a Calcomp paper plotter, a FR80 film plotter, or a graphics terminal using the high-resolution, DISSPLA-dependent plotter or on a character-type output device by the printer-plotter. The plots are of cumulative probability (abscissa) versus user-defined units (ordinate). The program was developed on a Digital Equipment Corporation (DEC) PDP-10 and consists of 1500 statements. The language used is FORTRAN-10, DEC's extended version of FORTRAN-IV.

  13. Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region

    SciTech Connect (OSTI)

    Kobilarov, R. G.

    2014-11-18

    Statistical analysis of the data set consisting of the activity concentrations of {sup 137}Cs in soils in the Bansko–Razlog region was carried out in order to establish the dependence of the deposition and the migration of {sup 137}Cs on the soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the activity of {sup 137}Cs in soils were observed. After reduction of the effects of outliers, the data set was divided into two parts, depending on the soil type. A test of normality of the two new data sets shows that they have normal distributions. The ordinary kriging technique was used to characterize the spatial distribution of the activity of {sup 137}Cs over an area covering 40 km{sup 2} (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of {sup 137}Cs) can be used as a reference point for future studies on the assessment of radiological risk to the population and the erosion of soils in the study area.

  14. Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis

    SciTech Connect (OSTI)

    Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel

    2016-01-01

    An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.

  15. Statistical Methods and Software for the Analysis of Occupational Exposure Data with Non-detectable Values

    SciTech Connect (OSTI)

    Frome, EL

    2005-09-20

    Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
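
    The left-censored maximum likelihood fit combines the lognormal density for detects with the lognormal CDF at the detection limit for non-detects; a minimal sketch (the exposure values are hypothetical, and the report's R-based implementation may differ in detail):

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical exposures; non-detects carry their detection limit.
x = np.array([0.30, 0.51, 0.12, 0.25, 0.10, 0.10])
detected = np.array([True, True, True, True, False, False])

def negloglik(params):
    mu, sigma = params
    logx = np.log(x)
    # Lognormal density for detects, lognormal CDF at the DL for non-detects.
    ll_det = stats.norm.logpdf(logx[detected], mu, sigma) - logx[detected]
    ll_cen = stats.norm.logcdf(logx[~detected], mu, sigma)
    return -(ll_det.sum() + ll_cen.sum())

res = optimize.minimize(negloglik, x0=[np.log(x).mean(), 1.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(mu_hat, sigma_hat)
```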

  16. Multivariable analysis of the effects of Li, H{sub 2}, and pH on PWR primary water stress corrosion cracking. Final report

    SciTech Connect (OSTI)

    Eason, E.D.; Merton, A.A.; Wright, J.E.

    1996-05-01

    The effects of Li, pH, and H{sub 2} on primary water stress corrosion cracking (PWSCC) of Alloy 600 were investigated for temperatures between 320 and 330{degrees}C. Specimens included in the study were reverse U-bends (RUBs) made from several different heats of Alloy 600. The characteristic life, {eta}, which represents the time until 63.2% of the population initiates PWSCC, was computed using a modified Weibull statistical analysis algorithm and was analyzed for effects of the water chemistry variables previously mentioned. It was determined that the water chemistry variables have a smaller effect than the metallurgical characteristics defined by the heat, heat treatment and initial stress state of the specimen (diameter and style of RUB); the maximum impact of chemistry effects was 0.13 to 0.59 standard deviations, compared to a range of three (3) standard deviations for all variables. A first-order model was generated to estimate the effect of changes in pH, Li, and H{sub 2} concentrations on the characteristic life. The characteristic time to initiate cracks, {eta}, is not sensitive to Li and H{sub 2} concentrations in excess of 3.5 ppm and 25 ml/kg, respectively. Below these values, (1) {eta} decreases by {approximately}20% when [Li] is increased from 0.7 to 3.5 ppm; (2) {eta} decreases by {approximately}9% when [H{sub 2}] is increased from 13.1 to 25.0 ml/kg; and (3) {eta} decreases by {approximately}14% when pH is increased from 7.0 to 7.4, in each case holding the other two variables constant.
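
    The characteristic life {eta} is simply the Weibull scale parameter (the time by which 63.2% of specimens have failed); a sketch of estimating it from crack-initiation times with SciPy (the times are hypothetical, and the study's modified Weibull algorithm is not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical crack-initiation times [h] for one heat/chemistry condition.
times = np.array([950.0, 1200.0, 1420.0, 1650.0, 1800.0, 2100.0])

# Two-parameter Weibull fit (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(times, floc=0.0)

eta = scale   # by this time, 63.2% of the population has initiated PWSCC
print(shape, eta)
```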

  17. Characterization and analysis of metal wastage in coal-fired fluidized-bed combustors. Statistical analysis plan

    SciTech Connect (OSTI)

    Nazemi, A.H.; Smith, E.P.

    1985-11-01

    Metal loss from in-bed heat exchangers has been a persistent problem in FBC systems. As part of its program in FBC technology development, the US Department of Energy/Morgantown Energy Technology Center (DOE/METC) supports a number of projects directed toward providing both theoretical and experimental results which will guide solutions to the metal loss problem. As a part of this effort, METC and The MITRE Corporation began a project in 1984 to collect and analyze metal loss data from various experimental, pilot-scale, and full-scale coal-fired FBC systems worldwide. The specific objective of this effort is to investigate the effects of unit design parameters and operating conditions on metal loss through the use of regression and analysis of variance techniques. To date, forty-one FBC systems worldwide have been identified and most of the data sets which characterize the metal loss occurrences in those units have been developed. The results of MITRE's effort to date were reported earlier (Interim Report No. DOE/MC/21351-1930, August 1985). This report describes the statistical procedures that MITRE will follow to analyze FBC metal loss data. The data will be analyzed using several regression techniques to find variables related to metal loss. Correlation and single variable regressions will be used to indicate important relationships. The joint relationships between the explanatory variables and metal loss will be examined by building multiple regression models. In order to prevent erroneous conclusions, diagnostics will be performed based on partial residual plots, residual analysis, and multicollinearity statistics. 7 refs.

  18. Statistical Analysis of Baseline Load Models for Non-Residential Buildings

    SciTech Connect (OSTI)

    Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila

    2008-11-10

    Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
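
    A minimal sketch of one common baseline form of the kind evaluated here: an average over recent non-event days with a scalar morning adjustment (the array shapes, day count, and adjustment window are illustrative assumptions, not the paper's exact specification):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hourly loads [kW] for the 10 most recent non-event days, shape (10, 24).
loads = 100 + 10 * rng.random((10, 24))
actual = 100 + 10 * rng.random(24)     # load on the event day

baseline = loads.mean(axis=0)          # unadjusted average baseline

# Morning adjustment: scale by actual/predicted over pre-event hours.
morning = slice(8, 11)                 # e.g., 8:00-11:00, before the event
factor = actual[morning].mean() / baseline[morning].mean()
adjusted_baseline = baseline * factor
```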

  19. Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.

    2009-01-01

    Until recently, microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study, where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.

  20. STAC -- a new Swedish code for statistical analysis of cracks in SG-tubes

    SciTech Connect (OSTI)

    Poern, K.

    1997-02-01

    Steam generator (SG) tubes in pressurized water reactor plants are exposed to various types of degradation processes, among which stress corrosion cracking in particular has been observed. To be able to evaluate the safety importance of such cracking of SG-tubes, one has to have good, empirically founded knowledge about the scope and the size of the cracks as well as the rate of their continuous growth. The basis of experience is to a large extent constituted by the annually performed SG-inspections and crack sizing procedures. On the basis of this experience one can estimate the distribution of existing crack lengths, and modify this distribution with regard to maintenance (plugging) and the predicted rate of crack propagation. Finally, one can calculate the rupture probability of SG-tubes as a function of a given critical crack length. On behalf of the Swedish Nuclear Power Inspectorate, an introductory study was performed in order to survey what has been done elsewhere in this field. The study resulted in a proposal for a computerizable model to estimate the distribution of true cracks, to modify this distribution due to crack growth, and to compute the probability of tube rupture. The model has now been implemented in a computer code, called STAC (STatistical Analysis of Cracks). This paper aims to give a brief outline of the model to facilitate the understanding of the possibilities and limitations associated with the model.

  1. Mathematical and statistical analysis of the effect of boron on yield parameters of wheat

    SciTech Connect (OSTI)

    Rawashdeh, Hamzeh; Sala, Florin; Boldea, Marius

    2015-03-10

    The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on yield and yield parameters of wheat. The contribution of boron in achieving yield parameters is described by second degree polynomial equations, with high statistical confidence (p<0.01; F theoretical < F calculated, according to the ANOVA test, for Alpha = 0.05). Regression analysis, based on the R{sup 2} values obtained, made it possible to evaluate the particular contribution of boron to the realization of yield parameters. This was lower for spike length (R{sup 2} = 0.812) and thousand seed weight (R{sup 2} = 0.850), and higher in the case of the number of spikelets (R{sup 2} = 0.936) and the number of seeds per spike (R{sup 2} = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to the process of flower fertilization is well known. With regard to productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R{sup 2} = 0.868.
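
    A sketch of a second-degree polynomial fit and its R{sup 2}, mirroring the kind of regression analysis described (the dose-response numbers below are hypothetical, not the study's data):

```python
import numpy as np

b = np.array([0.0, 0.5, 1.0, 1.5, 2.0])       # boron doses (hypothetical)
y = np.array([18.1, 19.6, 21.0, 21.4, 20.7])  # seeds per spike (hypothetical)

coeffs = np.polyfit(b, y, deg=2)              # second-degree polynomial
yhat = np.polyval(coeffs, b)

# Coefficient of determination.
r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
print(coeffs, r2)
```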

  2. Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.

    2009-01-01

    Until recently, microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study, where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.

  3. Dose impact in radiographic lung injury following lung SBRT: Statistical analysis and geometric interpretation

    SciTech Connect (OSTI)

    Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan

    2014-03-15

    been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.

  4. Statistical Sciences

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Statistical Sciences provides statistical reasoning and rigor to multidisciplinary scientific investigations and the development, application, and communication of cutting-edge statistical sciences research. Group Leader: James Gattiker (Acting). Deputy Group Leader: Geralyn Hemphill (Acting). Group Administrator: LeeAnn Martinez, (505) 667-3308.

  5. Impacts of different data averaging times on statistical analysis of distributed domestic photovoltaic systems

    SciTech Connect (OSTI)

    Widen, Joakim; Waeckelgaard, Ewa; Paatero, Jukka; Lund, Peter

    2010-03-15

    The trend of increasing application of distributed generation with solar photovoltaics (PV-DG) suggests that widespread integration in existing low-voltage (LV) grids is possible in the future. With massive integration in LV grids, a major concern is the possible negative impact of excess power injection from on-site generation. For power-flow simulations of such grid impacts, an important consideration is the time resolution of demand and generation data. This paper investigates the impact of time averaging on high-resolution data series of domestic electricity demand and PV-DG output, and on voltages in a simulated LV grid. Effects of 10-minute and hourly averaging on descriptive statistics and duration curves were determined. Although time averaging has a considerable impact on the statistical properties of the demand in individual households, the impact is smaller on aggregate demand, already smoothed by random coincidence, and on PV-DG output. Consequently, the statistical distribution of simulated grid voltages was also robust against time averaging. The overall judgement is that statistical investigation of voltage variations in the presence of PV-DG does not require higher resolution than hourly. (author)
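
    The averaging step itself is a one-liner with pandas; a sketch assuming a high-resolution series with a DatetimeIndex (the 10-second resolution and series values are hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical 10-second PV output series for one day.
idx = pd.date_range("2010-06-01", periods=8640, freq="10s")
p = pd.Series(np.random.rand(8640), index=idx)

p_10min = p.resample("10min").mean()   # 10-minute averages
p_hour = p.resample("60min").mean()    # hourly averages
```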

  6. Time varying, multivariate volume data reduction

    SciTech Connect (OSTI)

    Ahrens, James P; Fout, Nathaniel; Ma, Kwan - Liu

    2010-01-01

    Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. Our solution addresses the need for data reduction in large supercomputing environments, where data resulting from simulations occupies tremendous amounts of storage. It employs a lossy encoding scheme to achieve data reduction, with several options in terms of rate-distortion behavior. We focus on encoding multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and with rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs. temporal compression is data dependent, although temporal compression has the
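
    Vector quantization itself is compact to illustrate; a sketch using SciPy's k-means codebook, with a hypothetical block of voxel vectors (the paper's GPU decompression pipeline is beyond this sketch):

```python
import numpy as np
from scipy.cluster.vq import kmeans, vq

rng = np.random.default_rng(2)

# Hypothetical data: 10,000 voxel blocks, each a 4-component vector.
blocks = rng.random((10_000, 4))

codebook, _ = kmeans(blocks, 256)     # learn a 256-entry codebook
codes, _ = vq(blocks, codebook)       # each block stored as a 1-byte index
reconstructed = codebook[codes]       # lossy decompression

print(np.mean((blocks - reconstructed) ** 2))  # distortion of the encoding
```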

  7. Statistical analysis of an inter-laboratory comparison of small-scale safety and thermal testing of RDX

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; Pollard, Colin J.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Phillips, Jason J.; Shelley, Timothy J.; Reyes, Jose A.; et al

    2014-11-17

    In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.

  8. Usage Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Genepool cluster usage statistics, reporting utilization by group and pending jobs over daily, weekly, monthly, quarterly, yearly, and 2-year periods. Last edited: 2013-09-26.

  9. APS Operational Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Downtime log and yearly operation statistics for 2008 through 2016.

  10. Statistical analysis of surface lineaments and fractures for characterizing naturally fractured reservoirs

    SciTech Connect (OSTI)

    Guo, Genliang; George, S.A.; Lindsey, R.P.

    1997-08-01

    Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. The distribution providing the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by PearsonVI, PearsonV, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, PearsonVI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
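
    A sketch of the fit-and-rank loop using SciPy, with hypothetical lineament lengths and a lognormal candidate (each of the paper's 25 candidate distributions would be scored this way and the lowest chi-square retained):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lengths = rng.lognormal(mean=1.0, sigma=0.5, size=200)  # hypothetical data

params = stats.lognorm.fit(lengths)            # fit one candidate distribution

obs, edges = np.histogram(lengths, bins=10)
cdf = stats.lognorm.cdf(edges, *params)
exp = np.diff(cdf) * len(lengths)
exp *= obs.sum() / exp.sum()                   # match totals for the test

chi2, p = stats.chisquare(obs, exp, ddof=len(params))
print(chi2, p)                                 # lower chi2 = better fit
```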

  11. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    SciTech Connect (OSTI)

    Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D

    2014-06-01

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without an immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC), and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover; the same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans the respiratory signals show determinism and nonlinear stationarity, and statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC, and LLE measure respiratory signal complexity; these nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window; a longer respiratory period is conducive to signal reproducibility as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity given its deterministic chaotic nature, in contrast with measures based on harmonic analysis, which are blind to nonlinear features. Dynamics of breathing, so crucial for
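
    For readers unfamiliar with these measures, the sketch below illustrates two of them, delay embedding and sample entropy, on a synthetic breathing-like trace; the tolerance, delay, and embedding dimension are illustrative defaults, not the study's settings.

        # Delay embedding and sample entropy for a 1-D respiratory trace: a sketch.
        import numpy as np

        def delay_embed(x, dim=3, tau=5):
            """Delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        def sample_entropy(x, m=2, r=None):
            """-log of the probability that sequences matching for m points
            (Chebyshev distance <= r) still match for m + 1 points."""
            r = 0.2 * np.std(x) if r is None else r
            def matches(dim):
                emb = delay_embed(x, dim=dim, tau=1)
                d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
                return np.sum(d <= r) - len(emb)        # exclude self-matches
            return -np.log(matches(m + 1) / matches(m))

        t = np.arange(1000) * 0.05                      # 50 s sampled at 20 Hz
        trace = np.sin(2 * np.pi * t / 4.0) + 0.1 * np.random.randn(t.size)
        print("SampEn:", round(sample_entropy(trace), 3))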

  12. Multivariate Spatio-Temporal Clustering of Times-Series Data: An Approach for Diagnosing Cloud Properties and Understandin...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Multivariate Spatio-Temporal Clustering of Times-Series Data: An Approach for Diagnosing Cloud Properties and Understanding ARM Site Representativeness F. M. Hoffman and W. W. Hargrove Oak Ridge National Laboratory Oak Ridge, Tennessee A. D. Del Genio National Aeronautics and Space Administration Goddard Institute for Space Studies Columbia University, New York Multivariate Clustering A multivariate statistical clustering technique based on the iterative k-means algorithm of Hartigan (Hartigan

  13. Rebound 2007: Analysis of U.S. Light-Duty Vehicle Travel Statistics

    SciTech Connect (OSTI)

    Greene, David L

    2010-01-01

    U.S. national time series data on vehicle travel by passenger cars and light trucks covering the period 1966-2007 are used to test for the existence, size, and stability of the rebound effect of motor vehicle fuel efficiency on vehicle travel. The data show a statistically significant effect of gasoline price on vehicle travel but do not support the existence of a direct impact of fuel efficiency on vehicle travel. Additional tests indicate that fuel price effects have not been constant over time, although the hypothesis of symmetry with respect to price increases and decreases is not rejected. Small and Van Dender's (2007) model of a rebound effect that declines with income is tested and similar results are obtained.

  14. STATISTICAL ANALYSIS OF INTERFEROMETRIC MEASUREMENTS OF AXIS RATIOS FOR CLASSICAL Be STARS

    SciTech Connect (OSTI)

    Cyr, R. P.; Jones, C. E.; Tycner, C.

    2015-01-20

    This work presents a novel method to estimate the effective opening angle of classical Be (CBe) star disks from projected axis ratio measurements obtained by interferometry, using a statistical approach. A Monte Carlo scheme was used to generate a large set of theoretical axis ratios from disk models using different distributions of disk densities and opening angles. These theoretical samples were then compared to observational samples, using a two-sample Kolmogorov-Smirnov test, to determine which theoretical distribution best reproduces the observations. The results suggest that the observed ratio distributions in the K, H, and N bands can best be explained by the presence of thin disks, with opening half-angles of the order of 0.°15-4.°0. Results for measurements over the Hα line point toward slightly thicker disks, 3.°7-14°, which is consistent with a flaring disk predicted by the viscous disk model.
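
    A minimal sketch of the comparison scheme follows, assuming a simplified oblate-spheroid projection in place of the paper's disk models: simulate projected axis ratios over a grid of opening half-angles and keep the angle whose distribution best matches the observations under a two-sample Kolmogorov-Smirnov test.

        # Match simulated projected axis ratios to observed ones with a KS test.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def simulated_ratios(half_angle_deg, n=10000):
            q0 = np.tan(np.radians(half_angle_deg))   # intrinsic thickness ratio
            cos_i = rng.uniform(0, 1, n)              # random disk orientations
            return np.sqrt(q0**2 * (1 - cos_i**2) + cos_i**2)  # projected axis ratio

        observed = simulated_ratios(2.0, n=40)        # stand-in for an interferometric sample
        grid = np.arange(0.5, 15.0, 0.5)
        best = min(grid, key=lambda a: stats.ks_2samp(simulated_ratios(a), observed).statistic)
        print("best-fit opening half-angle ~", best, "deg")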

  15. International Energy Statistics

    U.S. Energy Information Administration (EIA) Indexed Site

    Eia.gov BETA - Data - U.S. Energy Information Administration (EIA) U.S. Energy Information Administration - EIA - Independent Statistics and Analysis Sources & Uses Petroleum &...

  16. Analysis of hyper-spectral data derived from an imaging Fourier transform: A statistical perspective

    SciTech Connect (OSTI)

    Sengupta, S.K.; Clark, G.A.; Fields, D.J.

    1996-01-10

    Fourier transform spectrometers (FTS) using optical sensors are increasingly being used in various branches of science. Typically, an FTS generates a three-dimensional data cube with two spatial dimensions and one frequency/wavelength dimension. The number of frequency dimensions in such data cubes is generally very large, often in the hundreds, making data analytical procedures extremely complex. In the present report, the problem is viewed from a statistical perspective. A set of procedures based on the high degree of inter-channel correlation structure often present in such hyper-spectral data has been identified and applied to an example data set of dimension 100 x 128 x 128 comprising 128 spectral bands. It is shown that in this case, the special eigen-structure of the correlation matrix has allowed the authors to extract just a few linear combinations of the channels (the significant principal vectors) that effectively contain almost all of the spectral information contained in the data set analyzed. This, in turn, enables them to segment the objects in the given spatial frame using, in a parsimonious yet highly effective way, most of the information contained in the data set.
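
    The eigen-structure argument can be made concrete in a few lines of NumPy; the cube below is a random stand-in with the same 100 x 128 x 128 shape, and the 99% variance threshold is an illustrative choice rather than the report's criterion.

        # Keep the few significant principal vectors of the inter-channel correlation matrix.
        import numpy as np

        cube = np.random.rand(100, 128, 128)             # (rows, cols, bands) stand-in
        pixels = cube.reshape(-1, cube.shape[-1])        # one 128-band spectrum per pixel
        pixels = (pixels - pixels.mean(0)) / pixels.std(0)

        corr = np.corrcoef(pixels, rowvar=False)         # 128 x 128 correlation matrix
        evals, evecs = np.linalg.eigh(corr)
        order = np.argsort(evals)[::-1]
        explained = np.cumsum(evals[order]) / evals.sum()
        k = int(np.searchsorted(explained, 0.99) + 1)    # significant principal vectors
        scores = pixels @ evecs[:, order[:k]]            # parsimonious inputs for segmentation
        print(k, "components; score image shape", scores.shape)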

  17. Improved Multivariate Calibration Models for Corn Stover Feedstock and Dilute-Acid Pretreated Corn Stover

    SciTech Connect (OSTI)

    Wolfrum, E. J.; Sluiter, A. D.

    2009-01-01

    We have studied rapid calibration models to predict the composition of a variety of biomass feedstocks by correlating near-infrared (NIR) spectroscopic data to compositional data produced using traditional wet chemical analysis techniques. The rapid calibration models are developed using multivariate statistical analysis of the spectroscopic and wet chemical data. This work discusses the latest versions of the NIR calibration models for corn stover feedstock and dilute-acid pretreated corn stover. Measures of the calibration precision and uncertainty are presented. No statistically significant differences (p = 0.05) are seen between NIR calibration models built using different mathematical pretreatments. Finally, two common algorithms for building NIR calibration models are compared; no statistically significant differences (p = 0.05) are seen for the major constituents glucan, xylan, and lignin, but the algorithms did produce different predictions for total extractives. A single calibration model combining the corn stover feedstock and dilute-acid pretreated corn stover samples gave less satisfactory predictions than the separate models.
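
    A hedged sketch of such a calibration model follows, using partial least squares (a common multivariate choice for NIR calibration, though the abstract does not name the algorithms compared) on synthetic spectra and wet-chemistry values.

        # NIR calibration: correlate spectra with wet-chemistry values via PLS.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        n_samples, n_wavelengths = 120, 700
        spectra = rng.normal(size=(n_samples, n_wavelengths))      # stand-in absorbances
        loadings = rng.normal(size=n_wavelengths)
        glucan = spectra @ loadings * 0.01 + 35 + rng.normal(0, 0.5, n_samples)

        pls = PLSRegression(n_components=8)                        # chosen by CV in practice
        pred = cross_val_predict(pls, spectra, glucan, cv=10).ravel()
        rmsecv = np.sqrt(np.mean((pred - glucan) ** 2))
        print(f"RMSECV: {rmsecv:.2f} (units of the wet-chemistry measurement)")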

  18. High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions

    SciTech Connect (OSTI)

    Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A

    2009-03-23

    We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20{sup 3} x 128 with an anisotropy parameter {zeta} = b{sub s}/b{sub t} = 3.5, a spatial lattice spacing of b{sub s} = 0.1227 {+-} 0.0008 fm, and pion mass of M{sub {pi}} {approx} 390 MeV. Ground-state baryon masses are extracted with fully quantified uncertainties that are at or below the {approx} 0.2%-level in lattice units. The lowest-lying negative-parity states are also extracted albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the N{pi} threshold and, therefore, the isospin-1/2 {pi}N s-wave scattering phase-shift can be extracted using Luescher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time-slices. This is due to backward propagating states arising from the anti-periodic boundary conditions imposed on the quark-propagators in the time-direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single baryon observables.

  19. Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems

    SciTech Connect (OSTI)

    Liou, T.S.

    1999-12-01

    Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subjected to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces, and hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media that are characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to consider specifically the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures by the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of flow simulation showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. Seepage pattern is dominated by the fraction of asperity contacts, and their shape, size, and spatial correlation. However, the correlation structure of the permeability field is less important

  20. Cluster Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Cluster Statistics Ganglia can be used to monitor performance of PDSF nodes... Read More PDSF IO Monitoring This page shows the IO response of the elizas and...

  1. Statistical Association

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Kelly named Fellow of the American Statistical Association August 2, 2016 The American Statistical Association (ASA) has honored Elizabeth Kelly of the Lab's Statistical Sciences group with the title of Fellow. The ASA recognized her for providing exemplary statistical leadership of and collaboration on multidisciplinary teams dealing with environmental restoration, weapon quality, and nuclear materials storage to ensure the safety and security of the Nation. She will receive the Fellow award at

  2. Statistical Analysis of Demographic and Temporal Differences in LANL's 2014 Voluntary Protection Program Survey

    SciTech Connect (OSTI)

    Davis, Adam Christopher; Booth, Steven Richard

    2015-08-20

    Voluntary Protection Program (VPP) surveys were conducted in 2013 and 2014 to assess the degree to which workers at Los Alamos National Laboratory feel that their safety is valued by their management and peers. The goal of this analysis is to determine whether the difference between the VPP survey scores in 2013 and 2014 is significant, and to present the data in a way such that it can help identify either positive changes or potential opportunities for improvement. Data for several questions intended to identify the demographic groups of the respondent are included in both the 2013 and 2014 VPP survey results. These can be used to identify any significant differences among groups of employees as well as to identify any temporal trends in these cohorts.

  3. Improving Thermal Model Prediction Through Statistical Analysis of Irradiation and Post-Irradiation Data from AGR Experiments

    SciTech Connect (OSTI)

    Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson

    2014-05-01

    As part of the High Temperature Reactors (HTR) R&D program, a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. While not possible to obtain by direct measurements in the tests, crucial fuel conditions (e.g., temperature, neutron fast fluence, and burnup) are calculated using core physics and thermal modeling codes. This paper is focused on AGR test fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied for qualification of AGR-1 thermocouple data. Abnormal trends in measured data revealed by the statistical analysis are traced to either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. The main thrust of this work is to exploit the variety of data obtained in irradiation and post-irradiation examination (PIE) for assessment of modeling assumptions. As an example, the uneven reduction of the control gas gap in Capsule 5 found in the capsule metrology measurements in PIE helps identify mechanisms other than TC drift causing the decrease in TC readings. This suggests a more physics-based modification of the thermal model that leads to a better fit with experimental data, thus reducing model uncertainty and increasing confidence in the calculated fuel temperatures of the AGR-1 test.

  4. Improving Thermal Model Prediction Through Statistical Analysis of Irradiation and Post-Irradiation Data from AGR Experiments

    SciTech Connect (OSTI)

    Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson

    2012-10-01

    As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in a model calibration effort to reduce the inherent uncertainty of simulation results. This paper is focused on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends of measured data observed from statistical analysis may be caused by either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5 revealed by the capsule metrology measurements in PIE helps justify the reduction in TC readings instead of TC drift. This in turn prompts a modification of the thermal model to better fit the experimental data, which increases confidence in, in other words reduces the uncertainties of, the thermal simulation results of the AGR-1 test.

  5. A statistical analysis of seeds and other high-contrast exoplanet surveys: massive planets or low-mass brown dwarfs?

    SciTech Connect (OSTI)

    Brandt, Timothy D.; Spiegel, David S.; McElwain, Michael W.; Grady, C. A.; Turner, Edwin L.; Mede, Kyle; Kuzuhara, Masayuki; Schlieder, Joshua E.; Brandner, W.; Feldt, M.; Wisniewski, John P.; Abe, L.; Biller, B.; Carson, J.; Currie, T.; Egner, S.; Golota, T.; Guyon, O.; Goto, M.; Hashimoto, J.; and others

    2014-10-20

    We conduct a statistical analysis of a combined sample of direct imaging data, totalling nearly 250 stars. The stars cover a wide range of ages and spectral types, and include five detections (κ And b, two ~60 M {sub J} brown dwarf companions in the Pleiades, PZ Tel B, and CD-35 2722B). For some analyses we add a currently unpublished set of SEEDS observations, including the detections GJ 504b and GJ 758B. We conduct a uniform, Bayesian analysis of all stellar ages using both membership in a kinematic moving group and activity/rotation age indicators. We then present a new statistical method for computing the likelihood of a substellar distribution function. By performing most of the integrals analytically, we achieve an enormous speedup over brute-force Monte Carlo. We use this method to place upper limits on the maximum semimajor axis of the distribution function derived from radial-velocity planets, finding model-dependent values of ~30-100 AU. Finally, we model the entire substellar sample, from massive brown dwarfs to a theoretically motivated cutoff at ~5 M {sub J}, with a single power-law distribution. We find that p(M, a) ∝ M{sup -0.65 {+-} 0.60} a{sup -0.85 {+-} 0.39} (1σ errors) provides an adequate fit to our data, with 1.0%-3.1% (68% confidence) of stars hosting 5-70 M {sub J} companions between 10 and 100 AU. This suggests that many of the directly imaged exoplanets known, including most (if not all) of the low-mass companions in our sample, formed by fragmentation in a cloud or disk, and represent the low-mass tail of the brown dwarfs.

  6. Independent Statistics & Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    monthly data for smaller electric power plants that are excluded from the monthly filing ... independent power producers, and electric utility combined heat and power plants. ...

  7. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    SciTech Connect (OSTI)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.

  8. Analysis/plot generation code with significance levels computed using Kolmogorov-Smirnov statistics valid for both large and small samples

    SciTech Connect (OSTI)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistics is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of the TERPED code described herein contains algorithms to allow computation of the KS statistic and significance level for data sets of, if the user wishes, as few as three points. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
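
    The small-sample capability is easy to demonstrate with modern tools. The sketch below is not the TERPED/P algorithm itself; it uses SciPy to compute an exact KS significance level for a five-point data set under a normality assumption.

        # One-sample KS test with an exact small-sample significance level.
        import numpy as np
        from scipy import stats

        data = np.array([9.8, 10.3, 9.6, 10.1, 10.4])     # n = 5, illustrative values
        mu, sigma = data.mean(), data.std(ddof=1)
        stat, p = stats.kstest(data, "norm", args=(mu, sigma), method="exact")
        # Note: estimating mu and sigma from the same data strictly calls for a
        # Lilliefors-type correction; this sketch ignores that subtlety.
        print(f"KS statistic {stat:.3f}, exact significance level {p:.3f}")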

  9. Cluster Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Genepool Memory Heatmaps Usage Statistics UGE Scheduler Cycle Time File storage and I/O Data Management Supported Systems FAQ Performance and Optimization Genepool Completed Jobs Genepool Training and Tutorials Websites, databases and cluster services Queues and Scheduling Retired Systems Storage & File Systems Application Performance Data & Analytics Job Logs & Statistics Training & Tutorials Software Policies User Surveys NERSC Users Group Help Staff Blogs Request Repository

  10. A statistical analysis of systematic errors in temperature and ram velocity estimates from satellite-borne retarding potential analyzers

    SciTech Connect (OSTI)

    Klenzing, J. H.; Earle, G. D.; Heelis, R. A.; Coley, W. R. [William B. Hanson Center for Space Sciences, University of Texas at Dallas, 800 W. Campbell Rd. WT15, Richardson, Texas 75080 (United States)

    2009-05-15

    The use of biased grids as energy filters for charged particles is common in satellite-borne instruments such as a planar retarding potential analyzer (RPA). Planar RPAs are currently flown on missions such as the Communications/Navigation Outage Forecast System and the Defense Meteorological Satellites Program to obtain estimates of geophysical parameters including ion velocity and temperature. It has been shown previously that the use of biased grids in such instruments creates a nonuniform potential in the grid plane, which leads to inherent errors in the inferred parameters. A simulation of ion interactions with various configurations of biased grids has been developed using a commercial finite-element analysis software package. Using a statistical approach, the simulation calculates collected flux from Maxwellian ion distributions with three-dimensional drift relative to the instrument. Perturbations in the performance of flight instrumentation relative to expectations from the idealized RPA flux equation are discussed. Both single grid and dual-grid systems are modeled to investigate design considerations. Relative errors in the inferred parameters for each geometry are characterized as functions of ion temperature and drift velocity.

  11. Extracting bb Higgs Decay Signals using Multivariate Techniques

    SciTech Connect (OSTI)

    Smith, W Clarke; /George Washington U. /SLAC

    2012-08-28

    For low-mass Higgs boson production at ATLAS at {radical}s = 7 TeV, the hard subprocess gg {yields} h{sup 0} {yields} b{bar b} dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in b{bar b}-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics, that is, the algorithms cannot identify every jet that results from b{bar b} Higgs decay. We combine these algorithms using the neural net (NN) and boosted regression tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg {yields} h{sup 0} {yields} b{bar b} are generated using Monte Carlo methods with Pythia and then the Toolkit for Multivariate Analysis (TMVA) is used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the m{sub h{sup 0}}-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow for more advanced m{sub h{sup 0}}-reconstructions to be created in the future using multivariate methods.
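
    The gain from combining reconstructions can be illustrated with a small stand-in: scikit-learn's boosted regression trees in place of TMVA, and three synthetic mass estimators with different biases and widths; none of the numbers below come from the study.

        # Combine several noisy mass reconstructions with a boosted regression tree.
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n = 20000
        m_true = rng.normal(120.0, 2.0, n)                 # per-event target, GeV (synthetic)
        recons = np.column_stack([m_true + rng.normal(bias, width, n)
                                  for bias, width in [(-3, 9), (2, 12), (0, 15)]])

        X_tr, X_te, y_tr, y_te = train_test_split(recons, m_true, random_state=0)
        bdt = GradientBoostingRegressor(n_estimators=300, max_depth=3).fit(X_tr, y_tr)
        res = bdt.predict(X_te) - y_te
        print(f"combined width {res.std():.1f} GeV vs "
              f"{(X_te - y_te[:, None]).std(axis=0).round(1)} individually")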

  12. ORISE: Statistical Analyses of Worker Health

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    appropriate methods of statistical analysis to a variety of problems in occupational health and other areas. Our expertise spans a range of capabilities essential for statistical...

  13. Storage Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Storage Trends and Summaries Storage by Scientific Discipline Troubleshooting I/O Resources for Scientific Applications at NERSC Optimizing I/O performance on the Lustre file system I/O Formats Science Databases Sharing Data Transferring Data Unix Groups at NERSC Unix File Permissions Application Performance Data & Analytics Job Logs & Statistics Training & Tutorials Software Policies User Surveys NERSC Users Group Help Staff Blogs Request Repository Mailing List Home » For Users

  14. Statistical Survival Analysis of Fish and Wildlife Tagging Studies; SURPH.1 Manual - Analysis of Release-Recapture Data for Survival Studies, 1994 Technical Manual.

    SciTech Connect (OSTI)

    Smith, Steven G.; Skalski, John R.; Schelechte, J. Warren

    1994-12-01

    Program SURPH is the culmination of several years of research to develop a comprehensive computer program to analyze survival studies of fish and wildlife populations. Development of this software was motivated by the advent of the PIT-tag (Passive Integrated Transponder) technology that permits the detection of salmonid smolt as they pass through hydroelectric facilities on the Snake and Columbia Rivers in the Pacific Northwest. Repeated detections of individually tagged smolt and analysis of their capture-histories permits estimates of downriver survival probabilities. Eventual installation of detection facilities at adult fish ladders will also permit estimation of ocean survival and upstream survival of returning salmon using the statistical methods incorporated in SURPH.1. However, the utility of SURPH.1 far exceeds solely the analysis of salmonid tagging studies. Release-recapture and radiotelemetry studies from a wide range of terrestrial and aquatic species have been analyzed using SURPH.1 to estimate discrete time survival probabilities and investigate survival relationships. The interactive computing environment of SURPH.1 was specifically developed to allow researchers to investigate the relationship between survival and capture processes and environmental, experimental and individual-based covariates. Program SURPH.1 represents a significant advancement in the ability of ecologists to investigate the interplay between morphologic, genetic, environmental and anthropogenic factors on the survival of wild species. It is hoped that this better understanding of risk factors affecting survival will lead to greater appreciation of the intricacies of nature and to improvements in the management of wild resources. This technical report is an introduction to SURPH.1 and provides a user guide for both the UNIX and MS-Windows{reg_sign} applications of the SURPH software.

  15. Statistical Analysis and Modeling of Occupancy Patterns in Open-Plan Offices using Measured Lighting-Switch Data

    SciTech Connect (OSTI)

    Chang, Wen-Kuei; Hong, Tianzhen

    2013-01-01

    Occupancy profile is one of the driving factors behind discrepancies between the measured and simulated energy consumption of buildings. The frequencies of occupants leaving their offices and the corresponding durations of absences have significant impact on energy use and the operational controls of buildings. This study used statistical methods to analyze the occupancy status, based on measured lighting-switch data in five-minute intervals, for a total of 200 open-plan (cubicle) offices. Five typical occupancy patterns were identified based on the average daily 24-hour profiles of the presence of occupants in their cubicles. These statistical patterns were represented by a one-square curve, a one-valley curve, a two-valley curve, a variable curve, and a flat curve. The key parameters that define the occupancy model are the average occupancy profile together with probability distributions of absence duration, and the number of times an occupant is absent from the cubicle. The statistical results also reveal that the number of absence occurrences decreases as total daily presence hours decrease, and the duration of absence from the cubicle decreases as the frequency of absence increases. The developed occupancy model captures the stochastic nature of occupants moving in and out of cubicles, and can be used to generate a more realistic occupancy schedule. This is crucial for improving the evaluation of the energy saving potential of occupancy based technologies and controls using building simulations. Finally, to demonstrate the use of the occupancy model, weekday occupant schedules were generated and discussed.
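
    A hedged sketch of the pattern-identification step follows, with k-means standing in for the paper's statistical procedure and synthetic presence profiles in place of the lighting-switch data.

        # Cluster average daily occupancy profiles into a few typical patterns.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        hours = np.arange(24)

        def profile(start, end, lunch_dip):               # presence probability by hour
            p = ((hours >= start) & (hours < end)).astype(float)
            p[12] -= lunch_dip                            # midday valley
            return np.clip(p + rng.normal(0, 0.05, 24), 0, 1)

        profiles = np.array([profile(rng.integers(7, 10), rng.integers(16, 19),
                                     rng.uniform(0, 0.8)) for _ in range(200)])
        labels = KMeans(n_clusters=5, n_init=10).fit_predict(profiles)
        for k in range(5):
            members = profiles[labels == k]
            print(f"pattern {k}: {len(members)} cubicles, "
                  f"peak presence {members.mean(axis=0).max():.2f}")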

  16. Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy

    SciTech Connect (OSTI)

    Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.

    2013-03-01

    NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.

  17. Expert systems and the CPI product substitution review: A needs analysis for the US Bureau of Labor Statistics

    SciTech Connect (OSTI)

    Arrowood, L.F.; Tonn, B.E.

    1992-02-01

    This report presents recommendations relative to the use of expert systems and machine learning techniques by the Bureau of Labor Statistics (BLS) to substantially automate product substitution decisions associated with the Consumer Price Index (CPI). Thirteen commercially available, PC-based expert system shells have received in-depth evaluations. Various machine learning techniques were also reviewed. Two recommendations are given: (1) BLS should use the expert system shell LEVEL5 OBJECT and establish a software development methodology for expert systems; and (2) BLS should undertake a small study to evaluate the potential of machine learning techniques to create and maintain the approximately 350 ELI-specific knowledge bases to be used in CPI product substitution review.

  18. Intrinsic alignments of galaxies in the MassiveBlack-II simulation: Analysis of two-point statistics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Tenneti, Ananth; Singh, Sukhdeep; Mandelbaum, Rachel; Matteo, Tiziana Di; Feng, Yu; Khandai, Nishikanta

    2015-03-11

    The intrinsic alignment of galaxies with the large-scale density field is an important astrophysical contaminant in upcoming weak lensing surveys. We present detailed measurements of the galaxy intrinsic alignments and associated ellipticity-direction (ED) and projected shape (wg₊) correlation functions for galaxies in the cosmological hydrodynamic MassiveBlack-II (MB-II) simulation. We carefully assess the effects on galaxy shapes, misalignment of the stellar component with the dark matter shape, and two-point statistics of iteratively weighted (by mass and luminosity) definitions of the (reduced and unreduced) inertia tensor. We find that iterative procedures must be adopted for a reliable measurement of the reduced tensor but that luminosity versus mass weighting has only negligible effects. Both ED and wg₊ correlations increase in amplitude with subhalo mass (in the range of 10¹⁰ – 6.0 X 10¹⁴ h⁻¹ M⊙), with a weak redshift dependence (from z = 1 to z = 0.06) at fixed mass. At z ~ 0.3, we predict a wg₊ that is in reasonable agreement with SDSS LRG measurements and that decreases in amplitude by a factor of ~ 5–18 for galaxies in the LSST survey. We also compared the intrinsic alignment of centrals and satellites, with clear detection of satellite radial alignments within the host halos. Finally, we show that wg₊ (using subhalos as tracers of density) and wδ (using dark matter density) predictions from the simulations agree with those of non-linear alignment models (NLA) at scales where the 2-halo term dominates in the correlations (and tabulate associated NLA fitting parameters). The 1-halo term induces a scale-dependent bias at small scales which is not modeled in the NLA model.

  1. Statistical and Spectral Analysis of Wind Characteristics Relevant to Wind Energy Assessment Using Tower Measurements in Complex Terrain

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Belu, Radian; Koracin, Darko

    2013-01-01

    The main objective of the study was to investigate spatial and temporal characteristics of the wind speed and direction in complex terrain that are relevant to wind energy assessment and development, as well as to wind energy system operation, management, and grid integration. Wind data from five tall meteorological towers located in Western Nevada, USA, operated from August 2003 to March 2008, were used in the analysis. The multiannual average wind speeds did not show a significantly increasing trend with elevation, while the turbulence intensity slowly decreased with increasing average wind speed. The wind speed and direction were modeled using the Weibull and the von Mises distribution functions. The correlations show a strong coherence between the wind speed and direction, with slowly decreasing amplitude of the multiday periodicity with increasing lag periods. The spectral analysis shows significant annual periodicity with similar characteristics at all locations. The relatively high correlations between the towers and the small range of the computed turbulence intensity indicate that wind variability is dominated by regional synoptic processes. Knowledge and information about daily, seasonal, and annual wind periodicities are very important for wind energy resource assessment, wind power plant operation, management, and grid integration.
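
    The two fits named above are standard enough to sketch directly with SciPy; the data here are synthetic draws, not the tower measurements.

        # Weibull fit for wind speed, von Mises fit for wind direction.
        import numpy as np
        from scipy import stats

        speed = stats.weibull_min.rvs(2.0, scale=7.0, size=5000, random_state=5)     # m/s
        direction = stats.vonmises.rvs(kappa=2.5, loc=np.radians(225), size=5000,
                                       random_state=5)                               # radians

        shape, _, scale = stats.weibull_min.fit(speed, floc=0)       # shape k, scale c
        kappa, mu, _ = stats.vonmises.fit(direction, fscale=1)       # concentration, mean dir
        print(f"Weibull shape {shape:.2f}, scale {scale:.2f} m/s; "
              f"von Mises mean {np.degrees(mu) % 360:.0f} deg, kappa {kappa:.2f}")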

  2. Computing contingency statistics in parallel.

    SciTech Connect (OSTI)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    2010-09-01

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and {chi}{sup 2} independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics, where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
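
    The additive structure that makes this map-reduce-friendly can be shown in a few lines; Counter stands in for the VTK engine, and the two blocks below mimic data distributed across processors.

        # Contingency tables compose under addition: tabulate per block, then merge.
        from collections import Counter
        from functools import reduce

        def tabulate(xs, ys):
            """Map step: contingency table of (x, y) pairs for one data block."""
            return Counter(zip(xs, ys))

        def merge(t1, t2):
            """Reduce step: cost grows with table size, not with data size."""
            return t1 + t2

        blocks = [(["a", "a", "b"], [0, 1, 1]), (["b", "a", "b"], [1, 1, 0])]
        table = reduce(merge, (tabulate(xs, ys) for xs, ys in blocks))
        n = sum(table.values())
        joint = {cell: count / n for cell, count in table.items()}  # joint probabilities
        print(table, joint)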

  3. Multivariate volume visualization through dynamic projections

    SciTech Connect (OSTI)

    Liu, Shusen; Wang, Bei; Thiagarajan, Jayaraman J.; Bremer, Peer -Timo; Pascucci, Valerio

    2014-11-01

    We propose a multivariate volume visualization framework that tightly couples dynamic projections with a high-dimensional transfer function design for interactive volume visualization. We assume that the complex, high-dimensional data in the attribute space can be well-represented through a collection of low-dimensional linear subspaces, and embed the data points in a variety of 2D views created as projections onto these subspaces. Through dynamic projections, we present animated transitions between different views to help the user navigate and explore the attribute space for effective transfer function design. Our framework not only provides a more intuitive understanding of the attribute space but also allows the design of the transfer function under multiple dynamic views, which is more flexible than being restricted to a single static view of the data. For large volumetric datasets, we maintain interactivity during the transfer function design via intelligent sampling and scalable clustering. As a result, using examples in combustion and climate simulations, we demonstrate how our framework can be used to visualize interesting structures in the volumetric space.

  4. Statistical analysis of the dynamics of secondary electrons in the flare of a high-voltage beam-type discharge

    SciTech Connect (OSTI)

    Demkin, V. P.; Mel'nichuk, S. V.

    2014-09-15

    In the present work, results of investigations into the dynamics of secondary electrons interacting with helium atoms in the presence of the reverse electric field, which arises in the flare of a high-voltage pulsed beam-type discharge and leads to degradation of the primary electron beam, are presented. The electric field in a discharge of this type at moderate pressures can reach several hundred V/cm and leads to considerable changes in the kinetics of the secondary electrons created during propagation of the electron beam generated in the accelerating gap with a grid anode. Moving in the accelerating electric field toward the anode, secondary electrons create the so-called compensating current to the anode. The character of the electron motion and the compensating current itself are determined by the ratio of the field strength to the concentration of atoms (E/n). The energy and angular spectra of secondary electrons are calculated by the Monte Carlo method for different ratios E/n of the electric field strength to the helium atom concentration. The motion of secondary electrons with threshold energy is studied for inelastic collisions with helium atoms, and a differential analysis is carried out of the collisional processes causing electron energy losses in helium for different E/n values. The mechanism of creation and accumulation of slow electrons as a result of inelastic collisions of secondary electrons with helium atoms and selective population of metastable states of helium atoms is considered. It is demonstrated that in a wide range of E/n values the motion of secondary electrons in the beam-type discharge flare has the character of drift. At E/n values characteristic of a discharge of the given type, the drift velocity of these electrons is calculated and compared with the available experimental data.

  5. Parallel auto-correlative statistics with VTK.

    SciTech Connect (OSTI)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.

  6. Deep Data Analysis of Conductive Phenomena on Complex Oxide Interfaces: Physics from Data Mining

    SciTech Connect (OSTI)

    Strelcov, Evgheni; Belianinov, Alex; Hsieh, Ying-Hui; Jesse, Stephen; Baddorf, Arthur P; Chu, Ying Hao; Kalinin, Sergei V

    2014-01-01

    Spatial variability of electronic transport in BiFeO3-CoFe2O4 (BFO-CFO) self-assembled heterostructures is explored using spatially resolved first order reversal curve (FORC) current-voltage (IV) mapping. Multivariate statistical analysis of FORC-IV data classifies statistically significant behaviors and maps characteristic responses spatially. In particular, regions of grain, matrix, and grain boundary responses are clearly identified. K-means and Bayesian demixing analyses suggest that the characteristic response can be separated into four components, with hysteretic-type behavior localized at the BFO-CFO tubular interfaces. The conditions under which Bayesian components allow direct physical interpretation are explored, and transport mechanisms at the grain boundaries and individual phases are analyzed. This approach conjoins multivariate statistical analysis with physics-based interpretation, actualizing a robust, universal, data-driven approach to problem solving, which can be applied to exploration of local transport and other functional phenomena in other spatially inhomogeneous systems.

  7. ARM - Facility Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ARMFacility Statistics 2015 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Historical Statistics Field Campaigns Operational...

  8. Web Analytics and Statistics

    Office of Energy Efficiency and Renewable Energy (EERE)

    EERE uses Google Analytics to capture statistics on its websites. These statistics help website managers measure and report on users, sessions, most visited pages, and more.

  9. Environment/Health/Safety (EHS): Monthly Accident Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Personal Protective Equipment (PPE) Injury Review & Analysis Worker Safety and Health Program: PUB-3851 Monthly Accident Statistics Latest Accident Statistics Accident...

  10. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect (OSTI)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  11. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures for a supplied set of initiating events, termed Extreme Events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes to demonstrate a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  12. Visualization of Multivariate Data --- Inventor: Eliot Feibush | Princeton

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Plasma Physics Lab Visualization of Multivariate Data --- Inventor: Eliot Feibush The invention is a process for displaying multivariate data while eliminating ambiguity. It clarifies and shows more dimensions of data than existing methods such as scatter plots and icon plots where multiple samples can occupy the same location in a drawing. The display technique is not affected by the sequential order of the data. The invented process solves these problems by mapping the sample totals of

  13. AMERICAN STATISTICAL ASSOCIATION

    U.S. Energy Information Administration (EIA) Indexed Site

    ... The t-statistics test significant and those t-statistics are to the actual regression ... on the NEMS, you've got to test this with real live data eventually to see how it's doing. ...

  14. Method for factor analysis of GC/MS data

    DOE Patents [OSTI]

    Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R

    2012-09-11

    The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
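
    A non-authoritative sketch of that pipeline on a synthetic scan-by-mass matrix follows, with scikit-learn's NMF standing in for the patent's PCA, orthogonal factor rotation, and MCR-ALS refinement.

        # Drop saturated masses, then recover factors with a bilinear decomposition.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(2)
        scans, masses, n_factors = 300, 80, 3
        C = np.abs(rng.normal(size=(scans, n_factors)))    # elution profiles
        S = np.abs(rng.normal(size=(n_factors, masses)))   # component mass spectra
        D = C @ S + 0.01 * rng.random((scans, masses))     # linear, additive model

        saturated = D.max(axis=0) > np.quantile(D.max(axis=0), 0.95)
        D_clean = D[:, ~saturated]                         # systematic elimination

        model = NMF(n_components=n_factors, init="nndsvda", max_iter=500)
        conc = model.fit_transform(D_clean)                # recovered elution profiles
        spectra = model.components_                        # recovered spectra
        print("residual:", round(model.reconstruction_err_, 3))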

  15. Forecasting of municipal solid waste quantity in a developing country using multivariate grey models

    SciTech Connect (OSTI)

    Intharathirat, Rotchana; Abdul Salam, P.; Kumar, S.; Untong, Akarapong

    2015-05-15

    Highlights: • A grey model can be used to forecast MSW quantity accurately with limited data. • A prediction interval overcomes the uncertainty of the MSW forecast effectively. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment, and household size play a role in MSW quantity. - Abstract: In order to plan, manage, and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to produce reliable estimates using existing models because of the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over a long-term period by using optimized multivariate grey models, a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified, and quantified based on the statistics and mathematics of grey system theory. Results show that GMC(1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year, from 43,435–44,994 tonnes per day in 2013 to 55,177–56,735 tonnes per day in 2030. This model also illustrates that population density is the most important factor affecting MSW collected, followed by urbanization, proportion of employment, and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. The results can help decision makers to develop measures and policies for waste management over a long-term period.
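
    For orientation, the sketch below implements the univariate GM(1,1) member of the grey-model family; the multivariate GMC(1,5) used in the paper adds a convolution integral and four explanatory series, and the waste figures here are illustrative stand-ins.

        # GM(1,1) grey forecast: fit on the accumulated series, then difference back.
        import numpy as np

        def gm11_forecast(x, steps=3):
            x1 = np.cumsum(x)                               # accumulated generating operation
            z = 0.5 * (x1[1:] + x1[:-1])                    # background values
            B = np.column_stack([-z, np.ones_like(z)])
            a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0] # developing coeff., grey input
            k = np.arange(len(x) + steps)
            x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
            return np.diff(x1_hat, prepend=0.0)             # back to the original series

        waste = np.array([41890., 42217., 42649., 43111., 43435.])  # tonnes/day, illustrative
        print(gm11_forecast(waste, steps=3).round(0)[-3:])  # next three periods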

  16. Multivariate classification of infrared spectra of cell and tissue samples

    DOE Patents [OSTI]

    Haaland, David M.; Jones, Howland D. T.; Thomas, Edward V.

    1997-01-01

    Multivariate classification techniques are applied to spectra from cell and tissue samples irradiated with infrared radiation to determine whether the samples are normal or abnormal (cancerous). Mid- and near-infrared radiation can be used for in vivo and in vitro classifications using at least two different wavelengths.
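
    A minimal sketch of the kind of multivariate classification the patent describes, using linear discriminant analysis on synthetic two-class "spectra"; the wavelength count, class separation, and use of scikit-learn are all assumptions, not the patent's method.

    ```python
    # Two-class LDA on synthetic spectra sampled at a few wavelengths.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    n, wavelengths = 100, 30
    normal = rng.normal(0.0, 1.0, (n, wavelengths))
    abnormal = rng.normal(0.3, 1.0, (n, wavelengths))   # small mean shift per band
    X = np.vstack([normal, abnormal])
    y = np.array([0] * n + [1] * n)

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```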

  17. FY 2005 Statistical Table

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Table of Contents: Summary (p. 1); Mandatory Funding (p. 3); Energy Supply (p. 4); Non-Defense site acceleration

  18. ARM - Historical Visitor Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    As a national user facility, ARM is required to report...

  19. International Energy Statistics - EIA

    Gasoline and Diesel Fuel Update (EIA)

    International > International Energy Statistics International Energy Statistics Petroleum Production | Annual Monthly/Quarterly Consumption | Annual Monthly/Quarterly Capacity | Bunker Fuels | Stocks | Annual Monthly/Quarterly Reserves | Imports | Annual Monthly/Quarterly Exports | CO2 Emissions | Heat Content Natural Gas All Flows | Production | Consumption | Reserves | Imports | Exports | Carbon Dioxide Emissions | Heat Content Coal All Flows | Production | Consumption | Reserves | Imports

  20. Studying Resist Stochastics with the Multivariate Poisson Propagation Model

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Naulleau, Patrick; Anderson, Christopher; Chao, Weilun; Bhattarai, Suchit; Neureuther, Andrew

    2014-01-01

    Progress in the ultimate performance of extreme ultraviolet resists has arguably decelerated in recent years, suggesting an approach to stochastic limits both in photon counts and in material parameters. Here we report on the performance of a variety of leading extreme ultraviolet resists, both with and without chemical amplification. The measured performance is compared to stochastic modeling results using the Multivariate Poisson Propagation Model. The results show that the best materials are indeed nearing modeled performance limits.

  1. Various forms of indexing HDMR for modelling multivariate classification problems

    SciTech Connect (OSTI)

    Aksu, Çağrı; Tunga, M. Alper

    2014-12-10

    The Indexing HDMR method was recently developed for modelling multivariate interpolation problems. The method uses the Plain HDMR philosophy, partitioning the given multivariate data set into less variate data sets and then constructing an analytical structure through these partitioned data sets to represent the given multidimensional problem. Indexing HDMR makes HDMR applicable to classification problems with real-world data. Mostly, we do not know all possible class values in the domain of the given problem; that is, we have a non-orthogonal data structure. However, Plain HDMR needs an orthogonal data structure in the problem to be modelled. The main idea of this work is therefore to offer various forms of Indexing HDMR to successfully model these real-life classification problems. To test the different forms, several well-known multivariate classification problems from the UCI Machine Learning Repository were used, and the observed accuracies lie between 80% and 95%, which is very satisfactory.

  2. Statistical design of a uranium corrosion experiment

    SciTech Connect (OSTI)

    Wendelberger, Joanne R; Moore, Leslie M

    2009-01-01

    This work supports an experiment being conducted by Roland Schulze and Mary Ann Hill to study hydride formation, one of the most important forms of corrosion observed in uranium and uranium alloys. The study goals and objectives are described in Schulze and Hill (2008); the work described here focuses on development of the statistical experiment plan being used for the study. The results of this study will contribute to the development of a uranium hydriding model for use in lifetime prediction models. A parametric study of the effect of hydrogen pressure, gap size and abrasion on hydride initiation and growth is being planned, with results analyzed statistically to determine individual effects as well as multi-variable interactions. Input to ESC from this experiment will include expected hydride nucleation, size, distribution, and volume on various uranium surface situations (geometry) as a function of age. This study will also address the effect of hydrogen threshold pressure on corrosion nucleation and the effect of oxide abrasion/breach on hydriding processes. Statistical experiment plans provide for efficient collection of data that aids in understanding the impact of specific experiment factors on initiation and growth of corrosion. The experiment planning methods used here also allow for robust data collection accommodating other sources of variation, such as the density of inclusions, assumed to vary linearly along the cast rods from which samples are obtained.
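
    As a small illustration of the kind of experiment plan described, the sketch below enumerates a two-level full-factorial design over the three factors named in the abstract; the level labels are invented, and an actual plan would add replication, blocking, and randomization.

    ```python
    # Enumerate a 2^3 full-factorial plan over the three named factors.
    from itertools import product

    factors = {
        "H2_pressure": ["low", "high"],       # illustrative levels
        "gap_size":    ["small", "large"],
        "abrasion":    ["none", "abraded"],
    }
    runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
    for i, run in enumerate(runs, 1):
        print(i, run)                          # 2^3 = 8 runs
    ```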

  3. Scalable k-means statistics with Titan.

    SciTech Connect (OSTI)

    Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre

    2009-11-01

    This report summarizes the existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
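
    Although the report illustrates the engine with C++ snippets, a compact numpy sketch of the k-means iteration being parallelized is given below; the per-cluster sums and counts formed in the update step are exactly what a distributed reduction would combine. The data and cluster count are invented.

    ```python
    # Serial k-means sketch; the assignment step is embarrassingly parallel over rows.
    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            # assignment step: nearest center for each point
            labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
            # update step: per-cluster means (sums/counts are the reduction payload)
            for j in range(k):
                pts = X[labels == j]
                if len(pts):
                    centers[j] = pts.mean(axis=0)
        return centers, labels

    X = np.random.default_rng(1).normal(size=(300, 2))
    centers, labels = kmeans(X, 3)
    print(centers)
    ```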

  4. Statistical methods for environmental pollution monitoring

    SciTech Connect (OSTI)

    Gilbert, R.O.

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
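
    Section 13.2 of the book treats the lognormal-mean confidence interval with Land's method, which requires tabulated H-statistics; as a hedged stand-in, the sketch below uses Cox's common approximation on simulated data.

    ```python
    # Cox's approximate CI for the mean of a lognormal distribution.
    import numpy as np
    from scipy import stats

    x = stats.lognorm(s=0.8, scale=np.exp(1.0)).rvs(size=50, random_state=0)
    y = np.log(x)
    n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)

    point = ybar + s2 / 2                            # estimates ln(E[X])
    se = np.sqrt(s2 / n + s2**2 / (2 * (n - 1)))     # standard error of the estimate
    t = stats.t.ppf(0.975, df=n - 1)
    lo, hi = np.exp(point - t * se), np.exp(point + t * se)
    print(f"95% CI for the lognormal mean: ({lo:.2f}, {hi:.2f})")
    ```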

  5. Candidate Assembly Statistical Evaluation

    Energy Science and Technology Software Center (OSTI)

    1998-07-15

    The Savannah River Site (SRS) receives aluminum-clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels, and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels using Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted, it is feasible to consider building statistical models that will provide reasonable estimates of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  6. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect (OSTI)

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
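
    The idea of transmitted variation can be made concrete with the first-order delta method: input variance reaches the output scaled by the squared slope of the response function. A minimal sketch, with an invented response function, follows.

    ```python
    # Delta-method variance propagation, checked by Monte Carlo.
    import numpy as np

    f = lambda x: np.exp(0.5 * x)          # illustrative response function
    mu, sigma = 2.0, 0.1                   # input mean and standard deviation

    # First-order approximation: Var[f(X)] ~= f'(mu)^2 * Var[X]
    fprime = 0.5 * np.exp(0.5 * mu)
    var_approx = fprime**2 * sigma**2

    # Monte Carlo check of the transmitted variance
    x = np.random.default_rng(0).normal(mu, sigma, 100_000)
    print(var_approx, f(x).var())
    ```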

  7. Statistically significant relational data mining :

    SciTech Connect (OSTI)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  8. Direct injection of natural gas in blast furnaces at high rates: Preliminary statistical analysis of blast furnace carbon balance at Armco-Middletown. Topical report, January 1990-September 1992

    SciTech Connect (OSTI)

    Neels, J.K.; Brown, F.C.

    1992-09-01

    The economic benefits of supplemental fuel injections depend, in part, on the coke replacement ratio. An assessment of the accuracy with which blast furnace coke rate may be measured and a determination of the key drivers of coke rate uncertainty are offered, to provide guidance for experiments in high-rate gas injection. Using statistical analysis tools, an expression for the measurement error associated with the various terms of blast furnace carbon balance is developed. Coke rate calculations based on the material balance are most sensitive to coke carbon content and to proper tracking of hot metal tapping schedule.

  9. FY 2006 Statistical Table

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Columns: FY 2004 Comparable Approp; FY 2005 Comparable Approp; FY 2006 Request to Congress; FY 2006 vs. FY 2005. Discretionary Summary By Appropriation, Energy And Water Development Appropriation Summary, Energy Programs, Energy supply: Operation and maintenance: 787,941; 909,903; 862,499; -47,404 (-5.2%). Construction: 6,956

  10. FY 2013 Statistical Table

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Statistical Table by Appropriation (dollars in thousands - OMB Scoring). Columns: FY 2011 Current Approp.; FY 2012 Enacted Approp.*; FY 2013 Congressional Request; $ and % change. Discretionary Summary By Appropriation, Energy And Water Development, And Related Agencies Appropriation Summary, Energy Programs: Energy efficiency and renewable energy: 1,771,721; 1,809,638; 2,337,000; +527,362 (+29.1%). Electricity delivery and energy reliability:

  11. Statistical Analysis of the Phase 3 Emissions Data Collected in the EPAct/V2/E89 Program: January 7, 2010 - July 6, 2012

    SciTech Connect (OSTI)

    Gunst, R. F.

    2013-05-01

    Phase 3 of the EPAct/V2/E-89 Program investigated the effects of 27 program fuels and 15 program vehicles on exhaust emissions and fuel economy. All vehicles were tested over the California Unified Driving Cycle (LA-92) at 75 degrees F. The program fuels differed in T50, T90, ethanol, Reid vapor pressure, and aromatics. The vehicles tested were new, low-mileage 2008 model year Tier 2 vehicles. A total of 956 test runs were made. Comprehensive statistical modeling and analyses were conducted on methane, carbon dioxide, carbon monoxide, fuel economy, non-methane hydrocarbons, non-methane organic gases, oxides of nitrogen, particulate matter, and total hydrocarbons. In general, the fitted models determined that emissions and fuel economy were complicated functions of the five fuel parameters. An extensive evaluation of alternative model fits produced a number of competing models. Many of these alternatives produce similar estimates of mean emissions for the 27 program fuels but should be carefully evaluated before use with emerging fuels whose combinations of fuel parameters were not included here. The program includes detailed databases on each of the 27 program fuels on each of the 15 vehicles and on each of the vehicles on each of the program fuels.
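
    A hedged sketch of the kind of regression fit described, with an emission modeled as a function of the five fuel parameters plus one interaction term; the data and coefficients are synthetic, and the program's actual models were far more extensive.

    ```python
    # Ordinary least squares with an interaction term, on synthetic fuel-parameter data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    T50, T90, etoh, rvp, arom = (rng.random(n) for _ in range(5))
    y = 1.0 + 0.5 * etoh + 0.3 * arom + 0.2 * etoh * arom + 0.05 * rng.standard_normal(n)

    X = np.column_stack([np.ones(n), T50, T90, etoh, rvp, arom, etoh * arom])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    names = ["const", "T50", "T90", "EtOH", "RVP", "arom", "EtOH:arom"]
    print(dict(zip(names, beta.round(3))))
    ```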

  12. Neutron Total Cross Sections of {sup 235}U From Transmission Measurements in the Energy Range 2 keV to 300 keV and Statistical Model Analysis of the Data

    SciTech Connect (OSTI)

    Derrien, H.; Harvey, J.A.; Larson, N.M.; Leal, L.C.; Wright, R.Q.

    2000-05-01

    The average {sup 235}U neutron total cross sections were obtained in the energy range 2 keV to 330 keV from high-resolution transmission measurements of a 0.033 atom/b sample [1]. The experimental data were corrected for the contribution of isotope impurities and for resonance self-shielding effects in the sample. The results are in very good agreement with the experimental data of Poenitz et al. [4] in the energy range 40 keV to 330 keV and are the only available accurate experimental data in the energy range 2 keV to 40 keV. ENDF/B-VI evaluated data are 1.7% larger. The SAMMY/FITACS code [2] was used for a statistical model analysis of the total cross section and selected fission cross section data in the energy range 2 keV to 200 keV. SAMMY/FITACS is an extended version of SAMMY which allows consistent analysis of the experimental data in the resolved and unresolved resonance regions. The Reich-Moore resonance parameters were obtained [3] from SAMMY Bayesian fits of high-resolution experimental neutron transmission and partial cross section data below 2.25 keV, and the corresponding average parameters and covariance data were used in the present work as input for the statistical model analysis of the high energy range of the experimental data. The result of the analysis shows that the average resonance parameters obtained from the analysis of the unresolved resonance region are consistent with those obtained in the resolved energy region. Another important result is that the ENDF/B-VI capture cross section could be too small by more than 10% in the energy range 10 keV to 200 keV.

  13. Topology for statistical modeling of petascale data.

    SciTech Connect (OSTI)

    Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  14. Computer, Computational, and Statistical Sciences

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    CCS Computer, Computational, and Statistical Sciences Computational physics, computer science, applied mathematics, statistics and the integration of large data streams are central ...

  15. ORISE: Statistical Analyses of Worker Health

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Statistical Analyses Statistical analyses at the Oak Ridge Institute for Science and Education (ORISE) support ongoing programs involving medical surveillance of workers and other populations, as well as occupational epidemiology and research. ORISE emphasizes insightful and accurate analysis, practical interpretation of results and clear, easily read reports. All analyses are preceded by extensive data scrubbing and verification. ORISE's approach relies on applying appropriate methods of

  16. Key World Energy Statistics-2010 | Open Energy Information

    Open Energy Info (EERE)

    Key World Energy Statistics-2010. Agency/Company/Organization: International Energy Agency. Sector: Energy. Topics: Market analysis. Resource Type: Dataset, Maps. Website: www.iea.org...

  17. Experimental Mathematics and Computational Statistics

    SciTech Connect (OSTI)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  18. Characterization of a hybrid target multi-keV x-ray source by a multi-parameter statistical analysis of titanium K-shell emission

    SciTech Connect (OSTI)

    Primout, M.; Babonneau, D.; Jacquet, L.; Gilleron, F.; Peyrusse, O.; Fournier, K. B.; Marrs, R.; May, M. J.; Heeter, R. F.; Wallace, R. J.

    2015-11-10

    We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the electron density (Ne), electron temperature (Te), and ion temperature (Ti) of the target plasma from the spectral analysis (the ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.

  19. Characterization of a hybrid target multi-keV x-ray source by a multi-parameter statistical analysis of titanium K-shell emission

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Primout, M.; Babonneau, D.; Jacquet, L.; Gilleron, F.; Peyrusse, O.; Fournier, K. B.; Marrs, R.; May, M. J.; Heeter, R. F.; Wallace, R. J.

    2015-11-10

    We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the electron density (Ne), electron temperature (Te), and ion temperature (Ti) of the target plasma from the spectral analysis (the ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.

  20. International petroleum statistics report

    SciTech Connect (OSTI)

    1995-10-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.

  1. International petroleum statistics report

    SciTech Connect (OSTI)

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  2. AMERICAN STATISTICAL ASSOCIATION (ASA)

    U.S. Energy Information Administration (EIA) Indexed Site

    ... team to EIA products and services, data analysis, forecast, education, and documentation. ... the use or misuse of review recommendations that should be considered in the project plan? ...

  3. Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators

    Broader source: Energy.gov [DOE]

    Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators presentation at the April 2013 peer review meeting held in Denver, Colorado.

  4. Statistical physics "Beyond equilibrium"

    SciTech Connect (OSTI)

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  5. Exploratory Data analysis ENvironment eXtreme scale (EDENx)

    SciTech Connect (OSTI)

    Steed, Chad Allen

    2015-07-01

    EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.
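
    A minimal sketch of one of the plot types named above, the binned scatterplot: rather than drawing every point, counts per two-dimensional bin are computed and shown as an image, which is what keeps such displays responsive at scale. The data and bin count are illustrative, not EDENx's implementation.

    ```python
    # Binned scatterplot via a 2-D histogram of one million synthetic points.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    x = rng.normal(size=1_000_000)
    y = 0.6 * x + 0.8 * rng.normal(size=x.size)

    counts, xe, ye = np.histogram2d(x, y, bins=128)
    plt.imshow(np.log1p(counts.T), origin="lower",
               extent=[xe[0], xe[-1], ye[0], ye[-1]], aspect="auto")
    plt.xlabel("x"); plt.ylabel("y"); plt.title("binned scatterplot (log counts)")
    plt.show()
    ```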

  6. Statistical methods for environmental pollution monitoring

    SciTech Connect (OSTI)

    Gilbert, R.O.

    1986-01-01

    This volume covers planning, design, and data analysis. It offers statistical methods for designing environmental sampling and monitoring programs as well as for analyzing the resulting data. Statistical sample survey methods for estimating average and total amounts of environmental pollution are presented in detail. The book also provides a broad array of statistical analysis methods for many purposes...numerous examples...three case studies...end-of-chapter questions...computer codes (showing what output looks like along with its interpretation)...a discussion of Kriging methods for estimating pollution concentration contours over space and/or time...nomographs for determining the number of samples required to detect hot spots with specified confidence...and a description and tables for conducting Rosner's test to identify outlying (usually large) pollution measurements in a data set.

  7. statistics | OpenEI Community

    Open Energy Info (EERE)

    Community posts tagged "statistics": OpenEI dashboard, Google Analytics, mediawiki, OpenEI statistics, wiki, OpenEI web...

  8. International petroleum statistics report

    SciTech Connect (OSTI)

    1996-10-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  9. International petroleum statistics report

    SciTech Connect (OSTI)

    1995-11-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  10. International petroleum statistics report

    SciTech Connect (OSTI)

    1997-07-01

    The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.

  11. International petroleum statistics report

    SciTech Connect (OSTI)

    1995-07-27

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  12. International petroleum statistics report

    SciTech Connect (OSTI)

    1996-05-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.

  13. Independent Statistics & Analysis Drilling Productivity Report

    Gasoline and Diesel Fuel Update (EIA)

    with +/- signs and color-coded arrows to highlight the growth or decline in oil (brown) or natural gas (blue). New-well oil/gas production per rig charts present historical...

  14. Big-Data RHEED analysis for understanding epitaxial film growth processes

    SciTech Connect (OSTI)

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high-energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite its widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of LaxCa1-xMnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors, the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, they open the pathway to forward prediction methods that could allow significantly more control over the growth process and hence final film quality.
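
    A hedged sketch of the first analysis step described: PCA over a RHEED-like image sequence, with each frame flattened into one observation vector. The "frames" here are synthetic; the paper's data, preprocessing, and component selection are not reproduced.

    ```python
    # PCA (via SVD) of an image stack: rows are frames, columns are pixels.
    import numpy as np

    rng = np.random.default_rng(0)
    n_frames, h, w = 500, 32, 32
    t = np.linspace(0, 20 * np.pi, n_frames)
    pattern = rng.random((h, w))
    # Synthetic stack: one spatial pattern whose intensity oscillates in time, plus noise.
    frames = np.outer(1 + 0.3 * np.sin(t), pattern.ravel()).reshape(n_frames, h, w)
    frames += 0.05 * rng.standard_normal(frames.shape)

    X = frames.reshape(n_frames, -1)
    X -= X.mean(axis=0)                       # center each pixel across time
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U * s                            # temporal behavior of each component
    eigenimages = Vt.reshape(-1, h, w)        # spatial loading maps
    print("variance captured by PC1:", (s[0]**2 / (s**2).sum()).round(3))
    ```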

  15. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    SciTech Connect (OSTI)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
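
    As a small example of the "comparison of two populations" topic covered in Part 1, the sketch below runs a two-sample t-test on simulated measurements; the data and significance level are invented.

    ```python
    # Two-sample t-test with scipy on simulated process measurements.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    a = rng.normal(10.0, 1.0, 30)   # e.g., measurements from process A
    b = rng.normal(10.6, 1.0, 30)   # e.g., measurements from process B

    t, p = stats.ttest_ind(a, b)
    print(f"t = {t:.2f}, p = {p:.4f}")   # small p suggests the means differ
    ```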

  16. ARM - Historical Field Campaign Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ARM Climate Research Facility users regularly conduct...

  17. Angular-momentum nonclassicality by breaking classical bounds on statistics

    SciTech Connect (OSTI)

    Luis, Alfredo; Rivas, Angel

    2011-10-15

    We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimal, and definite conclusions are obtained without evaluation of moments or other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.

  18. Design and performance of a scalable, parallel statistics toolkit.

    SciTech Connect (OSTI)

    Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre

    2010-11-01

    Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last three years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics more clear. Furthermore, it employs a design pattern specifically targeted for distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question is continuous or quasi-diffuse but do scale well when the data is discrete and compact. We confirm and extend our earlier results by now establishing near-optimal scalability with up to 10,000 processes.
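
    One reason moment-based statistics scale nearly linearly is that per-process partial results (count, mean, sum of squared deviations) can be merged pairwise. The sketch below shows that merge in serial numpy, standing in for the distributed reduction; it is not the toolkit's code, and the chunking merely imitates eight processes.

    ```python
    # Pairwise merging of (count, mean, M2) partials, after Chan et al.'s update formula.
    import numpy as np

    def partial_moments(x):
        return len(x), x.mean(), ((x - x.mean()) ** 2).sum()

    def merge(a, b):
        na, ma, M2a = a
        nb, mb, M2b = b
        n = na + nb
        delta = mb - ma
        mean = ma + delta * nb / n
        M2 = M2a + M2b + delta**2 * na * nb / n
        return n, mean, M2

    rng = np.random.default_rng(0)
    data = rng.normal(size=10_000)
    chunks = np.array_split(data, 8)          # stand-ins for 8 processes
    n, mean, M2 = partial_moments(chunks[0])
    for c in chunks[1:]:
        n, mean, M2 = merge((n, mean, M2), partial_moments(c))
    # Both differences should be ~0: the merged moments match the global ones.
    print(mean - data.mean(), M2 / (n - 1) - data.var(ddof=1))
    ```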

  19. Moore named an American Statistical Association Fellow

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The ASA inducted Leslie (Lisa) Moore as a Fellow at the 2014 Joint Statistical...

  20. Localization of polyhydroxybutyrate in sugarcane using Fourier-transform infrared microspectroscopy and multivariate imaging

    SciTech Connect (OSTI)

    Lupoi, Jason S.; Smith-Moritz, Andreia; Singh, Seema; McQualter, Richard; Scheller, Henrik V.; Simmons, Blake A.; Henry, Robert J.

    2015-07-10

    Background: Slow-degrading, fossil fuel-derived plastics can have deleterious effects on the environment, especially marine ecosystems. The production of bio-based, biodegradable plastics from or in plants can assist in supplanting those manufactured from fossil fuels. Polyhydroxybutyrate (PHB) is one such biodegradable polyester that has been evaluated as a possible candidate for replacing environmentally harmful plastics. Results: PHB, possessing similar properties to polyesters produced from non-renewable sources, has previously been engineered in sugarcane, thereby creating a high-value co-product in addition to the high biomass yield. This manuscript illustrates the coupling of a Fourier-transform infrared microspectrometer, equipped with a focal plane array (FPA) detector, with multivariate imaging to successfully identify and localize PHB aggregates. Principal component analysis imaging facilitated the mining of the abundant spectral data acquired using the FPA for distinct PHB vibrational modes. PHB was measured in the chloroplasts of mesophyll and bundle sheath cells, consistent with previously evaluated plant samples. Conclusion: This study demonstrates the power of IR microspectroscopy to rapidly image plant sections to provide a snapshot of the chemical composition of the cell. While PHB was localized in sugarcane here, the method is readily transferable to other value-added co-products in different plants.

  1. Localization of polyhydroxybutyrate in sugarcane using Fourier-transform infrared microspectroscopy and multivariate imaging

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Lupoi, Jason S.; Smith-Moritz, Andreia; Singh, Seema; McQualter, Richard; Scheller, Henrik V.; Simmons, Blake A.; Henry, Robert J.

    2015-07-10

    Background: Slow-degrading, fossil fuel-derived plastics can have deleterious effects on the environment, especially marine ecosystems. The production of bio-based, biodegradable plastics from or in plants can assist in supplanting those manufactured from fossil fuels. Polyhydroxybutyrate (PHB) is one such biodegradable polyester that has been evaluated as a possible candidate for replacing environmentally harmful plastics. Results: PHB, possessing similar properties to polyesters produced from non-renewable sources, has previously been engineered in sugarcane, thereby creating a high-value co-product in addition to the high biomass yield. This manuscript illustrates the coupling of a Fourier-transform infrared microspectrometer, equipped with a focal plane array (FPA) detector, with multivariate imaging to successfully identify and localize PHB aggregates. Principal component analysis imaging facilitated the mining of the abundant spectral data acquired using the FPA for distinct PHB vibrational modes. PHB was measured in the chloroplasts of mesophyll and bundle sheath cells, consistent with previously evaluated plant samples. Conclusion: This study demonstrates the power of IR microspectroscopy to rapidly image plant sections to provide a snapshot of the chemical composition of the cell. While PHB was localized in sugarcane here, the method is readily transferable to other value-added co-products in different plants.

  2. High Performance Multivariate Visual Data Exploration for Extremely Large Data

    SciTech Connect (OSTI)

    Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,

    2008-08-22

    One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.

  3. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect (OSTI)

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  4. Topological Cacti: Visualizing Contour-based Statistics

    SciTech Connect (OSTI)

    Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio

    2011-05-26

    Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular their nesting behavior and topology, often represented in the form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as width of the cactus trunk and length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.

  5. Key China Energy Statistics 2011

    SciTech Connect (OSTI)

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-01-15

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  6. Key China Energy Statistics 2012

    SciTech Connect (OSTI)

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-05-01

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  7. Exploratory Data analysis ENvironment eXtreme scale (EDENx)

    Energy Science and Technology Software Center (OSTI)

    2015-07-01

    EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.

  8. QUANTUM MECHANICS WITHOUT STATISTICAL POSTULATES

    SciTech Connect (OSTI)

    G. GEIGER; ET AL

    2000-11-01

    The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way, without a reduction postulate. Due to the chaotic motion of the hidden classical particle, all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single-system theory.

  9. Ideas for Effective Communication of Statistical Results

    SciTech Connect (OSTI)

    Anderson-Cook, Christine M.

    2015-03-01

    Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.

  10. ARM - Lesson Plans: Historical Climate Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Lesson Plans: Historical Climate Statistics. Objective: The ...

  11. IEA Energy Statistics | Open Energy Information

    Open Energy Info (EERE)

    Name: IEA Energy Statistics. Agency/Company/Organization: International Energy Agency. Sector: Energy. Topics: GHG...

  12. VTPI-Transportation Statistics | Open Energy Information

    Open Energy Info (EERE)

    Area: Transportation. Resource Type: Dataset. Website: www.vtpi.org/tdm/tdm80.htm. Cost: Free.

  13. STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE...

    Office of Scientific and Technical Information (OSTI)

    Statistical Performance Evaluation of Spring Operated Pressure Relief Valve Reliability Improvements, 2004 to 2014.

  14. Statistical Transmutation in Floquet Driven Optical Lattices...

    Office of Scientific and Technical Information (OSTI)

    Statistical Transmutation in Floquet Driven Optical Lattices. This content will become publicly available on November 3, 2016.

  15. Statistical Transmutation in Floquet Driven Optical Lattices...

    Office of Scientific and Technical Information (OSTI)

    Statistical Transmutation in Floquet Driven Optical Lattices. This content will become publicly available on November 3, 2016.

  16. On the ability of Order Statistics to distinguish different models for continuum gamma decay

    SciTech Connect (OSTI)

    Sandoval, J. J.; Cristancho, F.

    2007-10-26

    A simulation procedure for calculating several parameters important to the application of Order Statistics in the analysis of continuum gamma decay is presented.
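
    A minimal sketch of an order-statistics quantity such a simulation might calibrate: the sampling distribution of the k-th smallest of n uniform draws, checked against the exact Beta(k, n-k+1) law. The gamma-decay specifics are omitted; n, k, and the trial count are invented.

    ```python
    # Simulate the k-th order statistic of n uniform draws and compare with theory.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, k, trials = 10, 3, 100_000
    samples = np.sort(rng.random((trials, n)), axis=1)[:, k - 1]   # k-th order statistic

    print("simulated mean:", samples.mean())           # ~ k/(n+1)
    print("exact mean:    ", stats.beta(k, n - k + 1).mean())
    ```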

  17. Moore honored with American Statistical Association award

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Lisa Moore is the recipient of the 2013 Don Owen Award, presented by the American Statistical Association, San Antonio Chapter (May 24, 2013). The American Statistical Association (ASA) is the world's largest community of statisticians; it was founded in Massachusetts in 1839. Leslie "Lisa" Moore of the Laboratory's Statistical

  18. Office of Survey Development and Statistical Integration

    U.S. Energy Information Administration (EIA) Indexed Site

    Steve Harvey, April 27, 2011, Washington, D.C.: "Tough Choices in U.S. EIA's Data Programs." Agenda: Office of Oil, Gas, and Coal Supply Statistics; Office of Petroleum and Biofuels Statistics; Office of Electricity, Renewables, and Uranium Statistics; Office of Energy Consumption and Efficiency Statistics; Office of Survey Development and Statistical Integration. Coal Data Collection Program: James Kendell, Washington, DC, April 27,

  19. Transportation Statistics Annual Report 1997

    SciTech Connect (OSTI)

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is "Mobility and Access," which complements past TSAR theme sections on "The Economic Performance of Transportation" (1995) and "Transportation and the Environment" (1996). Mobility and access are at the heart of the transportation system's performance from the user's perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation's residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people's access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these

  20. Lectures on probability and statistics

    SciTech Connect (OSTI)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
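
    To make the lecture's distinction concrete, here is a small illustrative sketch (not taken from the notes): the forward problem computes outcome probabilities from a fully specified model of the dice, while the inverse problem estimates the model from observed rolls.

    ```python
    # Illustrative sketch: forward probability vs. inverse (statistical) inference.
    import numpy as np

    rng = np.random.default_rng(0)

    # Forward problem: probability that two fair dice sum to 7.
    # Each of the 36 ordered pairs is equally likely a priori.
    pairs = [(a, b) for a in range(1, 7) for b in range(1, 7)]
    p_seven = sum(1 for a, b in pairs if a + b == 7) / len(pairs)
    print(f"P(sum = 7) = {p_seven:.4f}")  # 6/36 = 0.1667

    # Inverse problem: given observed rolls of a single (possibly loaded) die,
    # the maximum-likelihood estimate of each face probability is its
    # observed frequency.
    true_probs = np.array([0.10, 0.10, 0.10, 0.10, 0.10, 0.50])  # loaded toward 6
    rolls = rng.choice(np.arange(1, 7), size=1000, p=true_probs)
    counts = np.bincount(rolls, minlength=7)[1:]
    p_hat = counts / counts.sum()
    print("ML estimates per face:", np.round(p_hat, 3))
    ```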

  1. Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Improved Geothermometry Through Multivariate Reaction Path Modeling and Evaluation of Geomicrobiological Influences on Geochemical Temperature Indicators Project Officer: Eric Hass Total Project Funding: $999,000 April 24, 2013 Craig Cooper Larry Hull Idaho National Laboratory This presentation does not contain any proprietary confidential, or otherwise restricted information. 2 | US DOE Geothermal Program eere.energy.gov Relevance/Impact of Research Geothermometry enables estimation of

  2. A new subgrid-scale representation of hydrometeor fields using a multivariate PDF

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Griffin, Brian M.; Larson, Vincent E.

    2016-06-03

    The subgrid-scale representation of hydrometeor fields is important for calculating microphysical process rates. In order to represent subgrid-scale variability, the Cloud Layers Unified By Binormals (CLUBB) parameterization uses a multivariate probability density function (PDF). In addition to vertical velocity, temperature, and moisture fields, the PDF includes hydrometeor fields. Previously, hydrometeor fields were assumed to follow a multivariate single lognormal distribution. Now, in order to better represent the distribution of hydrometeors, two new multivariate PDFs are formulated and introduced. The new PDFs represent hydrometeors using either a delta-lognormal or a delta-double-lognormal shape. The two new PDF distributions, plus the previous single lognormal shape, are compared to histograms of data taken from large-eddy simulations (LESs) of a precipitating cumulus case, a drizzling stratocumulus case, and a deep convective case. Finally, the warm microphysical process rates produced by the different hydrometeor PDFs are compared to the same process rates produced by the LES.
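
    A minimal sketch of the "delta-lognormal" shape described above: a point mass at zero (the hydrometeor-free fraction) mixed with a lognormal for within-precipitation values. The parameter names and values below are illustrative assumptions, not CLUBB's.

    ```python
    # Sketch of a delta-lognormal hydrometeor distribution: zero with
    # probability (1 - f_precip), lognormal otherwise.
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_delta_lognormal(n, f_precip, mu, sigma):
        """Draw n samples: zero with probability (1 - f_precip), else lognormal."""
        is_wet = rng.random(n) < f_precip
        values = np.zeros(n)
        values[is_wet] = rng.lognormal(mean=mu, sigma=sigma, size=is_wet.sum())
        return values

    rain = sample_delta_lognormal(100_000, f_precip=0.3, mu=-7.0, sigma=1.2)
    print("dry fraction:", np.mean(rain == 0.0))            # ~0.7
    print("in-cloud mean of nonzero values:", rain[rain > 0].mean())
    print("grid-mean (includes zeros):", rain.mean())
    ```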

  3. Tomography and weak lensing statistics

    SciTech Connect (OSTI)

    Munshi, Dipak; Coles, Peter; Kilbinger, Martin

    2014-04-01

    We provide generic predictions for the lower order cumulants of weak lensing maps, and their correlators for tomographic bins as well as in three dimensions (3D). Using the small-angle approximation, we derive the corresponding one- and two-point probability distribution functions for the tomographic maps from different bins and for 3D convergence maps. The modelling of weak lensing statistics is obtained by adopting a detailed prescription for the underlying density contrast that involves a hierarchical ansatz and a lognormal distribution. We study the dependence of our results on cosmological parameters and on source distributions corresponding to realistic surveys such as LSST and DES. We briefly outline how photometric redshift information can be incorporated in our results. We also show how topological properties of convergence maps can be quantified using our results.

  4. DOE - NNSA/NFO -- FOIA Statistics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The FOIA has become a useful tool for researchers, news media, and the general public. ...

  5. EERE Statistics Archive | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    This page provides EERE Web statistics for all office and corporate websites that opted to use EERE's analytics account. Webtrends statistics for Fiscal Year 2009 (FY09) to FY11 ...

  6. Search for: All records | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    ... conjoins multivariate statistical analysis with physics-based interpretation, ... of local transport and other functional phenomena in other spatially ...

  7. I/O Statistics Last 30 Days

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    These plots show the daily statistics for the last 30 days for the storage systems at NERSC in terms of the amount of data transferred and the number of files transferred. Plots: Daily I/O Volume; Daily I/O Count.

  8. STORM: A STatistical Object Representation Model

    SciTech Connect (OSTI)

    Rafanelli, M.; Shoshani, A.

    1989-11-01

    In this paper we explore the structure and semantic properties of the entities stored in statistical databases. We call such entities "statistical objects" (SOs) and propose a new "statistical object representation model," based on a graph representation. We identify a number of SO representational problems in current models and propose a methodology for their solution. 11 refs.

  9. Quantum chaos and statistical nuclear physics

    SciTech Connect (OSTI)

    Not Available

    1986-01-01

    This book contains 33 selections. Some of the titles are: Chaotic motion and statistical nuclear theory; Test of spectrum and strength fluctuations with proton resonances; Nuclear level densities and level spacing distributions; Spectral statistics of scale invariant systems; and Antiunitary symmetries and energy level statistics.

  10. Statistics and Discoveries at the LHC (1/4)

    ScienceCinema (OSTI)

    None

    2011-10-06

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  11. Statistics and Discoveries at the LHC (4/4)

    ScienceCinema (OSTI)

    None

    2011-10-06

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  12. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema (OSTI)

    None

    2011-10-06

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  13. Statistics and Discoveries at the LHC (3/4)

    ScienceCinema (OSTI)

    None

    2011-10-06

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  14. Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant

    SciTech Connect (OSTI)

    Tsai, Alex; Banta, Larry; Tucker, D.A.; Gemmen, R.S.

    2008-06-01

    This paper presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The facility provides for the testing and simulation of different fuel cell models that in turn help identify the key issues encountered in the transient operation of such systems. An empirical model of the facility consisting of a simulated fuel cell cathode volume and balance of plant components is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in Transfer Function format. A multivariate system is then built from individual transfer functions, creating a matrix that serves as the nominal plant in an H-Infinity robust control algorithm. The controller’s main objective is to track and maintain hybrid operational constraints in the fuel cell’s cathode airflow, and the turbo machinery states of temperature and speed, under transient disturbances. This algorithm is then tested on a Simulink/MatLab platform for various perturbations of load and fuel cell heat effluence.
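
    As a hedged sketch of the modeling step described above, the snippet below assembles individual input/output transfer functions into a multivariable plant matrix and evaluates its frequency response, as one would for Bode plots. The first-order gains and time constants are invented placeholders, not the identified hybrid-plant dynamics.

    ```python
    # Assemble SISO transfer functions into a MIMO plant matrix G(s) and
    # evaluate it on a frequency grid. Dynamics are illustrative stand-ins.
    import numpy as np

    def first_order(K, tau):
        """Return G(s) = K / (tau*s + 1) as a callable of complex s."""
        return lambda s: K / (tau * s + 1.0)

    # Hypothetical 2x2 plant: two bypass-valve commands -> cathode airflow
    # and turbine speed. Gains and time constants are assumed values.
    G = [[first_order(0.8, 2.0), first_order(-0.3, 5.0)],
         [first_order(0.2, 8.0), first_order(1.1, 3.0)]]

    omega = np.logspace(-2, 1, 200)          # rad/s
    s_grid = 1j * omega
    Gjw = np.array([[[g(s) for s in s_grid] for g in row] for row in G])

    # Magnitude (dB) of each channel, as on a Bode magnitude plot.
    mag_db = 20.0 * np.log10(np.abs(Gjw))
    print("peak gain (dB) per channel:\n", np.round(mag_db.max(axis=2), 1))
    print("G(j0.1) =\n", np.round(Gjw[:, :, np.searchsorted(omega, 0.1)], 3))
    ```

    Such a matrix of identified transfer functions would then serve as the nominal plant handed to a robust synthesis routine.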

  15. Bayesian Treed Multivariate Gaussian Process with Adaptive Design: Application to a Carbon Capture Unit

    SciTech Connect (OSTI)

    Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik; Sun, Xin; Lin, Guang

    2014-05-16

    Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. High-resolution simulations are often computationally expensive and become impractical for parametric studies at different input values. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and the prior distributions facilitates the different Markov chain Monte Carlo (MCMC) movements. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
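
    The adaptive-design idea, stripped of the treed multivariate machinery and MCMC, can be sketched as follows: fit a Gaussian process to the simulations run so far and place the next run where the posterior variance is largest. All kernel and noise settings below are assumptions for illustration, not the BTMGP's.

    ```python
    # Minimal sequential design with a plain Gaussian process: sample the
    # input where posterior variance (model uncertainty) is largest.
    import numpy as np

    def rbf(a, b, ell=0.2):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    def gp_posterior(x_train, y_train, x_test, noise=1e-6):
        K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        Ks = rbf(x_train, x_test)
        mean = Ks.T @ np.linalg.solve(K, y_train)
        var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
        return mean, var

    def expensive_sim(x):
        return np.sin(6 * x) + 0.1 * x          # stand-in for a costly simulator

    x_train = np.array([0.1, 0.5, 0.9])
    y_train = expensive_sim(x_train)
    x_cand = np.linspace(0, 1, 201)

    for step in range(5):                        # sequential (adaptive) design
        mean, var = gp_posterior(x_train, y_train, x_cand)
        x_next = x_cand[np.argmax(var)]          # most informative point
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, expensive_sim(x_next))
        print(f"step {step}: sampled x = {x_next:.3f}")
    ```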

  16. On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization

    SciTech Connect (OSTI)

    Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang

    2015-02-01

    The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to the linear model of coregionalization (LMC) cross-covariance. Different strategies have been developed to improve the MCMC mixing and to invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with the existing multiple BTGP and BTMGP in test cases and in a multiphase flow computer experiment in a full-scale regenerator of a carbon capture unit. The BTMGP with LMC cross-covariance predicted the computer experiments better than the existing alternatives. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with LMC cross-covariance function.

  17. Statistical assessment of Monte Carlo distributional tallies

    SciTech Connect (OSTI)

    Kiedrowski, Brian C; Solomon, Clell J

    2010-12-09

    Four tests are developed to assess the statistical reliability of distributional or mesh tallies. To this end, the relative variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality.

  18. Statistics for characterizing data on the periphery

    SciTech Connect (OSTI)

    Theiler, James P; Hush, Donald R

    2010-01-01

    We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
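
    For context, here is a minimal sketch of the ellipsoidal baseline the abstract refers to: score samples by Mahalanobis distance under a sample-covariance model. The paper's periphery-focused alternatives to the sample covariance are not reproduced; the data and target below are synthetic.

    ```python
    # Baseline anomaly detection with an ellipsoidal (covariance) model:
    # score points by squared Mahalanobis distance.
    import numpy as np

    rng = np.random.default_rng(2)

    # Background clutter: correlated Gaussian data in 5 dimensions.
    d, n = 5, 2000
    A = rng.normal(size=(d, d))
    background = rng.normal(size=(n, d)) @ A.T

    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    cov_inv = np.linalg.inv(cov)

    def mahalanobis_sq(x):
        r = x - mu
        return np.einsum("...i,ij,...j->...", r, cov_inv, r)

    # A target-like sample on the periphery scores far above the background.
    target = mu + 6.0 * A @ np.ones(d) / np.sqrt(d)
    print("background 99th pct:", np.percentile(mahalanobis_sq(background), 99))
    print("target score:       ", mahalanobis_sq(target))
    ```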

  19. Statistical methods for nuclear material management

    SciTech Connect (OSTI)

    Bowen, W.M.; Bennett, C.A.

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  20. Moore honored with American Statistical Association award

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Before his death in 1991, Professor Owen was the Distinguished Professor of Statistics at Southern Methodist University in Dallas, Texas. His illustrious career serves as the ...

  1. Structure Learning and Statistical Estimation in Distribution...

    Office of Scientific and Technical Information (OSTI)

    Structure Learning and Statistical Estimation ... Part I of this paper discusses the problem of learning the operational structure of the ...

  2. Statistics for Industry Groups and Industries, 2003

    SciTech Connect (OSTI)

    2009-01-18

    Statistics for the U.S. Department of Commerce including types of manufacturing, employees, and products as outlined in the Annual Survey of Manufacturers (ASM).

  3. STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE...

    Office of Scientific and Technical Information (OSTI)

    STATISTICAL PERFORMANCE EVALUATION OF SPRING OPERATED PRESSURE RELIEF VALVE RELIABILITY IMPROVEMENTS 2004 TO 2014 ...

  4. ANNUAL FEDERAL EQUAL EMPLOYMENT OPPORTUNITY STATISTICAL REPORT...

    National Nuclear Security Administration (NNSA)

    STATISTICAL REPORT OF DISCRIMINATION COMPLAINTS (REPORTING PERIOD BEGINS OCTOBER 1ST AND ... COUNSELOR/INVESTIGATOR F. COMPLAINTS IN LINE E CLOSED DURING REPORT PERIOD a. FULL-TIME b. ...

  5. EERE Web Site Statistics - Social Media

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    EERE Web Site Statistics - Social Media. Custom View: 10/1/10 - 9/30/11. October 1, 2010 ... site compels visitors to return. Updating web site content is one way to draw return ...

  6. Analysis of stream sediment reconnaissance data for mineral resources from the Montrose NTMS Quadrangle, Colorado

    SciTech Connect (OSTI)

    Beyth, M.; Broxton, D.; McInteer, C.; Averett, W.R.; Stablein, N.K.

    1980-06-01

    Multivariate statistical analysis to support the National Uranium Resource Evaluation and to evaluate strategic and other commercially important mineral resources was carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Montrose quadrangle, Colorado. The analysis suggests that: (1) the southern Colorado Mineral Belt is an area favorable for uranium mineral occurrences; (2) carnotite-type occurrences are likely in the nose of the Gunnison Uplift; (3) uranium mineral occurrences may be present along the western and northern margins of the West Elk crater; (4) a base-metal mineralized area is associated with the Uncompahgre Uplift; and (5) uranium and base metals are associated in some areas, and both are often controlled by faults trending west-northwest and north.
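
    A generic sketch of this kind of multivariate screening is shown below: principal component analysis of standardized element concentrations, which surfaces co-varying element associations. The data and the built-in element association are synthetic stand-ins, not HSSR values.

    ```python
    # PCA of standardized element concentrations via SVD; co-varying elements
    # (here a synthetic U-Mo-As association) load together on one component.
    import numpy as np

    rng = np.random.default_rng(3)

    elements = ["U", "Pb", "Zn", "Cu", "Mo", "As"]
    n_samples = 500

    # Synthetic concentrations with a built-in U-Mo-As association.
    factor = rng.normal(size=(n_samples, 1))
    X = rng.normal(size=(n_samples, len(elements)))
    X[:, [0, 4, 5]] += 2.0 * factor        # U, Mo, As co-vary

    # Standardize, then PCA via SVD of the centered data matrix.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U_svd, S, Vt = np.linalg.svd(Z, full_matrices=False)

    explained = S**2 / np.sum(S**2)
    print("variance explained:", np.round(explained, 2))
    print("PC1 loadings:")
    for name, w in zip(elements, Vt[0]):
        print(f"  {name}: {w:+.2f}")       # the association loads together
    ```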

  7. Federal offshore statistics: leasing - exploration - production - revenue

    SciTech Connect (OSTI)

    Essertier, E.P.

    1984-01-01

    Federal Offshore Statistics is a numerical record of what has happened since Congress gave authority to the Secretary of the Interior in 1953 to lease the Federal portion of the Continental Shelf for oil and gas. The publication updates and augments the first Federal Offshore Statistics, published in December 1983. It also extends a statistical series published annually from 1969 until 1981 by the US Geological Survey (USGS) under the title Outer Continental Shelf Statistics. The USGS collected royalties and supervised operation and production of minerals on the Outer Continental Shelf (OCS) until the Minerals Management Service (MMS) took over these functions in 1982. Statistics are presented under the following topics: (1) highlights, (2) leasing, (3) exploration and development, (4) production and revenue, (5) federal offshore production by ranking operator, 1983, (6) reserves and undiscovered recoverable resources, and (7) oil pollution in the world's oceans.

  8. Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant

    SciTech Connect (OSTI)

    Tsai, Alex; Banta, Larry; Tucker, David; Gemmen, Randall

    2010-08-01

    This work presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation, built by the National Energy Technology Laboratory, comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The public facility provides for the testing and simulation of different fuel cell models that in turn help identify the key difficulties encountered in the transient operation of such systems. An empirical model of the facility, comprising a simulated fuel cell cathode volume and balance of plant components, is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in transfer function format. A multivariate system is then built from individual transfer functions, creating a matrix that serves as the nominal plant in an H-infinity robust control algorithm. The controller’s main objective is to track and maintain hybrid operational constraints in the fuel cell’s cathode airflow, and the turbo machinery states of temperature and speed, under transient disturbances. This algorithm is then tested on a Simulink/MatLab platform for various perturbations of load and fuel cell heat effluence. As a complementary tool to the aforementioned empirical plant, a nonlinear analytical model faithful to the existing process and instrumentation arrangement is evaluated and designed in the Simulink environment. This parallel task intends to serve as a building block for scalable hybrid configurations that might require a more detailed nonlinear representation for a wide variety of controller schemes and hardware implementations.

  9. Appendix D: Statistical Methodology of Estimating Petroleum Exports Using Data from U.S. Customs and Border Protection

    U.S. Energy Information Administration (EIA) Indexed Site

    Statistical Methodology of Estimating Petroleum Exports Using Data from U.S. Customs and Border Protection, August 31, 2016. Independent Statistics & Analysis, www.eia.gov, U.S. Department of Energy, Washington, DC 20585. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. ...

  10. Exploring laser-induced breakdown spectroscopy for nuclear materials analysis and in-situ applications

    SciTech Connect (OSTI)

    Martin, Madhavi Z; Allman, Steve L; Brice, Deanne Jane; Martin, Rodger Carl; Andre, Nicolas O

    2012-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been used to determine the limits of detection of strontium (Sr) and cesium (Cs), common nuclear fission products. Additionally, detection limits were determined for cerium (Ce), often used as a surrogate for radioactive plutonium in laboratory studies. Results were obtained using a laboratory instrument with a Nd:YAG laser at the fundamental wavelength of 1064 nm, frequency doubled to 532 nm with energy of 50 mJ/pulse. The data were compared for different concentrations of Sr and Ce dispersed in a CaCO3 (white) and carbon (black) matrix. We have addressed the sampling errors, limits of detection, reproducibility, and accuracy of measurements as they relate to multivariate analysis in pellets that were doped with the different elements at various concentrations. These results demonstrate that the LIBS technique is inherently well suited for in situ analysis of nuclear materials in hot cells. Three key advantages are evident: (1) small samples (mg) can be evaluated; (2) nuclear materials can be analyzed with minimal sample preparation; and (3) samples can be remotely analyzed very rapidly (milliseconds to seconds). Our studies also show that the methods can be made quantitative. Very robust multivariate models have been used to provide quantitative measurement and statistical evaluation of complex materials derived from our previous research on wood and soil samples.

  11. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    SciTech Connect (OSTI)

    Palley, M.A.

    1985-01-01

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical data base. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical data base represents real world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical data base may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic data base, created through legitimate queries of the actual data base, and through proportional random variation of responses to these queries. The synthetic data base is constructed to resemble the actual data base as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic data base, and utilizes the derived model to estimate confidential information in the actual database.
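
    A toy version of the ACM attack, under invented names and numbers: regress the confidential attribute on public attributes in a synthetic table, then estimate a specific individual's confidential value from that individual's public attributes alone.

    ```python
    # Toy Attribute Correlational Modeling: the confidential attribute is the
    # dependent variable in a regression fitted to a synthetic table. All
    # names and figures are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic database standing in for the paper's construction (aggregate
    # queries plus proportional random variation).
    n = 400
    age = rng.uniform(25, 65, n)
    years_service = np.clip(age - 25 - rng.uniform(0, 10, n), 0, None)
    salary = 20_000 + 900 * years_service + 300 * (age - 25) + rng.normal(0, 4000, n)

    # Regression of the confidential attribute on the public attributes.
    X = np.column_stack([np.ones(n), age, years_service])
    beta, *_ = np.linalg.lstsq(X, salary, rcond=None)

    # The attacker estimates one individual's confidential salary from that
    # individual's public attributes (hypothetical record: age 50, 20 years).
    individual = np.array([1.0, 50.0, 20.0])
    print("estimated confidential salary:", round(individual @ beta))
    ```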

  12. Computing contingency statistics in parallel : design trade-offs and limiting cases.

    SciTech Connect (OSTI)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    2010-06-01

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
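
    A serial sketch of the derived statistics named above, computed from a small contingency table; the parallel, map-reduce-style aggregation that is the paper's subject is not shown.

    ```python
    # From a contingency table of counts, derive joint/marginal probabilities,
    # pointwise mutual information, joint entropy, and the chi-squared
    # independence statistic.
    import numpy as np

    rng = np.random.default_rng(5)

    # Two categorical variables observed together (y correlated with x).
    x = rng.integers(0, 3, 10_000)
    y = (x + rng.integers(0, 2, 10_000)) % 3
    table = np.zeros((3, 3))
    np.add.at(table, (x, y), 1)                   # contingency table of counts

    n = table.sum()
    p_xy = table / n                              # joint probability
    p_x = p_xy.sum(axis=1, keepdims=True)         # marginals
    p_y = p_xy.sum(axis=0, keepdims=True)

    pmi = np.where(p_xy > 0, np.log2(p_xy / (p_x * p_y), where=p_xy > 0), 0.0)
    entropy = -np.sum(p_xy[p_xy > 0] * np.log2(p_xy[p_xy > 0]))

    expected = p_x * p_y * n                      # counts expected if independent
    chi2 = np.sum((table - expected) ** 2 / expected)
    print(f"joint entropy = {entropy:.3f} bits, chi-squared = {chi2:.1f}")
    ```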

  13. Computing contingency statistics in parallel : design trade-offs and limiting cases.

    SciTech Connect (OSTI)

    Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre

    2010-03-01

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.

  14. I/O Statistics Last 30 Days

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    These plots show the daily statistics for the last 30 days for the storage systems at NERSC in terms of the amount of data transferred and the number of files...

  15. Statistical criteria for characterizing irradiance time series.

    SciTech Connect (OSTI)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
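
    A hedged example of the comparison workflow: summarize an observed and a simulated irradiance series with a few statistics. The three criteria examined in the report are not reproduced here; the mean, ramp-rate spread, and upper quantile below are generic stand-ins.

    ```python
    # Compare observed vs. simulated irradiance series via summary statistics.
    # The clear-sky envelope and cloud model are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(6)

    t = np.arange(0, 12 * 3600, 60)                      # one day, 1-min steps
    clear_sky = 1000 * np.sin(np.pi * t / t[-1])         # idealized envelope

    observed = clear_sky * np.clip(1 - 0.3 * np.abs(rng.standard_normal(t.size)), 0, 1)
    simulated = clear_sky * np.clip(1 - 0.3 * np.abs(rng.standard_normal(t.size)), 0, 1)

    def summarize(g):
        ramps = np.diff(g)                               # 1-minute step changes
        return {"mean W/m^2": g.mean(),
                "ramp std W/m^2/min": ramps.std(),
                "p95 W/m^2": np.percentile(g, 95)}

    for name, series in [("observed", observed), ("simulated", simulated)]:
        stats = summarize(series)
        print(name, {k: round(v, 1) for k, v in stats.items()})
    ```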

  16. EERE Web Site Engagement Statistics: FY09

    Broader source: Energy.gov (indexed) [DOE]

    Web Site Engagement Statistics, Technology Advancement and Outreach. Table of contents: ... Views; Average Visit Duration; Top 20 Web Sites by Visits; Top 20 Visited Pages; ...

  17. Statistical Fault Detection & Diagnosis Expert System

    Energy Science and Technology Software Center (OSTI)

    1996-12-18

    STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.
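
    The general monitoring pattern can be sketched as follows; this is a common z-score version of the idea, not STATMON's actual pattern recognition test. Learn the statistics of a redundant-sensor residual during training, then flag a statistically significant change.

    ```python
    # Change detection on a redundant-sensor residual: train on normal
    # operation, then monitor windowed statistics against the trained ones.
    import numpy as np

    rng = np.random.default_rng(7)

    # Two redundant sensors measuring the same process variable.
    process = 100 + np.cumsum(rng.normal(0, 0.05, 5000))
    sensor_a = process + rng.normal(0, 0.2, 5000)
    sensor_b = process + rng.normal(0, 0.2, 5000)
    sensor_b[3000:] += 1.5                      # sensor B drifts at t = 3000

    residual = sensor_a - sensor_b
    mu, sigma = residual[:1000].mean(), residual[:1000].std()  # training period

    # Monitor windowed residual means against the trained statistics.
    window = 50
    for start in range(1000, 5000, window):
        z = (residual[start:start + window].mean() - mu) / (sigma / np.sqrt(window))
        if abs(z) > 5.0:
            print(f"fault flagged at sample {start} (z = {z:.1f})")
            break
    ```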

  18. FY 2015 Statistical Table by Appropriation

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Statistical Table by Appropriation (dollars in thousands, OMB scoring), FY 2015 Congressional Request. Columns: FY 2013 Current Approp.; FY 2014 Enacted Approp.; FY 2014 Adjustment; FY 2014 Current Approp.; FY 2015 Congressional Request. Discretionary Summary by Appropriation, Energy and Water Development and Related Agencies, Energy Programs: Energy efficiency and renewable energy: 1,691,757; 1,900,641; ----; 1,900,641; ...

  19. FY 2015 Statistical Table by Organization

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy FY 2015 Statistical Table by Organization (dollars in thousands, OMB scoring). Columns: FY 2013 Current Approp.; FY 2014 Enacted Approp.; FY 2014 Adjustments; FY 2014 Current Approp.; FY 2015 Congressional Request. Discretionary Summary by Organization, National Nuclear Security Administration: Weapons Activities: 6,966,855; 7,781,000; ----; 7,781,000; 8,314,902

  20. Statistical analysis of variations in impurity ion heating at...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... heating of majority ions has been studied [16] and, most recently, the charge and mass dependency of impurity ion heating has been investigated [17]. Despite many laboratory ...

  1. Canadian National Energy Use Database: Statistics and Analysis...

    Open Energy Info (EERE)


  2. Statistical Analysis of Variation in the Human Plasma Proteome

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Corzett, Todd H.; Fodor, Imola K.; Choi, Megan W.; Walsworth, Vicki L.; Turteltaub, Kenneth W.; McCutchen-Maloney, Sandra L.; Chromy, Brett A.

    2010-01-01

    Quantifying the variation in the human plasma proteome is an essential prerequisite for disease-specific biomarker detection. We report here on the longitudinal and individual variation in human plasma characterized by two-dimensional difference gel electrophoresis (2-D DIGE) using plasma samples from eleven healthy subjects collected three times over a two week period. Fixed-effects modeling was used to remove dye and gel variability. Mixed-effects modeling was then used to quantitate the sources of proteomic variation. The subject-to-subject variation represented the largest variance component, while the time-within-subject variation was comparable to the experimental variation found in a previous technical variability study where one human plasma sample was processed eight times in parallel and each was then analyzed by 2-D DIGE in triplicate. Here, 21 protein spots had larger than 50% CV, suggesting that these proteins may not be appropriate as biomarkers and should be carefully scrutinized in future studies. Seventy-eight protein spots showing differential protein levels between different individuals or individual collections were identified by mass spectrometry and further characterized using hierarchical clustering. The results present a first step toward understanding the complexity of longitudinal and individual variation in the human plasma proteome, and provide a baseline for improved biomarker discovery.
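
    The screening step mentioned above (flagging spots whose coefficient of variation exceeds 50%) reduces to a short computation. The array below is synthetic, and the study's fixed- and mixed-effects modeling of dye and gel terms is not reproduced.

    ```python
    # Per-spot percent CV across subjects and repeat collections; flag
    # spots above 50% CV. Data are synthetic stand-ins for 2-D DIGE spots.
    import numpy as np

    rng = np.random.default_rng(8)

    n_spots, n_subjects, n_collections = 200, 11, 3
    subject_effect = rng.normal(0, 0.25, (n_spots, n_subjects, 1))
    noise = rng.normal(0, 0.10, (n_spots, n_subjects, n_collections))
    abundance = np.exp(rng.normal(5, 1, (n_spots, 1, 1)) + subject_effect + noise)

    flat = abundance.reshape(n_spots, -1)               # all measurements per spot
    cv = 100 * flat.std(axis=1) / flat.mean(axis=1)     # percent CV

    high_cv = np.flatnonzero(cv > 50)
    print(f"{high_cv.size} of {n_spots} spots exceed 50% CV")
    ```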

  3. MAVTgsa: An R Package for Gene Set (Enrichment) Analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; Chen, James J.

    2014-01-01

    Gene set analysis methods aim to determine whether an a priori defined set of genes shows statistically significant difference in expression on either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.

  4. Statistics of particle time-temperature histories.

    SciTech Connect (OSTI)

    Hewson, John C.; Lignell, David O.; Sun, Guangyuan

    2014-10-01

    Particles in non-isothermal turbulent flow are subject to a stochastic environment that produces a distribution of particle time-temperature histories. This distribution is a function of the dispersion of the non-isothermal (continuous) gas phase and the distribution of particles relative to that gas phase. In this work we extend the one-dimensional turbulence (ODT) model to predict the joint dispersion of a dispersed particle phase and a continuous phase. The ODT model predicts the turbulent evolution of continuous scalar fields with a model for the cascade of fluctuations to smaller scales (the "triplet map") at a rate that is a function of the fully resolved one-dimensional velocity field. Stochastic triplet maps also drive Lagrangian particle dispersion at finite Stokes numbers, with inertial and eddy trajectory-crossing effects included. Two distinct approaches to this coupling between triplet maps and particle dispersion are developed and implemented, along with a hybrid approach. An "instantaneous" particle displacement model matches the tracer particle limit and provides an accurate description of particle dispersion. A "continuous" particle displacement model translates triplet maps into a continuous velocity field to which particles respond. Particles can alter the turbulence, and modifications to the stochastic rate expression are developed for two-way coupling between particles and the continuous phase. Each aspect of model development is evaluated in canonical flows (homogeneous turbulence, free-shear flows, and wall-bounded flows) for which quality measurements are available. ODT simulations of non-isothermal flows provide statistics for particle heating. These simulations show the significance of accurately predicting the joint statistics of particle and fluid dispersion. Inhomogeneous turbulence coupled with the influence of the mean flow fields on particles of varying properties alters particle dispersion.

  5. Complex statistics and diffusion in nonlinear disordered particle chains

    SciTech Connect (OSTI)

    Antonopoulos, Ch. G.; Bountis, T.; Skokos, Ch.; Drossos, L.

    2014-06-15

    We investigate dynamically and statistically diffusive motion in a Klein-Gordon particle chain in the presence of disorder. In particular, we examine a low energy (subdiffusive) and a higher energy (self-trapping) case and verify that subdiffusive spreading is always observed. We then carry out a statistical analysis of the motion, in both cases, in the sense of the Central Limit Theorem and present evidence of different chaos behaviors, for various groups of particles. Integrating the equations of motion for times as long as 10^9, our probability distribution functions always tend to Gaussians and show that the dynamics does not relax onto a quasi-periodic Kolmogorov-Arnold-Moser torus and that diffusion continues to spread chaotically for arbitrarily long times.
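
    The Central Limit Theorem analysis described above, checking whether rescaled sums of a chaotic observable approach a Gaussian, can be illustrated with a simple chaotic map standing in for the Klein-Gordon chain trajectories.

    ```python
    # CLT check for a chaotic system: rescaled sums of the logistic-map
    # observable (invariant-density mean 1/2) should look Gaussian.
    import numpy as np

    rng = np.random.default_rng(9)

    # m independent chaotic orbits of x -> 4x(1-x), each summed over n steps.
    m, n = 4000, 1000
    x = rng.uniform(0.01, 0.99, m)
    acc = np.zeros(m)
    for _ in range(n):
        acc += x - 0.5                 # center by the invariant-density mean
        x = 4.0 * x * (1.0 - x)
    sums = acc / np.sqrt(n)            # rescaled sums, one per orbit

    # Near-zero skewness and excess kurtosis indicate a Gaussian-like PDF.
    z = (sums - sums.mean()) / sums.std()
    print("skewness:       ", round(float(np.mean(z**3)), 3))
    print("excess kurtosis:", round(float(np.mean(z**4) - 3.0), 3))
    ```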

  6. Statistical physics on the light-front

    SciTech Connect (OSTI)

    Raufeisen, J.

    2005-06-14

    The formulation of statistical physics using light-front quantization, instead of conventional equal-time boundary conditions, has important advantages for describing relativistic statistical systems, such as heavy ion collisions. We develop light-front field theory at finite temperature and density with special attention to Quantum Chromodynamics. We construct the most general form of the statistical operator allowed by the Poincare algebra and introduce the chemical potential in a covariant way. In light-front quantization, the Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and do not lead to doublers in the transverse directions. A seminal property of light-front Green's functions is that they are related to parton densities in coordinate space. Namely, the diagonal and off-diagonal parton distributions measured in hard scattering experiments can be interpreted as light-front density matrices.

  7. n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator

    Energy Science and Technology Software Center (OSTI)

    2012-09-12

    nSIGHTS (n-dimensional Statistical Inverse Graphical Hydraulic Test Simulator) is a comprehensive well test analysis software package. It provides a user-interface, a well test analysis model and many tools to analyze both field and simulated data. The well test analysis model simulates a single-phase, one-dimensional, radial/non-radial flow regime, with a borehole at the center of the modeled flow system. nSIGHTS solves the radially symmetric n-dimensional forward flow problem using a solver based on a graph-theoretic approach. The results of the forward simulation are pressure and flow rate, given all the input parameters. The parameter estimation portion of nSIGHTS uses a perturbation-based approach to interpret the best-fit well and reservoir parameters, given an observed dataset of pressure and flow rate.
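
    A hedged illustration of the parameter-estimation half: fit well and reservoir parameters to observed pressure data by least squares. The Cooper-Jacob drawdown approximation below is a textbook stand-in for nSIGHTS's graph-theoretic radial forward solver, and all parameter values are invented.

    ```python
    # Least-squares recovery of transmissivity T and storativity S from noisy
    # drawdown data, using the Cooper-Jacob approximation as the forward model.
    import numpy as np
    from scipy.optimize import curve_fit

    def drawdown(t, T, S, Q=5e-3, r=10.0):
        """Cooper-Jacob approximation: s = Q/(4*pi*T) * ln(2.25*T*t/(r^2*S))."""
        return Q / (4.0 * np.pi * T) * np.log(2.25 * T * t / (r**2 * S))

    rng = np.random.default_rng(10)
    t_obs = np.logspace(2, 5, 40)                        # observation times, s
    s_obs = drawdown(t_obs, T=1e-4, S=1e-5) + rng.normal(0, 0.01, t_obs.size)

    # Perturbation-based least squares against the noisy "field" data.
    (T_fit, S_fit), _ = curve_fit(drawdown, t_obs, s_obs,
                                  p0=[1e-3, 1e-4],
                                  bounds=([1e-8, 1e-9], [1.0, 1.0]))
    print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e} (true: 1e-4, 1e-5)")
    ```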

  8. Statistical data of the uranium industry

    SciTech Connect (OSTI)

    1983-01-01

    This report is a compendium of information relating to US uranium reserves and potential resources and to exploration, mining, milling, and other activities of the uranium industry through 1982. The statistics are based primarily on data provided voluntarily by the uranium exploration, mining and milling companies. The compendium has been published annually since 1968 and reflects the basic programs of the Grand Junction Area Office of the US Department of Energy. Statistical data obtained from surveys conducted by the Energy Information Administration are included in Section IX. The production, reserves, and drilling data are reported in a manner which avoids disclosure of proprietary information.

  9. Federal offshore statistics: leasing, exploration, production, revenue

    SciTech Connect (OSTI)

    Essertier, E.P.

    1983-01-01

    The statistics in this update of the Outer Continental Shelf Statistics publication document what has happened since federal leasing began on the Outer Continental Shelf (OCS) in 1954. Highlights note that of the 29.8 million acres actually leased from 175.6 million acres offered for leasing, 20.1% were in frontier areas. Total revenues for the 1954-1982 period were $58.9 billion with about 13% received in 1982. The book is divided into six parts covering highlights, leasing, exploration and development, production and revenue, reserves and undiscovered recoverable resources, and pollution problems from well and tanker accidents. 5 figures, 59 tables.

  10. Statistical Fault Detection & Diagnosis Expert System

    Energy Science and Technology Software Center (OSTI)

    1996-12-18

    STATMON is an expert system that performs real-time fault detection and diagnosis of redundant sensors in any industrial process requiring high reliability. After a training period performed during normal operation, the expert system monitors the statistical properties of the incoming signals using a pattern recognition test. If the test determines that statistical properties of the signals have changed, the expert system performs a sequence of logical steps to determine which sensor or machine component has degraded.