Statistical Approaches to Fault Analysis in Multivariate Process Control
De Veaux, Richard D.; Ungar, Lyle H.
After a brief review of some statistical approaches to multivariate process control, we present approaches such as principal components analysis or partial least squares to limit the number of latent variables to study. While using historical…
Characterization of Nuclear Fuel using Multivariate Statistical Analysis
Robel, M; Kristo, M J
2007-11-27T23:59:59.000Z
Various combinations of reactor type and fuel composition have been characterized using principal components analysis (PCA) of the concentrations of 9 U and Pu isotopes in the fuel as a function of burnup. The use of PCA allows the reduction of the 9-dimensional data (isotopic concentrations) into a 3-dimensional approximation, giving a visual representation of the changes in nuclear fuel composition with burnup. Real-world variation in the concentrations of ²³⁴U and ²³⁶U in the fresh (unirradiated) fuel was accounted for. The effects of reprocessing were also simulated. The results suggest that, even after reprocessing, Pu isotopes can be used to determine both the type of reactor and the initial fuel composition with good discrimination. Finally, partial least squares discriminant analysis (PLSDA) was investigated as a substitute for PCA. Our results suggest that PLSDA is a better tool for this application, where separation between known classes is most important.
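The PCA reduction described in this abstract can be sketched in a few lines of linear algebra; the data below are synthetic stand-ins for the isotopic measurements, not the paper's data:

```python
import numpy as np

# Sketch of PCA via SVD: reduce 9-dimensional composition vectors to a
# 3-dimensional approximation, as in the abstract. Data are synthetic.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 9))          # 50 samples x 9 "isotope" channels
Xc = X - X.mean(axis=0)               # mean-center before PCA

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                # coordinates in the 3-D PCA subspace
explained = float((s[:3] ** 2).sum() / (s ** 2).sum())
print(scores.shape, round(explained, 3))
```

The 3-D `scores` are what would be plotted to visualize composition changes with burnup; `explained` reports how much variance the approximation retains.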
An Application of Multivariate Statistical Analysis for Query-Driven Visualization
Gosink, Luke J.; Garth, Christoph; Anderson, John C.; Bethel, E. Wes; Joy, Kenneth I.
2010-03-01T23:59:59.000Z
Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
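The non-parametric distribution-estimation idea can be illustrated on a toy query; the variables (`temperature`, `pressure`), the query threshold, and the Gaussian kernel bandwidth are all invented for illustration:

```python
import numpy as np

# Toy sketch: select records satisfying a query, then build a
# non-parametric (Gaussian kernel) density estimate of one variable
# over the query's solution set.
rng = np.random.default_rng(7)
temperature = rng.normal(300.0, 10.0, size=10_000)
pressure = rng.normal(1.0, 0.1, size=10_000) + 0.01 * (temperature - 300.0)

solution = temperature > 310.0               # an example query
sel = pressure[solution]

x = np.linspace(0.6, 1.6, 200)               # evaluation grid
h = 0.02                                     # kernel bandwidth (assumed)
kde = np.exp(-0.5 * ((x[:, None] - sel[None, :]) / h) ** 2).mean(axis=1)
kde /= h * np.sqrt(2.0 * np.pi)

integral = float(kde.sum() * (x[1] - x[0]))  # should be close to 1
print(round(integral, 3))
```

Comparing `kde` against the density of the full dataset is what lets a user see how a constraint shifts the behavior of the other variables.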
Classical least squares multivariate spectral analysis
Haaland, David M. (Albuquerque, NM)
2002-01-01T23:59:59.000Z
An improved classical least squares (CLS) multivariate spectral analysis method that adds spectral shapes describing non-calibrated components and system effects (other than baseline corrections) present in the analyzed mixture to the prediction phase of the method. These improvements decrease or eliminate many of the restrictions of CLS-type methods and greatly extend their capabilities, accuracy, and precision. One new application of the prediction-augmented CLS (PACLS) method is the ability to accurately predict unknown sample concentrations when new unmodeled spectral components are present in the unknown samples. Other applications of PACLS include the incorporation of spectrometer drift into the quantitative multivariate model and the maintenance of a calibration on a drifting spectrometer. Finally, the ability of PACLS to transfer a multivariate model between spectrometers is demonstrated.
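A minimal sketch of the prediction-phase augmentation, assuming Gaussian-shaped "spectra" and a single unmodeled interferent (none of which are from the patent):

```python
import numpy as np

# Sketch: plain CLS prediction vs. prediction augmented with the spectral
# shape of an unmodeled interferent. All spectra here are synthetic.
wl = np.arange(100.0)
k1 = np.exp(-((wl - 30.0) / 8.0) ** 2)      # calibrated component 1
k2 = np.exp(-((wl - 60.0) / 8.0) ** 2)      # calibrated component 2
k_new = np.exp(-((wl - 45.0) / 8.0) ** 2)   # unmodeled interferent

c_true = np.array([2.0, 1.0])
a = c_true[0] * k1 + c_true[1] * k2 + 0.8 * k_new   # measured mixture

K = np.vstack([k1, k2])                  # plain CLS model
K_aug = np.vstack([k1, k2, k_new])       # augmented with the new shape

c_cls, *_ = np.linalg.lstsq(K.T, a, rcond=None)
c_pacls, *_ = np.linalg.lstsq(K_aug.T, a, rcond=None)

err_cls = float(np.abs(c_cls - c_true).max())
err_pacls = float(np.abs(c_pacls[:2] - c_true).max())
print(err_pacls < err_cls)
```

Without the added shape, the interferent's signal is projected onto the calibrated spectra and biases the concentration estimates; with it, the calibrated concentrations are recovered.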
Multivariate calibration applied to the quantitative analysis of infrared spectra
Haaland, D.M.
1991-01-01T23:59:59.000Z
Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
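A qualitative sketch of one of the factor-analysis calibration methods mentioned here, principal component regression, on synthetic collinear "spectra" rather than the phosphosilicate-glass data:

```python
import numpy as np

# PCR sketch: regress the property of interest on the leading PCA scores
# instead of on all (highly collinear) wavelength channels.
rng = np.random.default_rng(2)
n, p, k = 40, 200, 3
loadings = rng.normal(size=(k, p))
scores_true = rng.normal(size=(n, k))
X = scores_true @ loadings + 0.01 * rng.normal(size=(n, p))  # spectra
y = scores_true @ np.array([1.0, -2.0, 0.5])                 # property

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:k].T                                  # k PC scores per sample
beta, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
y_hat = T @ beta + y.mean()

r = float(np.corrcoef(y, y_hat)[0, 1])
print(round(r, 4))
```

PLS differs in that its factors are chosen to be predictive of `y`, not merely to capture spectral variance, which is why it often needs fewer factors than PCR.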
Hybrid least squares multivariate spectral analysis methods
Haaland, David M. (Albuquerque, NM)
2002-01-01T23:59:59.000Z
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Hybrid least squares multivariate spectral analysis methods
Haaland, David M.
2004-03-23T23:59:59.000Z
A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)
2005-07-26T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)
2005-01-11T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03T23:59:59.000Z
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Apparatus and system for multivariate spectral analysis
Keenan, Michael R. (Albuquerque, NM); Kotula, Paul G. (Albuquerque, NM)
2003-06-24T23:59:59.000Z
An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and Sᵀ, by performing a constrained alternating least-squares analysis of D = CSᵀ, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
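The constrained alternating least-squares factorization D = CSᵀ can be sketched as follows; the weighting/unweighting steps are omitted, and a simple clip-to-zero nonnegativity constraint stands in for the patented procedure:

```python
import numpy as np

# Illustrative clipped ALS: factor D into nonnegative C (concentrations)
# and S (spectral shapes). Data are synthetic and exactly rank 2.
rng = np.random.default_rng(3)
C_true = rng.uniform(0, 1, size=(30, 2))
S_true = rng.uniform(0, 1, size=(80, 2))
D = C_true @ S_true.T

C = rng.uniform(0, 1, size=(30, 2))   # random nonnegative start
for _ in range(200):
    # Solve for S given C, then C given S, clipping negatives to zero.
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

residual = float(np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))
print(round(residual, 6))
```

In practice a proper nonnegative least-squares step (rather than clipping) is more robust, and the rows/columns of D would first be weighted by the noise model as the abstract describes.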
Multivariate Mathematical Morphology applied to Color Image Analysis
Lefèvre, Sébastien
Chapter 10: Multivariate Mathematical Morphology applied to Color Image Analysis. Mathematical morphology is an image analysis framework, currently fully developed for both binary and gray-level images. Its popularity in the image processing community is mainly due to its rigorous mathematical foundation as well as its inherent…
Independent Statistics & Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Efficient Bayesian multivariate fMRI analysis using a sparsifying spatio-temporal prior
Edinburgh, University of
Available online 1 December 2009. Keywords: multivariate analysis; Bayesian inference; expectation propagation; Laplace prior. A multivariate approach to the analysis of neuroimaging data is introduced, based on a sparsifying spatio-temporal prior. It is shown…
Multivariate analysis and prediction of wind turbine response to varying wind field characteristics
Stanford University
Wind fields are often described by time-dependent statistical parameters such as mean wind speed, turbulence intensity, mean wind direction and vertical mean wind profile, which depends on the surface roughness (e.g. land…)
A Multivariate Time Series Method for Monte Carlo Reactor Analysis
Taro Ueki
2008-08-14T23:59:59.000Z
A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed Coarse Mesh Projection Method (CMPM) and can be implemented using the coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit in the signal processing discipline and the neutron multiplication eigenvalue problem in the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three space-dimensional modeling of the initial core of a pressurized water reactor.
DAMOCO: MATLAB toolbox for multivariate data analysis, based on coupled oscillators approach
Potsdam, Universität
This manual describes the collection of MATLAB programs for multivariate data analysis based on modeling; see agnld.uni-potsdam.de/~mros/damoco.html. DAMOCO means Data Analysis with Models Of Coupled Oscillators. This MATLAB toolbox…
Kernel Multivariate Analysis Framework for Supervised… (IEEE Signal Processing Magazine, 2013)
Camps-Valls, Gustavo
A number of methods in the literature are collectively grouped under the field of Multivariate Analysis (MVA). This paper provides a uniform treatment of methods including Canonical Correlation Analysis (CCA) and Orthonormalized PLS (OPLS), as well as their nonlinear extensions derived…
Learning Multivariate Distributions (IEEE Transactions on Pattern Analysis and Machine Intelligence)
Geman, Donald
A framework for learning high-dimensional multivariate probability distributions from estimated marginals. The approach is especially well-adapted to small-sample learning, where the bias-variance trade-off makes it necessary…
Ladd-Lively, Jennifer L [ORNL]
2014-01-01T23:59:59.000Z
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a natural uranium conversion plant (NUCP) with a capacity of 100 metric tons of uranium (MTU) per year, using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base-case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix, where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP; however, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case (normal operating conditions) of the PCA model was established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall, this study was successful at meeting the stated objective.
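The PCA-based fault detection described here can be sketched with a residual (Q) statistic; the six simulated process signals, the two retained components, and the 99% control limit are illustrative assumptions, not values from the report:

```python
import numpy as np

# Sketch of PCA-based monitoring: fit a model of normal operation, set a
# residual (Q-statistic) limit from the training data, and flag a new
# observation that departs from the model.
rng = np.random.default_rng(4)
latent = rng.normal(size=(500, 2))
W = rng.normal(size=(2, 6))
X = latent @ W + 0.1 * rng.normal(size=(500, 6))   # normal operation

mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:2].T                                       # retained PCA loadings

def q_stat(z):
    r = z - P @ (P.T @ z)                          # residual off the model
    return float(r @ r)

q_limit = float(np.quantile([q_stat(z) for z in Z], 0.99))

fault_obs = Z[0] + np.array([0.0, 0.0, 4.0, 0.0, 0.0, 0.0])  # simulated diversion
print(q_stat(Z[0]) < q_limit, q_stat(fault_obs) > q_limit)
```

A shift in one measured stream that breaks the correlation structure learned from normal operation produces a large Q value even when each individual signal stays within its own range.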
Multivariate Statistical Analysis of Water Chemistry in Evaluating the
Statistical Analysis of CCD Data: Error Analysis/Noise Theorem
Peletier, Reynier
Topics: why a statistical approach; systematic errors; random errors (= statistical errors); accuracy and precision; best estimator (mean, median); distributions; statistical CCD data analysis. Why do we need statistical analysis? (= Why do we need to worry…)
A Scaled Difference Chi-square Test Statistic for Moment Structure Analysis
Albert Satorra; Peter Bentler
2011-01-01T23:59:59.000Z
…with small samples: test statistics. Multivariate Behavioral… Kano, Y. (1992). Can test statistics in covariance structure… …with applications in statistics and econometrics. New York: …
Systematic wavelength selection for improved multivariate spectral analysis
Thomas, Edward V. (2828 Georgia NE., Albuquerque, NM 87110); Robinson, Mark R. (1603 Solano NE., Albuquerque, NM 87110); Haaland, David M. (809 Richmond Dr. SE., Albuquerque, NM 87106)
1995-01-01T23:59:59.000Z
Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g., the concentration of an analyte such as glucose in blood, or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
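A toy rendition of genetic wavelength-subset selection; the fitness below is a simple performance-minus-cost trade-off standing in for the patent's F = f(cost, performance), and the count-spectrum, smoothing, and thresholding steps are omitted:

```python
import numpy as np

# Toy GA: a bit-mask chromosome selects wavelengths; fitness rewards
# calibration fit and penalizes the number of wavelengths used (cost).
rng = np.random.default_rng(5)
n, p = 60, 20
X = rng.normal(size=(n, p))
true_wl = [3, 7, 12]                           # informative wavelengths
y = X[:, true_wl] @ np.array([1.0, -1.0, 0.5]) + 0.05 * rng.normal(size=n)

def fitness(mask):
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return -np.inf
    beta, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
    rss = float(np.sum((y - X[:, idx] @ beta) ** 2))
    return -rss - 0.5 * idx.size               # performance minus cost

pop = rng.integers(0, 2, size=(40, p))
for _ in range(60):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-20:]]    # keep the fittest (elitist)
    cut = rng.integers(1, p, size=20)          # one-point crossover
    children = np.array([np.concatenate([parents[i][:c],
                                         parents[(i + 1) % 20][c:]])
                         for i, c in enumerate(cut)])
    mut = rng.random(children.shape) < 0.02    # bit-flip mutation
    pop = np.vstack([parents, np.where(mut, 1 - children, children)])

best = pop[np.argmax([fitness(m) for m in pop])]
print(sorted(np.flatnonzero(best).tolist()))
```

The penalty term plays the role of cost: adding an uninformative wavelength reduces the residual only slightly but always pays the fixed cost, so the search is driven toward compact, predictive subsets.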
Park, Jinyong (University of Arizona, Tucson, AZ); Balasingham, P. (University of Arizona, Tucson, AZ); McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W. (University of Arizona, Tucson, AZ)
2004-09-01T23:59:59.000Z
Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. 
The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater pH value, presence of volcanic rocks, and presence of hydrothermal alteration. Data available for each of these important geologic variables were used to perform directional variogram modeling and kriging to estimate values for each variable at 23,949 centers of the chosen 1 km cell grid system that represents the Sengan region. These values formed complete geologic variable vectors at each of the 23,949 one-km cell centers.
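The kriging step can be sketched for a single grid center; the exponential variogram model, its sill and range, and the scattered observations are assumptions for illustration, not the NUMO data:

```python
import numpy as np

# Ordinary kriging at one grid center: solve the kriging system for
# weights under an assumed exponential variogram model.
rng = np.random.default_rng(8)
pts = rng.uniform(0, 10, size=(12, 2))        # 12 observation locations (km)
vals = np.sin(pts[:, 0]) + 0.1 * rng.normal(size=12)

def gamma(h, sill=1.0, rng_km=5.0):           # exponential variogram model
    return sill * (1.0 - np.exp(-h / rng_km))

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
n = len(pts)
A = np.ones((n + 1, n + 1))
A[:n, :n] = gamma(d)
A[n, n] = 0.0                                  # Lagrange-multiplier entry

target = np.array([5.0, 5.0])                  # a 1 km grid cell center
b = np.append(gamma(np.linalg.norm(pts - target, axis=1)), 1.0)

w = np.linalg.solve(A, b)[:n]                  # kriging weights (sum to 1)
estimate = float(w @ vals)
print(round(float(w.sum()), 6), round(estimate, 3))
```

Repeating this solve over all 23,949 cell centers (with variograms fit per direction and per variable) is what fills in the complete geologic variable vectors.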
Clegg, Samuel M [Los Alamos National Laboratory]; Barefield, James E [Los Alamos National Laboratory]; Wiens, Roger C [Los Alamos National Laboratory]; Sklute, Elizabeth [MT HOLYOKE COLLEGE]; Dyare, Melinda D [MT HOLYOKE COLLEGE]
2008-01-01T23:59:59.000Z
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel; Borek, Theodore Thaddeus, III
2010-09-01T23:59:59.000Z
Thermal decomposition of poly(dimethyl siloxane) compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas including materials analysis, sports medicine (in the detection of designer drugs), and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. It also has demonstrated potential for success in finding and enabling identification of trace compounds.
Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves utilizing a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
Sloshing in the LNG shipping industry: risk modelling through multivariate heavy-tail analysis
In the liquefied natural gas (LNG) shipping industry, the phenomenon of sloshing can lead to the occurrence… The parsimonious representation thus obtained proves to be very convenient for the simulation of multivariate…
Scientific Data Analysis via Statistical Learning
Geddes, Cameron Guy Robinson
Statistical machine learning algorithms have enormous potential to provide… analysis of observations and simulations, and the analysis of hurricanes and tropical storms in climate simulations. Supervised Learning for Supernova… Raquel Romano (romano at hpcrd dot lbl dot gov).
9. Statistical Energy Analysis (SEA)
Berlin, Technische Universität
9.1 Introduction. This chapter gives an introduction to a framework denoted Statistical Energy Analysis (SEA). SEA was developed in the 1960s, to a great extent to clarify and handle…
Reichardt, Thomas A.; Timlin, Jerilyn Ann; Jones, Howland D. T.; Sickafoose, Shane M.; Schmitt, Randal L.
2010-09-01T23:59:59.000Z
Laser-induced fluorescence measurements of cuvette-contained laser dye mixtures are made for evaluation of multivariate analysis techniques to optically thick environments. Nine mixtures of Coumarin 500 and Rhodamine 610 are analyzed, as well as the pure dyes. For each sample, the cuvette is positioned on a two-axis translation stage to allow the interrogation at different spatial locations, allowing the examination of both primary (absorption of the laser light) and secondary (absorption of the fluorescence) inner filter effects. In addition to these expected inner filter effects, we find evidence that a portion of the absorbed fluorescence is re-emitted. A total of 688 spectra are acquired for the evaluation of multivariate analysis approaches to account for nonlinear effects.
Statistical Analysis of EXTREMES in GEOPHYSICS
Gilleland, Eric
Zwiers FW and Kharin VV. 1998. Changes in the extremes of the climate simulated by CCC GCM2 under CO2 doubling. J. Climate 11:2200–2222. http://www.ral.ucar.edu/staff/ericg/readinggroup.html. Outline: some background on Extreme Value Statistics; Extremal Types Theorem; Max…
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R. (Albuquerque, NM)
2007-10-16T23:59:59.000Z
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
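The factored-representation analysis described in this abstract can be sketched in a few lines. This is a minimal illustration, assuming numpy; the synthetic data, image size, and choice of three factors are hypothetical, and truncated SVD stands in for whichever factorization technique (such as Principal Components Analysis) is used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multivariate image: 32x32 pixels, 100 spectral channels,
# flattened to a (pixels x channels) data matrix.
pixels, channels = 32 * 32, 100
spectra = rng.normal(size=(3, channels))          # three underlying components
abundances = rng.random(size=(pixels, 3))
data = abundances @ spectra + 0.01 * rng.normal(size=(pixels, channels))

# Factored representation via truncated SVD (closely related to PCA on
# mean-centered data; the raw factorization is kept here for simplicity).
U, s, Vt = np.linalg.svd(data, full_matrices=False)
k = 3                                             # number of significant factors
scores, loadings = U[:, :k] * s[:k], Vt[:k]       # compressed representation

# Subsequent image analysis operates on `scores` (pixels x k) instead of
# the full (pixels x channels) matrix.
reconstruction = scores @ loadings
rel_error = np.linalg.norm(data - reconstruction) / np.linalg.norm(data)
print(round(rel_error, 3))
```

Because `scores` has only k columns, per-pixel operations scale with k rather than with the number of spectral channels, which is the source of the computational savings the abstract describes.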
Statistical analysis of correlated fossil fuel securities
Li, Derek Z
2011-01-01T23:59:59.000Z
Forecasting the future prices or returns of a security is extraordinarily difficult if not impossible. However, statistical analysis of a basket of highly correlated securities offering a cross-sectional representation of ...
Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis
Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)
2013-11-15T23:59:59.000Z
Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This consequently increases the reliability of e-waste estimates compared to approaches without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.
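The three-pillar linkage (product sales, stock, and lifespan profiles) can be sketched as a small convolution model. This is a hedged illustration with hypothetical sales figures and lifespan probabilities, not the paper's full IOA method:

```python
# Hypothetical sketch of the sales/stock/lifespan linkage in Input-Output
# Analysis: units sold in year t become waste in year t+k with probability
# given by a discretized lifespan distribution.
sales = {2005: 100, 2006: 120, 2007: 140}  # hypothetical units sold per year

# Hypothetical discrete lifespan distribution: probability a unit is
# discarded k years after sale (probabilities sum to 1).
lifespan = {3: 0.2, 4: 0.5, 5: 0.3}

def waste_generated(year):
    """E-waste arising in `year`: past sales weighted by lifespan probability."""
    return sum(units * lifespan.get(year - sold, 0.0)
               for sold, units in sales.items())

def stock(year):
    """Units still in use: cumulative sales minus cumulative waste."""
    total_sold = sum(u for y, u in sales.items() if y <= year)
    total_waste = sum(waste_generated(y) for y in range(min(sales), year + 1))
    return total_sold - total_waste

print(waste_generated(2010))  # 100*0.3 + 120*0.5 + 140*0.2 = 118.0
```

The paper's point about time-varying lifespans corresponds to letting `lifespan` depend on the sales year; the consolidation step then reconciles whichever two of the three variables are observed against the third.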
Stratigraphic statistical curvature analysis techniques
Bengtson, C.A.; Ziagos, J.P.
1987-05-01T23:59:59.000Z
SCAT applies statistical techniques to dipmeter data to identify patterns of bulk curvature, determine transverse and longitudinal structural directions, and reconstruct cross sections and contour maps. STRAT-SCAT applies the same concepts to geometric interpretation of multistoried unimodal, bimodal, or trough-type cross-bedding and also to seismic stratigraphy-scale stratigraphic structures. Structural dip, which comprises the bulk of dipmeter data, is related to beds that (statistically) were deposited with horizontal attitudes; stratigraphic dip is related to beds that were deposited with preferentially oriented nonhorizontal attitudes or to beds that assumed such attitudes because of differential compaction. Stratigraphic dip generates local zones of departure from structural dip on special SCAT plots. The RMS (root-mean-square) of apparent structural dip is greatest in the (structural) T-direction and least in the perpendicular L-direction; the RMS of stratigraphic dip (measured with respect to structural dip) is greatest in the stratigraphic T*-direction and least in the stratigraphic L*-direction. Multistoried cross-bedding appears on T*-plots as local zones of either greater scatter or statistically significant departure of stratigraphic median dip from structural dip. In contrast, the L*-plot (except for trough-type cross-bedding) is insensitive to cross-bedding. Seismic stratigraphy-scale depositional sequences are identified on Mercator dip versus azimuth plots and polar tangent plots as secondary cylindrical-fold patterns imposed on global structural patterns. Progradational sequences generate local cycloid-type patterns on T*-plots, and compactional sequences generate local half-cusp patterns. Both features, however, show only structural dip on L*-plots.
Spatial compression algorithm for the analysis of very large multivariate images
Keenan, Michael R. (Albuquerque, NM)
2008-07-15T23:59:59.000Z
A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
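The wavelet-compression idea in this abstract can be illustrated with a one-level 2D Haar transform: transform the image, keep only the significant coefficients, and work with (or invert) the sparse result. This is a minimal sketch assuming numpy; the smooth test image and the 1% threshold are hypothetical, and the patented algorithm's block operations are not reproduced:

```python
import numpy as np

def haar2d(img):
    """One level of the orthonormal 2D Haar wavelet transform."""
    a = (img[0::2] + img[1::2]) / np.sqrt(2)      # row averages
    d = (img[0::2] - img[1::2]) / np.sqrt(2)      # row details
    rows = np.vstack([a, d])
    a2 = (rows[:, 0::2] + rows[:, 1::2]) / np.sqrt(2)
    d2 = (rows[:, 0::2] - rows[:, 1::2]) / np.sqrt(2)
    return np.hstack([a2, d2])

def inv_haar2d(coef):
    """Invert one level of the 2D Haar transform."""
    n = coef.shape[1] // 2
    a2, d2 = coef[:, :n], coef[:, n:]
    rows = np.empty_like(coef)
    rows[:, 0::2] = (a2 + d2) / np.sqrt(2)
    rows[:, 1::2] = (a2 - d2) / np.sqrt(2)
    m = coef.shape[0] // 2
    img = np.empty_like(coef)
    img[0::2] = (rows[:m] + rows[m:]) / np.sqrt(2)
    img[1::2] = (rows[:m] - rows[m:]) / np.sqrt(2)
    return img

x = np.linspace(0, 1, 64)
image = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))  # smooth test image

coef = haar2d(image)
threshold = 0.01 * np.abs(coef).max()
compressed = np.where(np.abs(coef) > threshold, coef, 0.0)  # keep significant coefficients
kept = np.count_nonzero(compressed) / compressed.size

restored = inv_haar2d(compressed)
err = np.linalg.norm(restored - image) / np.linalg.norm(image)
```

For a smooth image most of the detail coefficients fall below the threshold, so the compressed matrix has far fewer nonzeros while the reconstruction error stays small.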
Statistical Design, Analysis and Graphics for the Guadalupe River Assessment
Technical Memoranda, Science Center (2013)
Statistical Energy Analysis and the second principle of thermodynamics
Paris-Sud XI, Université de
Statistical Energy Analysis and the second principle of thermodynamics. Alain Le Bot. Abstract: Statistical Energy Analysis is a statistical method in vibroacoustics entirely based on the application … discussed. 1 Introduction: Statistical Energy Analysis [1, 2] is born from the application of statistical …
KINETIC ANALYSIS OF HIGH-NITROGEN ENERGETIC MATERIALS USING MULTIVARIATE NONLINEAR REGRESSION
Campbell, M. S. (Mary Stinecipher); Rabie, R. L. (Ronald L.); Diaz-Acosta, I. (Irina); Pulay, P. (Peter)
2001-01-01T23:59:59.000Z
New high-nitrogen energetic materials were synthesized by Hiskey and Naud. J. Opfermann reported a new tool for finding the probable model of complex reactions using multivariate non-linear regression analysis of DSC and TGA data from several measurements run at different heating rates. This study takes the kinetic parameters from the different steps and identifies which reaction step is responsible for the runaway reaction by comparing results predicted from the Frank-Kamenetskii equation with the critical temperature found experimentally using the modified Henkin test.
STATISTICAL ANALYSIS OF PROTEIN FOLDING KINETICS
Dinner, Aaron
STATISTICAL ANALYSIS OF PROTEIN FOLDING KINETICS. Aaron R. Dinner, New Chemistry Laboratory. In: … for Protein Folding: Advances in Chemical Physics, Volume 120, edited by Richard A. Friesner. Experimental and theoretical studies have led to the emergence of a unified general mechanism for protein …
A multivariate analysis of the energy intensity of sprawl versus compact living in the U.S. for 2003
Vermont, University of
A multivariate analysis of the energy intensity of sprawl versus compact living in the U.S. … University of Illinois at Urbana-Champaign, 1101 W Peabody Dr, Urbana, IL 61801, United States; Smart Energy Design … patterns in household energy intensities. We define sprawl in terms of location in rural areas or in areas …
Li, Deyuan
A note on tail dependence. Journal of Multivariate Analysis (journal homepage: www.elsevier.com/locate/jmva). Such a dependence structure is critical for various purposes, including asset pricing and portfolio optimization. For a random variable (Y, Z) with the same marginal distributions, the tail dependence coefficient (TDC) is defined as λ = lim_{u→1} P(Y > u | Z > u).
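The TDC defined in this entry lends itself to a simple empirical estimate at a finite threshold u. A sketch with synthetic data (independent versus comonotonic pairs, both hypothetical), assuming uniform margins:

```python
import random

random.seed(7)

def empirical_tdc(pairs, u):
    """Estimate P(Y > u | Z > u) from (y, z) samples with uniform margins."""
    above_z = [(y, z) for y, z in pairs if z > u]
    if not above_z:
        return 0.0
    return sum(1 for y, z in above_z if y > u) / len(above_z)

n = 100_000
# Independent uniforms: the TDC tends to 0 as u -> 1.
indep = [(random.random(), random.random()) for _ in range(n)]
# Comonotonic pairs (Y = Z): the TDC is 1 at every threshold.
comono = [(v, v) for (v, _) in indep]

print(empirical_tdc(indep, 0.95))   # near 1 - 0.95 = 0.05
print(empirical_tdc(comono, 0.95))  # exactly 1.0
```

In practice the margins are first transformed to uniform via their empirical distribution functions before the conditional exceedance probability is computed.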
Statistical Hot Channel Analysis for the NBSR
Cuadra A.; Baek J.
2014-05-27T23:59:59.000Z
A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
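The construction of a CDF for a margin ratio by propagating hot-channel-factor uncertainties can be sketched generically with Monte Carlo sampling. All factor names, uncertainties, and the nominal ratio below are hypothetical placeholders, not NBSR values, and the Sudo-Kaminaga and Saha-Zuber correlations are not modeled:

```python
import random

random.seed(42)

# Hypothetical multiplicative hot channel factors, each with its own
# relative (1-sigma) uncertainty; the values are illustrative only.
factors = {"power_peaking": 0.04, "flow_distribution": 0.03, "heat_transfer": 0.05}

def sample_margin_ratio(nominal=1.6):
    """One Monte Carlo draw of a CHFR-like margin ratio: the nominal ratio
    divided by the product of randomly perturbed hot channel factors."""
    penalty = 1.0
    for sigma in factors.values():
        penalty *= random.gauss(1.0, sigma)
    return nominal / penalty

draws = sorted(sample_margin_ratio() for _ in range(50_000))

# Empirical CDF: the 5th percentile is the value exceeded with 95% probability.
p5 = draws[int(0.05 * len(draws))]
print(round(p5, 3))
```

Reading the 0.1th percentile instead of the 5th would give the 99.9% probability level quoted in the abstract.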
Coarse-Graining Protein Structures With Local Multivariate Features from Molecular Dynamics
Zhiyong …; Wriggers, Willy
Revised Manuscript Received: September 3, 2008. A multivariate statistical theory, local feature analysis (LFA) … be described by harmonic potential wells. For most systems, the dimension n of this essential space is very …
Statistics for Analysis of Experimental Data Catherine A. Peters
Peters, Catherine A.
Statistics for Analysis of Experimental Data. Catherine A. Peters, Department of Civil Engineering, Princeton University, Princeton, NJ 08544. In: … Processes Laboratory Manual, S. E. Powers, Ed., AEESP, Champaign, IL, 2001. Statistics is a mathematical tool for quantitative analysis of data …
Goutami Chattopadhyay; Surajit Chattopadhyay; Rajni Jain
2009-10-28T23:59:59.000Z
In this paper, the complexities in the relationship between rainfall and sea surface temperature (SST) anomalies during the winter monsoon (November-January) over India were evaluated statistically using scatter plot matrices and autocorrelation functions. Linear as well as polynomial trend equations were obtained, and it was observed that the coefficient of determination for the linear trend was very low and remained low even when a polynomial trend of degree six was used. An exponential regression equation and an artificial neural network with extensive variable selection were generated to forecast the average winter monsoon rainfall of a given year, using the rainfall amounts and the sea surface temperature anomalies in the winter monsoon months of the previous year as predictors. The regression coefficients for the multiple exponential regression equation were generated using the Levenberg-Marquardt algorithm. The artificial neural network was generated in the form of a multilayer perceptron with sigmoid non-linearity and genetic-algorithm-based variable selection. Both predictive models were judged statistically using the Willmott index, percentage error of prediction, and prediction yields. The statistical assessment revealed the potential of the artificial neural network over exponential regression.
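The Willmott index used to judge the models has a standard closed form, sketched here with hypothetical rainfall values:

```python
def willmott_index(observed, predicted):
    """Willmott's index of agreement:
    d = 1 - sum((P - O)^2) / sum((|P - Obar| + |O - Obar|)^2),
    ranging from 0 (no agreement) to 1 (perfect agreement)."""
    obar = sum(observed) / len(observed)
    num = sum((p - o) ** 2 for o, p in zip(observed, predicted))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2
              for o, p in zip(observed, predicted))
    return 1.0 - num / den

# Hypothetical seasonal rainfall totals (mm) and model predictions.
obs = [52.0, 61.0, 47.0, 70.0, 55.0]
pred = [50.0, 64.0, 45.0, 68.0, 57.0]

print(round(willmott_index(obs, pred), 3))  # -> 0.981
print(willmott_index(obs, obs))             # perfect predictions give 1.0
```

Unlike the coefficient of determination, the index penalizes both bias and variance mismatch relative to the observed mean, which is why it is a common skill score for monsoon forecasts.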
Seismic Attribute Analysis Using Higher Order Statistics
Greenidge, Janelle Candice
2009-05-15T23:59:59.000Z
Seismic data processing depends on mathematical and statistical tools such as convolution, cross-correlation, and stacking that employ second-order statistics (SOS). Seismic signals are non-Gaussian and therefore contain information beyond SOS. One...
Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring
Dayman, Kenneth J. [Univ. of Texas at Austin, TX (United States); Coble, Jamie B. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Orton, Christopher R. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
2014-01-01T23:59:59.000Z
The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively. 
The errors seen in this work are artificially low, because the models were trained, optimized, and tested on simulated, noise-free data. However, these results indicate that the developed models may generalize well to new data and that the proposed approach constitutes a viable first step in developing a fuel characterization algorithm based on gamma spectra.
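The classify-then-regress scheme of this paper (a reactor-type classifier selecting a type-specific burnup model) can be sketched as follows. The two-feature data, class centers, and linear burnup models are hypothetical stand-ins for ORIGEN-ARP nuclide activities and the paper's partial least squares models; k-nearest neighbors is one of the classifiers the paper evaluates:

```python
import random

random.seed(3)

def knn_predict(train, x, k=3):
    """Classify x by majority vote among the k nearest training points."""
    dist = sorted(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))
    votes = [label for _, label in dist[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical two-feature training set (e.g., two isotope ratios) with
# reactor-type labels; real work would use simulated nuclide activities.
def make_sample(rtype):
    center = (1.0, 0.0) if rtype == "PWR" else (0.0, 1.0)
    return ([c + random.gauss(0, 0.15) for c in center], rtype)

train = [make_sample("PWR") for _ in range(50)] + [make_sample("BWR") for _ in range(50)]

# The classification step selects which type-specific burnup model to apply.
burnup_models = {  # hypothetical linear models: burnup = a * feature + b
    "PWR": lambda f: 40.0 * f[0] + 5.0,
    "BWR": lambda f: 35.0 * f[1] + 8.0,
}

sample, _ = make_sample("PWR")
rtype = knn_predict(train, sample)
print(rtype, round(burnup_models[rtype](sample), 1))
```

The design choice mirrors the paper's finding: a per-class regression can be tighter than one model trained over all light water reactors, at the cost of an extra (and potentially error-prone) classification step.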
Statistical Analysis of Transient Cycle Test Results in a 40 CFR Part 1065 Engine Dynamometer Test Cell
Office of Energy Efficiency and Renewable Energy (EERE)
Multivariate Analysis from a Statistical Point of View K.S. Cranmer
Fernandez, Thomas
for the results of the search for the Standard Model Higgs boson at LEP [4]. The Neyman-Pearson theory (which we
Statistical analysis of Single Nucleotide Polymorphism microarrays in cancer
Paris-Sud XI, Université de
Statistical analysis of Single Nucleotide Polymorphism microarrays in cancer studies. Pierre Neuvial. … Single Nucleotide Polymorphism (SNP) arrays. We define the copy number states formally, and show how …
An economic and statistical analysis of pecan prices
Hertel, Karlene Sharon
1979-01-01T23:59:59.000Z
AN ECONOMIC AND STATISTICAL ANALYSIS OF PECAN PRICES. A Thesis by KARLENE SHARON HERTEL. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, August 1979. Major Subject: Agricultural Economics. Approved as to style and content by: (Chairman of Committee), (Head of Department), (Member), (Member). August 1979. ABSTRACT...
Statistical analysis of a dynamical multifragmentation path
A. H. Raduta; M. Colonna; V. Baran; M. Di Toro
2006-02-20T23:59:59.000Z
A microcanonical multifragmentation model (MMM) is used for investigating whether equilibration really occurs in the dynamical evolution of two heavy ion collisions simulated via a stochastic mean field approach (SMF). The standard deviation function between the dynamically obtained freeze-out fragment distributions corresponding to the reaction $^{129}$Xe+$^{119}$Sn at 32 MeV/u and the MMM ones corresponding to a wide range of mass, excitation energy, freeze-out volume and nuclear level density cut-off parameter shows a unique minimum. A distinct statistically equilibrated stage is identified in the dynamical evolution of the system.
Special problems in statistical critical pert analysis
Robieux, Christian Claude
1978-01-01T23:59:59.000Z
the mathematical problem. Section 1.3.5 presents different kinds of remainder term and explains the consequences of these results. But before that we need to sum up the structure of the network by a certain kind of statistical relationship between the paths. The definition of Σ is Σ = E[(T − E(T))(T − E(T))′]. Since T = PX, we have Σ = P E[(X − E(X))(X − E(X))′] P′. The independence of the xᵢ's implies Σ = P diag(σ₁², …, σ_b²) P′ (1.1.2), where σᵢ² is the variance of the activity time xᵢ. As diag...
Shapiro, Alex
Towards a Unified Theory of Inequality Constrained Testing in Multivariate Analysis. A. Shapiro, Statistical Institute. Introduction: Statistical inference for equality constrained problems in multivariate analysis is well established … properties of linear spaces. In particular it is meaningful to consider an orthogonal …
Multivariate analysis of spatial patterns: a unified approach to local and global structures
Thioulouse, Jean
(e.g., principal component analysis, correspondence analysis) can then be used to detect total, local and global structures … to write a total variance decomposition into local and global components, and to propose a unified view … and the global one (with the same point of view as in Legendre, 1993), the total variability being decomposed …
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01T23:59:59.000Z
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
Statistical Error analysis of Nucleon-Nucleon phenomenological potentials
R. Navarro Perez; J. E. Amaro; E. Ruiz Arriola
2014-06-10T23:59:59.000Z
Nucleon-Nucleon potentials are commonplace in nuclear physics and are determined from a finite number of experimental data with limited precision sampling the scattering process. We study the statistical assumptions implicit in the standard least squares fitting procedure and apply, along with more conventional tests, a tail sensitive quantile-quantile test as a simple and confident tool to verify the normality of residuals. We show that the fulfilment of normality tests is linked to a judicious and consistent selection of a nucleon-nucleon database. These considerations prove crucial to a proper statistical error analysis and uncertainty propagation. We illustrate these issues by analyzing about 8000 proton-proton and neutron-proton scattering published data. This enables the construction of potentials meeting all statistical requirements necessary for statistical uncertainty estimates in nuclear structure calculations.
Earnings forecast bias - a statistical analysis
Paris-Sud XI, Université de
Earnings forecast bias - a statistical analysis. François Dossou, Sandrine Lardic, Karine Michalon. … earnings forecasts is an important aspect of research for different reasons: many empirical studies employ analysts' consensus forecasts as a proxy for the market's expectations of future earnings in order …
Statistical Analysis of Protein Folding Kinetics Aaron R. Dinner
Dinner, Aaron
Statistical Analysis of Protein Folding Kinetics. Aaron R. Dinner, Sung-Sau So, and Martin … Experimental and theoretical studies over several years have led to the emergence of a unified general mechanism for protein folding that serves as a framework for the design and interpretation of research in this area [1 …
Advanced Analysis Qualifying Examination Department of Mathematics and Statistics
Massachusetts at Amherst, University of
Advanced Analysis Qualifying Examination, Department of Mathematics and Statistics, University of … Let F be a continuous increasing invertible function. Let µF and µF… be the Lebesgue-Stieltjes measures associated to F … indicator function or characteristic function of A. 2. If a measure is not specified, use Lebesgue measure on R.
Statistical Analysis of Environment Canada's Wind Speed Data
Taylor, James H.
Statistical Analysis of Environment Canada's Wind Speed Data. Someshwar Singh, Department … New Brunswick-Fredericton, New Brunswick, Canada. Email: jtaylor@unb.ca. Abstract: Wind energy utilities use wind … This paper reports on a study of the histories of wind speed forecasts and actual wind speed data available …
Statistical mechanical analysis of the dynamics of learning in perceptrons
Coolen, ACC "Ton"
Statistical mechanical analysis of the dynamics of learning in perceptrons. C. W. H. Mace and A. C. … Contents include: … with constant learning rate; 2.5 Theory versus simulations; 3. On-line learning: complete training sets … to analyse the dynamics of various classes of supervised learning rules in perceptrons. The character …
Statistical Analysis of X-ray Speckle at the NSLS
Ophelia K. C. Tsui; S. G. J. Mochrie; L. E. Berman
1997-09-30T23:59:59.000Z
We report a statistical analysis of the static speckle produced by illuminating a disordered aerogel sample by a nominally coherent x-ray beam at wiggler beamline X25 at the National Synchrotron Light Source. The results of the analysis allow us to determine that the coherence delivered to the X25 hutch is within 35% of what is expected. The rate of coherent photons is approximately two times smaller than expected on the basis of the X25 wiggler source brilliance.
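A standard summary statistic in such speckle analyses is the speckle contrast C = σ_I / ⟨I⟩. A sketch with synthetic intensities, using the textbook negative-exponential model for fully developed speckle (an assumption for illustration, not this paper's data):

```python
import random

random.seed(5)

def speckle_contrast(intensities):
    """Speckle contrast C = sigma_I / <I>; C = 1 for fully coherent
    illumination of a fully developed speckle pattern, lower for
    partial coherence."""
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((i - mean) ** 2 for i in intensities) / n
    return var ** 0.5 / mean

# Fully developed speckle: intensity is negative-exponentially distributed.
coherent = [random.expovariate(1.0) for _ in range(20_000)]

# Partial coherence modeled crudely as the average of M independent
# speckle patterns, which reduces the contrast to roughly 1/sqrt(M).
M = 4
partial = [sum(random.expovariate(1.0) for _ in range(M)) / M for _ in range(20_000)]

print(round(speckle_contrast(coherent), 2))  # near 1.0
print(round(speckle_contrast(partial), 2))   # near 0.5
```

Comparing a measured contrast against the fully coherent limit is one way to quantify the delivered coherence, in the spirit of the 35% figure quoted above.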
Multivariate Analysis of Spectral Measurements for the Characterization of Semiconductor Processes
Boning, Duane S.
… simulation software, and verifying this model against experimentally measured thermal behavior in CMP. … a comprehensive analysis and comparison of two innovative sensors, optical reflectance and IR thermography … is shown to be useful for characterizing thermal behavior and energy flow in the CMP process. This thesis …
Feature-Based Statistical Analysis of Combustion Simulation Data
Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T
2011-11-18T23:59:59.000Z
We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
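A drastically simplified stand-in for the feature-extraction step described above, thresholding a scalar field into connected components and computing per-feature statistics, can be sketched as follows (merge trees, which encode all thresholds at once, are not reproduced, and the field values are hypothetical):

```python
from collections import deque

# Hypothetical 2D scalar field (e.g., temperature) on a small grid.
field = [
    [0.1, 0.2, 0.9, 0.8, 0.1],
    [0.1, 0.1, 0.9, 0.1, 0.1],
    [0.7, 0.1, 0.1, 0.1, 0.6],
    [0.8, 0.7, 0.1, 0.6, 0.7],
]

def features(field, threshold):
    """Segment connected components of cells above `threshold`
    (4-connectivity) and return per-feature statistics (size, mean)."""
    rows, cols = len(field), len(field[0])
    seen, out = set(), []
    for r0 in range(rows):
        for c0 in range(cols):
            if (r0, c0) in seen or field[r0][c0] <= threshold:
                continue
            queue, cells = deque([(r0, c0)]), []
            seen.add((r0, c0))
            while queue:                       # breadth-first flood fill
                r, c = queue.popleft()
                cells.append(field[r][c])
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and (nr, nc) not in seen and field[nr][nc] > threshold):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            out.append({"size": len(cells), "mean": sum(cells) / len(cells)})
    return out

print(features(field, 0.5))
```

Where this sketch must rescan the field for every threshold, the merge-tree meta-data of the paper answers the same per-feature queries for arbitrary thresholds without touching the original simulation data.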
Statistical fractal analysis of 25 young star clusters
Gregorio-Hetem, J; Santos-Silva, T; Fernandes, B
2015-01-01T23:59:59.000Z
A large sample of young stellar groups is analysed aiming to investigate their clustering properties and dynamical evolution. A comparison of the Q statistical parameter, measured for the clusters, with the fractal dimension estimated for the projected clouds shows that 52% of the sample has substructures and tends to follow the theoretically expected relation between clusters and clouds, according to calculations for artificial distribution of points. The fractal statistics was also compared to structural parameters revealing that clusters having radial density profile show a trend of parameter s increasing with mean surface stellar density. The core radius of the sample, as a function of age, follows a distribution similar to that observed in stellar groups of Milky Way and other galaxies. They also have dynamical age, indicated by their crossing time that is similar to unbound associations. The statistical analysis allowed us to separate the sample into two groups showing different clustering characteristi...
HistFitter software framework for statistical data analysis
M. Baak; G. J. Besjes; D. Cote; A. Koutsman; J. Lorenz; D. Short
2014-10-06T23:59:59.000Z
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple data models at once, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication-quality style through a simple command-line interface.
Dyman, T.S.
1987-05-01T23:59:59.000Z
Desmoinesian sandstones from the northeast Oklahoma platform and from the Anadarko and McAlester basins record a complex interaction between mid-Pennsylvanian source-area tectonism and cyclic sedimentation patterns associated with transgressions and regressions. Framework grain summaries for 67 thin sections from sandstones of the Cherokee Group (Bartlesville, Red Fork, Skinner, and Prue) were subjected to multivariate statistical analysis to establish regional compositional trends for provenance analysis. R-mode cluster and correspondence analyses were used to determine the contributing effect (total variance) of key framework grains. Fragments of monocrystalline and polycrystalline quartz, chert, metamorphic rock, and limestone contribute most to the variation in the grain population. Q-mode cluster and correspondence analyses were used to identify three distinct petrofacies. Petrofacies I is rich in monocrystalline quartz (86 to 98%) and contains rare mica and rock fragments. Petrofacies II is also rich in monocrystalline quartz (66 to 86%) and contains as much as 15% metamorphic and sedimentary rock fragments. Petrofacies III is compositionally heterogeneous and contains fragments of polycrystalline and monocrystalline quartz, mica, chert, and metamorphic and sedimentary rocks. Quantitative analyses indicate that Desmoinesian sandstones were derived from complex sedimentary and metamorphic source areas. Petrofacies I sandstones are restricted to the southwestern part of the Anadarko basin and the northeast Oklahoma platform, whereas petrofacies II and III sandstones are distributed throughout the study area. The distribution of petrofacies within the region suggests a model of source-area interaction and cratonic sediment recycling.
Statistical Analysis of Abnormal Electric Power Grid Behavior
Ferryman, Thomas A.; Amidan, Brett G.
2010-10-30T23:59:59.000Z
Pacific Northwest National Laboratory is developing a technique to analyze Phasor Measurement Unit (PMU) data to identify typical patterns, atypical events, and precursors to a blackout or other undesirable event. The approach combines a data-driven multivariate analysis with an engineering-model approach. The method identifies atypical events, provides a plain-English description of each event, and offers drill-down graphics for detailed investigations. The tool can be applied to the entire grid, to individual organizations (e.g., TVA, BPA), or to specific substations (e.g., TVA_CUMB). The tool is envisioned for (1) event investigations, (2) overnight processing to generate a Morning Report that characterizes the previous day's activity relative to activity over the preceding 10-30 days, and (3) potentially near-real-time operation to support grid operators. This paper presents the current status of the tool and illustrations of its application to real-world PMU data collected in three 10-day periods in 2007.
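The data-driven multivariate piece of such a tool can be sketched as an atypicality score. The sketch below (synthetic data, not PNNL's actual method) flags observations that are far, in Mahalanobis distance, from a multivariate baseline:

```python
import numpy as np

def atypicality_scores(X):
    """Mahalanobis distance of each observation from the sample mean.

    X: (n_obs, n_vars) matrix of multivariate measurements, e.g. one row
    per time step of PMU-like signals.  Larger scores flag atypical events.
    """
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diffs = X - mu
    # quadratic form d_i^T Sigma^{-1} d_i for every row i at once
    return np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))

rng = np.random.default_rng(0)
baseline = rng.normal(size=(500, 3))      # typical behavior
event = np.array([[8.0, -8.0, 8.0]])      # injected atypical event
scores = atypicality_scores(np.vstack([baseline, event]))
print(scores[-1] > scores[:-1].max())     # the event scores highest
```

The threshold for declaring an "atypical event" would in practice come from the empirical distribution of baseline scores.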
ATP binding to a multisubunit enzyme: statistical thermodynamics analysis
Yunxin Zhang
2012-03-22T23:59:59.000Z
Due to inter-subunit communication, multisubunit enzymes usually hydrolyze ATP in a concerted fashion. However, the principle of this process remains poorly understood. In this study, a simple model is presented from the viewpoint of statistical thermodynamics. In this model, we assume that the binding of ATP changes the potential of the corresponding enzyme subunit, and that the degree of this change depends on the state of its adjacent subunits. The probability of the enzyme being in a given state satisfies the Boltzmann distribution. Although simple, this model fits recent experimental data for the chaperonin TRiC/CCT well. From this model, the dominant state of TRiC/CCT can be obtained. This study provides a new way to understand biophysical processes through statistical thermodynamics analysis.
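The kind of model described, occupancy probabilities following a Boltzmann distribution with nearest-neighbour coupling between subunits, can be sketched as follows. The parameter values are purely illustrative, not the fitted TRiC/CCT values:

```python
import itertools
import math

# Hypothetical parameters in units of kT: binding energy per occupied
# subunit, and an extra coupling energy per occupied adjacent pair.
E_BIND = -1.0     # energy change when ATP binds a subunit (favorable)
E_COUPLE = -0.5   # cooperativity between adjacent occupied subunits

def boltzmann_weights(n_subunits, mu=0.0):
    """Boltzmann probability of each occupancy pattern of a ring of subunits."""
    states = list(itertools.product([0, 1], repeat=n_subunits))
    weights = []
    for s in states:
        energy = sum(E_BIND * si for si in s)
        # ring topology: each subunit couples to its right neighbour
        energy += sum(E_COUPLE * s[i] * s[(i + 1) % n_subunits]
                      for i in range(n_subunits))
        weights.append(math.exp(-(energy - mu * sum(s))))
    z = sum(weights)                         # partition function
    return {s: w / z for s, w in zip(states, weights)}

probs = boltzmann_weights(4)
dominant = max(probs, key=probs.get)
print(dominant)   # with attractive binding and coupling, the fully bound ring dominates
```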
A Statistical Analysis of Bicycle Rider Performance: The Impact of Gender on Riders' Performance at Signalized Intersections
Wheeler, Conrad, and Figliozzi; Bertini, Robert L.
Statistical analysis of cascading failures in power grids
Chertkov, Michael [Los Alamos National Laboratory; Pfitzner, Rene [Los Alamos National Laboratory; Turitsyn, Konstantin [Los Alamos National Laboratory
2010-12-01T23:59:59.000Z
We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.
Statistical Analysis and Modeling of Occupancy Patterns in Open-Plan Offices
LBNL-6080E, Ernest Orlando Lawrence Berkeley National Laboratory. Applies statistical methods to analyze occupancy status based on measured lighting-switch data in five open-plan offices.
Distributed Multivariate Regression Using Wavelet-based Collective Data Mining.
Kargupta, Hilol
Presents a method for distributed multivariate regression using wavelet-based Collective Data Mining (CDM), an approach to the analysis of distributed, heterogeneous databases with distinct feature spaces.
Li, Haijun
Dependence Comparison of Multivariate Extremes. Haijun Li, Department of Mathematics, Washington State University. Presented at IWAP12, Jerusalem. Topics: multivariate extremes, dependence comparison, stochastic tail order.
Tatiana G. Levitskaia; James M. Peterson; Emily L. Campbell; Amanda J. Casella; Dean R. Peterman; Samuel A. Bryan
2013-12-01T23:59:59.000Z
In liquid–liquid extraction separation processes, accumulation of organic solvent degradation products is detrimental to the process robustness, and frequent solvent analysis is warranted. Our research explores the feasibility of online monitoring of the organic solvents relevant to used nuclear fuel reprocessing. This paper describes the first phase of developing a system for monitoring the tributyl phosphate (TBP)/n-dodecane solvent commonly used to separate used nuclear fuel. In this investigation, the effect of extraction of nitric acid from aqueous solutions of variable concentrations on the quantification of TBP and its major degradation product dibutylphosphoric acid (HDBP) was assessed. Fourier transform infrared (FTIR) spectroscopy was used to discriminate between HDBP and TBP in the nitric acid-containing TBP/n-dodecane solvent. Multivariate analysis of the spectral data facilitated the development of regression models for HDBP and TBP quantification in real time, enabling online implementation of the monitoring system. The predictive regression models were validated using TBP/n-dodecane solvent samples subjected to high-dose external γ-irradiation. The predictive models were translated to flow conditions using a hollow fiber FTIR probe installed in a centrifugal contactor extraction apparatus, demonstrating the applicability of the FTIR technique coupled with multivariate analysis for the online monitoring of the organic solvent degradation products.
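A multivariate calibration of this general kind, predicting an analyte concentration from several spectral channels by regression, can be sketched with synthetic data. This toy least-squares fit is not the paper's actual regression model (which is built on FTIR spectra):

```python
import numpy as np

# Synthetic "spectra": each sample has a few absorbance channels whose
# linear combination (plus small noise) gives the analyte concentration.
rng = np.random.default_rng(5)
n_samples, n_channels = 40, 6
true_coef = rng.normal(size=n_channels)          # unknown in practice
spectra = rng.normal(size=(n_samples, n_channels))
conc = spectra @ true_coef + 0.01 * rng.normal(size=n_samples)

# Fit concentration ~ channels by ordinary least squares (with intercept).
A = np.hstack([spectra, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(A, conc, rcond=None)

# Predict the concentration for a new, unseen spectrum.
new_spectrum = rng.normal(size=n_channels)
pred = new_spectrum @ coef[:-1] + coef[-1]
truth = new_spectrum @ true_coef
print(abs(pred - truth) < 0.1)   # calibration recovers the relationship
```

Real spectroscopic calibrations typically use many correlated channels, which is why latent-variable methods such as PLS are preferred over plain least squares there.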
Gerencher, J.J. Jr.
1983-01-01T23:59:59.000Z
Multivariate statistical techniques have been applied to study interrelationships among 12 variables within a set of 277 coals representing whole-seam channel, column, and core samples obtained from each of the 6 coal provinces of the United States, and varying in rank from lignite through anthracite. The data are maintained in a computerized database at The Pennsylvania State University Coal Research Section. The variables selected are components of the elemental analysis (carbon, oxygen, organic sulfur, hydrogen, and nitrogen), selected components of the proximate analysis (volatile matter and moisture), calorific value, reflectance of vitrinite, and the relative proportions of the 3 maceral groups (total vitrinite, inertinite, and liptinite group macerals). Factor analyses performed on the entire data set and on subsets separated on the basis of rank, geographic location, and by cluster analysis indicated that rank is the most important factor in determining the amount of variation of each data set. The rank-dependent variables for the entire data set are carbon, reflectance, oxygen, volatile matter, calorific value, and moisture. The maceral groups account for the next greatest source of variation. Organic sulfur is independent of the first 2 factors and is the third most important source of variation. Cluster analyses indicated that the most significant partitioning produces 4 groups which are differentiated primarily on the basis of rank, maceral composition, and organic sulfur content. Factor analyses of the individual groups provide insights into the coalification processes of these more homogeneous coal associations.
Statistical Analysis of Tank 5 Floor Sample Results
Shine, E. P.
2013-01-31T23:59:59.000Z
Sampling has been completed for the characterization of the residual material on the floor of Tank 5 in the F-Area Tank Farm at the Savannah River Site (SRS), near Aiken, SC. The sampling was performed by Savannah River Remediation (SRR) LLC using a stratified random sampling plan with volume-proportional compositing. The plan consisted of partitioning the residual material on the floor of Tank 5 into three non-overlapping strata: two strata enclosed accumulations, and a third stratum consisted of a thin layer of material outside the regions of the two accumulations. Each of three composite samples was constructed from five primary sample locations of residual material on the floor of Tank 5. Three of the primary samples were obtained from the stratum containing the thin layer of material, and one primary sample was obtained from each of the two strata containing an accumulation. This report documents the statistical analyses of the analytical results for the composite samples. The objective of the analysis is to determine the mean concentrations and upper 95% confidence (UCL95) bounds for the mean concentrations for a set of analytes in the tank residuals. The statistical procedures employed in the analyses were consistent with the Environmental Protection Agency (EPA) technical guidance by Singh and others [2010]. Savannah River National Laboratory (SRNL) measured the sample bulk density, nonvolatile beta, gross alpha, and the radionuclide, elemental, and chemical concentrations three times for each of the composite samples. The analyte concentration data were partitioned into three separate groups for further analysis: analytes with every measurement above their minimum detectable concentrations (MDCs), analytes with no measurements above their MDCs, and analytes with a mixture of some measurement results above and below their MDCs.
The means, standard deviations, and UCL95s were computed for the analytes in the two groups that had at least some measurements above their MDCs. The identification of distributions and the selection of UCL95 procedures generally followed the protocol in Singh, Armbya, and Singh [2010]. When all of an analyte's measurements lie below their MDCs, only a summary of the MDCs can be provided. The measurement results reported by SRNL are listed, and the results of this analysis are reported. The data were generally found to follow a normal distribution, and to be homogenous across composite samples.
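A minimal sketch of the UCL95 calculation described, assuming approximately normal data and the Student-t route. The tabulated t quantiles are standard values, and the concentrations below are hypothetical, not Tank 5 results:

```python
import math
import statistics

# One-sided 95% upper confidence limit on the mean, for approximately
# normal data: UCL95 = xbar + t_{0.95, n-1} * s / sqrt(n).
# t_{0.95, df} values from standard tables; extend as needed.
T95 = {2: 2.920, 3: 2.353, 4: 2.132, 5: 2.015, 8: 1.860, 9: 1.833}

def ucl95(measurements):
    n = len(measurements)
    mean = statistics.fmean(measurements)
    s = statistics.stdev(measurements)          # sample standard deviation
    return mean + T95[n - 1] * s / math.sqrt(n)

# Hypothetical analyte concentrations from replicate measurements:
conc = [1.02, 0.97, 1.05, 1.01, 0.99, 1.03]
print(round(ucl95(conc), 3))
```

For non-normal or censored (below-MDC) data, the EPA guidance cited above prescribes other UCL95 estimators; this sketch covers only the normal case.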
Statistical Analysis of a Telephone Call Center: A Queueing-Science Perspective
Shen, Haipeng
Lawrence Brown et al. Corresponding author: Haipeng Shen, Department of Statistics and Operations Research, University of North Carolina. Lawrence Brown is Professor, Department of Statistics, The Wharton School, University of Pennsylvania.
Statistical Analysis of Cross-Tabs D. White and A. Korotayev
White, Douglas R.
Chapter 5. D. White and A. Korotayev, 2 Jan 2004 (new text added 30 Oct 2004; HTML links are live). Introduction: descriptive statistics includes collecting data and gaining access to a database through a program such as SPSS, the Statistical Package for the Social Sciences.
Application of statistical learning theory to plankton image analysis
Hu, Qiao, Ph. D. Massachusetts Institute of Technology
2006-01-01T23:59:59.000Z
A fundamental problem in limnology and oceanography is the inability to quickly identify and map distributions of plankton. This thesis addresses the problem by applying statistical machine learning to video images collected ...
Statistical analysis of large-scale structure in the Universe
Martin Kerscher
1999-12-15T23:59:59.000Z
Methods for the statistical characterization of the large-scale structure in the Universe will be the main topic of the present text. The focus is on geometrical methods, mainly Minkowski functionals and the J-function. Their relations to standard methods used in cosmology and spatial statistics and their application to cosmological datasets will be discussed. This work is not only meant as a short review for cosmologists, but also attempts to illustrate these morphological methods and to make them accessible to scientists from other fields. Consequently, a short introduction to the standard picture of cosmology is given.
Statistical Inference for Exploratory Data Analysis and Model Diagnostics
Buja, Andreas
Wharton School, University of Pennsylvania; Iowa State University; Fred Hutchinson Cancer Research Center. Keywords: permutation tests, rotation tests, statistical graphics, visual data mining, simulation. EDA comprises analytic activities that rely primarily on visual displays and only secondarily on numeric summaries.
A SHARP ANALYSIS ON THE ASYMPTOTIC BEHAVIOR OF THE DURBIN-WATSON STATISTIC FOR THE FIRST-ORDER
Paris-Sud XI, Université de
We provide a sharp analysis of the asymptotic behavior of the Durbin-Watson statistic, and propose a new bilateral statistical test for residuals.
Toribio, M C; Giovanelli, R; Haynes, M P; Martin, A
2011-01-01T23:59:59.000Z
This is the second paper of two reporting results from a study of the HI content and stellar properties of nearby galaxies detected by the Arecibo Legacy Fast ALFA blind 21-cm line survey and the Sloan Digital Sky Survey in a 2160 deg^2 region covered by both surveys. We apply strategies of multivariate data analysis to a complete HI flux-limited subset of 1624 objects extracted from the control sample of HI emitters assembled by Toribio et al. (2011a) in order to: i) investigate the correlation structure of the space defined by an extensive set of observables describing gas-rich systems; ii) identify the intrinsic parameters that best define their HI content; and iii) explore the scaling relations arising from the joint distributions of the quantities most strongly correlated with the HI mass. The principal component analysis performed over a set of five galaxy properties reveals that they are strongly interrelated, supporting previous claims that nearby HI emitters show a high degree of correlation. The bes...
Rutledge, Steven
Software allows radar products, such as bulk hydrometeor identification, Doppler-derived wind vectors, and rainfall rate, to be viewed in real time for in-depth analysis. The software was successfully tested during the summers of 2004 and 2005.
Kockelman, Kara M.
Published in Accident Analysis & Prevention; Department of Civil, Architectural and Environmental Engineering (William J. Murray Jr. Fellow). This work examines the relationship between 3-year pedestrian crash rates; the distinction between severe (fatal or incapacitating) and non-severe crash rates reflects latent covariates that have impacts across severity levels.
Fazzio, Thomas J. (Thomas Joseph)
2010-01-01T23:59:59.000Z
This paper attempts to understand the price dynamics of the North American natural gas market through a statistical survey that includes an analysis of the variables influencing the price and volatility of this energy ...
Parallel and Statistical Analysis and Modeling of Nanometer VLSI Systems
Liu, Xue-Xin
2013-01-01T23:59:59.000Z
Traditional fan-based cooling techniques are not sufficient for the cooling problems of stacked layers; advanced cooling techniques such as integrated cooling, together with fast and accurate thermal analysis techniques, are needed.
Defect site prediction based upon statistical analysis of fault signatures
Trinka, Michael Robert
2004-09-30T23:59:59.000Z
Good failure analysis is the ability to determine the site of a circuit defect quickly and accurately. We propose a method for defect site prediction that is based on a site's probability of excitation, making no assumptions about the type...
University of Illinois at Chicago; Montana State University; Bhardwaj, Chhavi; Cui, Yang; Hofstetter, Theresa; Liu, Suet Yi; Bernstein, Hans C.; Carlson, Ross P.; Ahmed, Musahid; Hanley, Luke
2013-04-01T23:59:59.000Z
Vacuum ultraviolet (VUV) photon energies from 7.87 to 10.5 eV were used in laser desorption postionization mass spectrometry (LDPI-MS) to analyze biofilms comprised of binary cultures of interacting microorganisms. The effect of photon energy was examined using both tunable synchrotron and laser sources of VUV radiation. Principal components analysis (PCA) was applied to the MS data to differentiate species in Escherichia coli-Saccharomyces cerevisiae coculture biofilms. PCA of LDPI-MS also differentiated individual E. coli strains in a biofilm comprised of two interacting gene deletion strains, even though these strains differed from the wild type K-12 strain by no more than four gene deletions each out of approximately 2000 genes. PCA treatment of 7.87 eV LDPI-MS data separated the E. coli strains into three distinct groups: two "pure" groups and a mixed region. Furthermore, the "pure" regions of the E. coli cocultures showed greater variance by PCA when analyzed by 7.87 eV photon energies than by 10.5 eV radiation. Comparison of the 7.87 and 10.5 eV data is consistent with the expectation that the lower photon energy selects a subset of low ionization energy analytes while 10.5 eV is more inclusive, detecting a wider range of analytes. These two VUV photon energies therefore give different spreads via PCA, and their respective use in LDPI-MS constitutes an additional experimental parameter to differentiate strains and species.
Statistical Analysis of Spatial Point Patterns on Deep Seismic Reflection Data: A Preliminary Test
Schmidt, Volker
Spatial point pattern analysis may provide a new tool for analysing spatial variations in reflection data. The purpose of this paper is to present spatial point pattern analyses of seismic reflection data.
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01T23:59:59.000Z
A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
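The core PCA step such a program performs, reducing multivariate readings to a few linear combinations that capture most of the variation, can be sketched via the singular value decomposition. The data here are synthetic, not NURE radiometric data:

```python
import numpy as np

def pca(X, n_components=3):
    """Project multivariate data onto its leading principal components.

    X: (n_samples, n_vars), e.g. radiometric channel readings per survey
    point.  Returns the component scores and the fraction of total
    variance captured by each retained component.
    """
    Xc = X - X.mean(axis=0)                     # center each variate
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T           # linear combinations
    explained = (S**2) / (S**2).sum()           # variance fractions
    return scores, explained[:n_components]

# Hypothetical 5-channel data whose variance lies mostly in 2 directions:
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(200, 5))
scores, expl = pca(X, n_components=2)
print(expl.sum() > 0.95)    # two components capture almost all variance
```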
Macdonald Polynomials and Multivariable Basic Hypergeometric Series
Michael J. Schlosser
2007-03-30T23:59:59.000Z
We study Macdonald polynomials from a basic hypergeometric series point of view. In particular, we show that the Pieri formula for Macdonald polynomials and its recently discovered inverse, a recursion formula for Macdonald polynomials, both represent multivariable extensions of the terminating very-well-poised 6-phi-5 summation formula. We derive several new related identities including multivariate extensions of Jackson's very-well-poised 8-phi-7 summation. Motivated by our basic hypergeometric analysis, we propose an extension of Macdonald polynomials to Macdonald symmetric functions indexed by partitions with complex parts. These appear to possess nice properties.
Identification of faults in a multivariate process with Bayesian network
Boyer, Edmond
In the context of multivariate processes, we propose modelling control charts (e.g., methods based on Principal Component Analysis and Projection to Latent Structures) in a Bayesian network. Key words: multivariate SPC, T2 decomposition, Bayesian network.
A new sliced inverse regression method for multivariate response regression
Paris-Sud XI, Université de
A new sliced inverse regression method for multivariate response regression. Raphaël Coudret et al. In analyzing large datasets, multivariate response regression builds on sliced inverse regression (SIR), a well-known method to estimate the EDR space, from which the link function can be estimated.
PATHS: Analysis of PATH Duration Statistics and their Impact on Reactive MANET Routing Protocols
Krishnamachari, Bhaskar
Department of Electrical Engineering, University of Southern California. {narayans,fbai,bkrishna,helmy}@usc.edu. Abstract: We develop a detailed approach to study how mobility impacts the performance of reactive MANET routing protocols.
Paris-Sud XI, Université de
...with statistical analysis. Anthony Ung, Laure Malherbe, Frederik Meleux, Bertrand Bessagnet, Laurence Rouil and the MACC-II modeling team, INERIS institute, Paris, France. Corresponding author: Anthony.ung@ineris.fr. QA/QC dossiers are available on the MACC project website for each model.
Statistical analysis of electric power production costs JORGE VALENZUELA and MAINAK MAZUMDAR*
Mazumdar, Mainak
There must be sufficient production at all times to meet the demand for electric power. If a low-cost generating unit fails, there is uncertainty in the forecast of production costs.
ECOGRAPHY 25: 553557, 2002 Integrating the statistical analysis of spatial data in ecology
Liebhold, Andrew
Ecography 25: 553-557, 2002. In many areas of ecology there is an increasing emphasis on spatial relationships. Often ecologists are interested in new ways of analyzing data with the objective of integrating the statistical analysis of spatial data in ecology.
Paliouras, George
Georgios Paliouras, Christos Papatheodorou, et al.; Division of Applied Technologies, National Centre for Scientific Research (NCSR) "Demokritos". Concerned with the extraction of meta-knowledge from the Web, in particular knowledge about Web usage, which is invaluable.
A Prediction Method for Job Runtimes on Shared Processors: Survey, Statistical Analysis and New
van der Mei, Rob
Requires predictions of the expected computation times of jobs on remote hosts. Currently, there are no effective prediction methods available that cope with the ever-changing running times of jobs in a grid environment.
Statistical analysis of wind energy in Chile David Watts a,b,*, Danilo Jara a
Catholic University of Chile (Universidad CatÃ³lica de Chile)
December 2010. Keywords: wind, wind speed, energy, capacity factor, electricity, Chile. Abstract: wind will bear a role in any future national energy generation matrix; the study aims at understanding the local wind resource.
Statistical analysis of 4-year observations of aerosol sizes in a semi-rural continental environment
Lee, Shan-Hu
Formation of new aerosol particles via gas-to-particle conversion is an important process, key to understanding how new particle formation (NPF) processes lead to formation of cloud condensation nuclei (CCN).
Statistical Analysis of High-Cycle Fatigue Behavior of Friction Stir Welded AA5083-H321
Grujicic, Mica
M. Grujicic. Keywords: AA5083, fatigue behavior, friction stir welding, maximum likelihood estimation. Friction stir welding (FSW) is a relatively new solid-state metal-joining process.
AIAA-2003-0867 STATISTICAL ANALYSIS OF INFLOW AND STRUCTURAL RESPONSE DATA
Manuel, Lance
AIAA-2003-0867. University of Texas at Austin, Austin, TX 78712; Wind Energy Technology Department, Sandia National Laboratories. The LIST program is gathering inflow and structural response data on a modified version of the Micon 65/13 wind turbine.
Transient Analysis of Data Traffic in Cognitive Radio Networks: A Non-equilibrium Statistical
Li, Husheng
In cognitive radio systems, a secondary user (without license) can access a licensed spectrum channel if there is no primary user (with license) transmitting over this channel, improving the efficiency of spectrum utilization.
Development of Statistical Energy Analysis Tools for Toyota Motor Engineering & Manufacturing
Chen, J; Collins, Ro.; Gao, G.; Schaffer, D.; Wu, J.
2014-01-01T23:59:59.000Z
Development of Statistical Energy Analysis Tools for Toyota Motor Engineering & Manufacturing. Duke University, Bass Connections in Energy. IETC, May 21, 2014. Jason Chen, Robert Collins, Gary Gao, Daniel Schaffer, Jill Wu. ESL-IE-14-05-06, Proceedings of the Thirty-Sixth Industrial Energy Technology Conference, New Orleans, LA, May 20-23, 2014. Presentation agenda: project introduction and goals; Duke team's energy consumption models; analysis of Toyota's current consumption model; Duke vs...
Peter Bierhorst
2014-09-30T23:59:59.000Z
Recent experiments have reached detection efficiencies sufficient to close the detection loophole, testing the Clauser-Horne (CH) version of Bell's inequality. For a similar future experiment to be completely loophole-free, it will be important to have discrete experimental trials with randomized measurement settings for each trial, and the statistical analysis should not overlook the possibility of a local state varying over time with possible dependence on earlier trials (the "memory loophole"). In this paper, a mathematical model for such a CH experiment is presented, and a method for statistical analysis that is robust to memory effects is introduced. Additionally, a new method for calculating exact p-values for martingale-based statistics is described; previously, only non-sharp upper bounds derived from the Azuma-Hoeffding inequality have been available for such statistics. This improvement decreases the required number of experimental trials to demonstrate non-locality. The statistical techniques are applied to the data of recent experiments and found to perform well.
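The advantage of exact tail probabilities over Azuma-Hoeffding-style bounds can be illustrated in the simplest i.i.d. setting, a Binomial(n, 1/2) null hypothesis. This toy comparison is not the paper's martingale analysis, but it shows why sharper p-values reduce the trials needed:

```python
from math import comb, exp

def exact_p_value(n, k):
    """P(X >= k) for X ~ Binomial(n, 1/2): exact tail probability."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

def hoeffding_bound(n, k):
    """Azuma-Hoeffding style upper bound on the same tail:
    P(X - n/2 >= t) <= exp(-2 t^2 / n)."""
    t = k - n / 2
    return exp(-2 * t * t / n)

n, k = 100, 65                 # 65 "successes" in 100 trials
p_exact = exact_p_value(n, k)
p_bound = hoeffding_bound(n, k)
print(p_exact < p_bound)       # the exact p-value is sharper (smaller)
```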
Statistical simulation procedures
Tremelling, Robert Norman
2012-06-07T23:59:59.000Z
Ringer. This thesis investigates methods for estimating statistical distribution functions which cannot be easily solved by theoretical techniques. The stratified Monte Carlo procedure of Ringer and Suharto [11] is extended to the multivariate case and applied to two problems, one practical, the other theoretical. This procedure is shown to be more precise than simple Monte Carlo simulation. A numerical method proposed by Soserville [12] is also extended to the multivariate case, and compared...
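The precision gain of stratified over simple Monte Carlo can be sketched on a one-dimensional integral. This is a toy illustration of the general idea, not the thesis's multivariate procedure:

```python
import random
import statistics

def simple_mc(f, n, rng):
    """Plain Monte Carlo estimate of the integral of f over [0, 1)."""
    return statistics.fmean(f(rng.random()) for _ in range(n))

def stratified_mc(f, n, rng):
    """Split [0, 1) into n equal strata and draw one point from each."""
    return statistics.fmean(f((i + rng.random()) / n) for i in range(n))

f = lambda x: x * x            # true integral over [0, 1] is 1/3
rng = random.Random(42)
n, reps = 100, 200
err = lambda est: abs(est - 1 / 3)
simple_errs = [err(simple_mc(f, n, rng)) for _ in range(reps)]
strat_errs = [err(stratified_mc(f, n, rng)) for _ in range(reps)]
# Stratification forces even coverage of the domain, so its error is
# typically far smaller at the same sample budget.
print(statistics.fmean(strat_errs) < statistics.fmean(simple_errs))
```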
Towards Better Graphics for Multivariate Analysis
Thioulouse, Jean
Keywords: interactive graphics, factor map, principal component analysis. Dynamic graphics are gaining more and more popularity with the availability of powerful microcomputers and user-friendly graphical interfaces.
A Multivariate Moving Average Control Chart for Photovoltaic Processes
Chunchom Pongchavalit
Abstract—Because the electrical metrics that describe photovoltaic cell performance are inherently multivariate in nature, use of a univariate (one-variable) statistical process control chart can have important limitations. Development of a comprehensive process control strategy is known to be significantly beneficial in reducing the process variability that ultimately drives up the manufacturing cost of photovoltaic cells. The multivariate moving average (MMA) chart is applied to the electrical metrics of photovoltaic cells to illustrate the improved sensitivity to process variability that this method of control charting offers. The results show that the ability of the MMA chart to expand to as many variables as needed suggests an application with multiple photovoltaic electrical metrics being used in concert to determine the process's state of control. Keywords—multivariate moving average control chart, photovoltaic process control, multivariate system.
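A minimal sketch of an MMA-style monitoring statistic: the T² value of a moving-average vector against an in-control mean and covariance. The data are synthetic, and the in-control parameters are assumed known, which real charts would estimate from a reference sample:

```python
import numpy as np

def mma_t2(X, w, mu, sigma_inv):
    """Multivariate moving-average T^2 statistic at each time step.

    X: (n, p) stream of multivariate measurements; w: window length;
    mu, sigma_inv: in-control mean and inverse covariance.  The moving
    average of a window of k points has covariance Sigma/k, hence the
    factor k in the quadratic form below.
    """
    t2 = []
    for i in range(len(X)):
        window = X[max(0, i - w + 1): i + 1]
        m = window.mean(axis=0) - mu
        k = len(window)                   # windows are shorter at start-up
        t2.append(k * m @ sigma_inv @ m)
    return np.array(t2)

rng = np.random.default_rng(3)
p, w = 3, 5
in_control = rng.normal(size=(60, p))
shifted = rng.normal(size=(20, p)) + 1.5    # mean shift after step 60
X = np.vstack([in_control, shifted])
t2 = mma_t2(X, w, mu=np.zeros(p), sigma_inv=np.eye(p))
print(t2[70:].mean() > t2[:60].mean())      # the shift inflates the statistic
```

An out-of-control signal would be declared when T² exceeds a chi-square-based control limit; averaging over a window is what gives the chart its sensitivity to small sustained shifts.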
Statistical analysis of the electrical breakdown time delay distributions in krypton
Maluckov, Cedomir A.; Karamarkovic, Jugoslav P.; Radovic, Miodrag K.; Pejovic, Momcilo M. [Technical Faculty in Bor, University of Belgrade, Vojske Jugoslavije 24, 19210 Bor (Serbia and Montenegro); Faculty of Civil Engineering and Architecture, University of Nis, Beogradska 14, 18000 Nis (Serbia and Montenegro); Faculty of Sciences and Mathematics, University of Nis, P.O. Box 224, 18001 Nis (Serbia and Montenegro); Faculty of Electronic Engineering, University of Nis, P.O. Box 73, 18001 Nis (Serbia and Montenegro)
2006-08-15T23:59:59.000Z
The statistical analysis of the experimentally observed electrical breakdown time delay distributions in a krypton-filled diode tube at 2.6 mbar is presented. The experimental distributions are obtained on the basis of 1000 successive and independent measurements. The theoretical electrical breakdown time delay distribution is evaluated as the convolution of the statistical time delay, which follows an exponential distribution, with the discharge formative time, which follows a Gaussian distribution. The distribution parameters are estimated by stochastic modelling of the time delay distributions and by comparing them with the experimental distributions for different relaxation times, voltages, and intensities of UV radiation. The transition of the distribution shapes, from Gaussian-like to exponential-like, is investigated by calculating the corresponding skewness and excess kurtosis parameters. It is shown that the mathematical model based on the convolution of two random variable distributions describes the experimentally obtained time delay distributions and permits the separation of the total breakdown time delay into its statistical and formative components.
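The convolution model described here (exponential statistical delay plus Gaussian formative time) is easy to simulate: summing samples of the two random variables realizes the convolution of their densities. The sketch below, with assumed parameter values rather than the paper's fitted ones, reproduces the shape transition the abstract describes, measured by sample skewness.

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardized central moment)."""
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

def total_delay(n, tau_stat, mu_form, sigma_form, rng):
    """Total breakdown delay = exponential statistical delay (mean tau_stat)
    + Gaussian formative time N(mu_form, sigma_form^2)."""
    return rng.exponential(tau_stat, n) + rng.normal(mu_form, sigma_form, n)

rng = np.random.default_rng(1)
# long statistical delay dominates -> exponential-like, strongly skewed
d_exp = total_delay(50_000, 100.0, 10.0, 2.0, rng)
# short statistical delay -> Gaussian-like, nearly symmetric
d_gauss = total_delay(50_000, 1.0, 10.0, 2.0, rng)
```

Skewness near 2 indicates the exponential-dominated regime; skewness near 0 indicates the Gaussian-dominated regime, matching the transition investigated in the paper.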
Statistical Methods for Enhanced Metrology in Semiconductor/Photovoltaic Manufacturing
Zeng, Dekong
2012-01-01T23:59:59.000Z
...statistical process control (SPC) charts. The concept is to ... Multivariate SPC charts utilize high-dimensional ... The limits for a multivariate SPC chart can be defined with ...
Statistical Process Variation Analysis of a Graphene FET based LC-VCO for WLAN Applications
Mohanty, Saraju P.
Statistical Process Variation Analysis of a Graphene FET based LC-VCO for WLAN Applications. Md Abir ..., AbirKhan@my.unt.edu, saraju.mohanty@unt.edu, and elias.kougianos@unt.edu. Abstract--Graphene, which is a single-atom layer ... frequency electronics due to low Ion/Ioff ratio. In this paper, design exploration of a graphene FET (GFET) based LC...
Development of a thermobalance and analysis of lignites by thermogravimetric and statistical methods
Ferguson, James Allen
1984-01-01T23:59:59.000Z
...coal or lignite is its calorific value, Q, the heat it produces when combusted. A simple, inexpensive, and rapid method for accurately predicting Q is needed because the presently available methods are complicated, expensive, and take a great... DEVELOPMENT OF A THERMOBALANCE AND ANALYSIS OF LIGNITES BY THERMOGRAVIMETRIC AND STATISTICAL METHODS. A Thesis by JAMES ALLEN FERGUSON. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements...
Multivariate Forecast Evaluation And Rationality Testing
Komunjer, Ivana; OWYANG, MICHAEL
2007-01-01T23:59:59.000Z
...1062-1088. MULTIVARIATE FORECASTS ... Chaudhuri, P. (1996): "On ..." ... Kingdom. ... Kirchgässner, G., and U. K. ... (2005): "Estimation and Testing of Forecast Rationality under ..."
HotPatch Web Gateway: Statistical Analysis of Unusual Patches on Protein Surfaces
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Pettit, Frank K.; Bowie, James U.(DOE-Molecular Biology Institute)
HotPatch finds unusual patches on the surface of proteins and computes just how unusual they are (patch rareness) and how likely each patch is to be of functional importance (functional confidence, FC). The statistical analysis is done by comparing your protein's surface against the surfaces of a large set of proteins whose functional sites are known. Optionally, HotPatch can also write a script that will display the patches on the structure when the script is loaded into some common molecular visualization programs. HotPatch generates complete statistics (functional confidence and patch rareness) on the most significant patches on your protein. For each property you choose to analyze, you'll receive an email with two attachments: a PDB-format file in which atomic B-factors (temperature factors) are replaced by patch indices, and whose Header Remarks give statistical scores; and a PDB-format file in which atomic B-factors are replaced by the raw values of the property used for patch analysis (for example, hydrophobicity instead of hydrophobic patches). [Copied with edits from http://hotpatch.mbi.ucla.edu/]
Eason, E.D.; Merton, A.A.; Wright, J.E.
1996-05-01T23:59:59.000Z
The effects of Li, pH, and H{sub 2} on primary water stress corrosion cracking (PWSCC) of Alloy 600 were investigated for temperatures between 320 and 330{degrees}C. Specimens included in the study were reverse U-bends (RUBs) made from several different heats of Alloy 600. The characteristic life, {eta}, which represents the time until 63.2% of the population initiates PWSCC, was computed using a modified Weibull statistical analysis algorithm and was analyzed for effects of the water chemistry variables previously mentioned. It was determined that the water chemistry variables are less influential than the metallurgical characteristics defined by the heat, heat treatment, and initial stress state of the specimen (diameter and style of RUB); the maximum impact of chemistry effects was 0.13 to 0.59 standard deviations, compared to a range of three (3) standard deviations for all variables. A first-order model was generated to estimate the effect of changes in pH, Li, and H{sub 2} concentrations on the characteristic life. The characteristic time to initiate cracks, {eta}, is not sensitive to Li and H{sub 2} concentrations in excess of 3.5 ppm and 25 ml/kg, respectively. Below these values, (1) {eta} decreases by {approximately}20% when [Li] is increased from 0.7 to 3.5 ppm; (2) {eta} decreases by {approximately}9% when [H{sub 2}] is increased from 13.1 to 25.0 ml/kg; and (3) {eta} decreases by {approximately}14% when pH is increased from 7.0 to 7.4, in each case holding the other two variables constant.
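The characteristic life {eta} is the Weibull scale parameter: the time by which 63.2% of specimens have initiated cracking. A common textbook way to estimate it is median-rank regression on the linearized Weibull CDF; the sketch below uses that plain approach, not the modified algorithm of the report, and all names are illustrative.

```python
import numpy as np

def weibull_eta_beta(times):
    """Estimate the Weibull characteristic life eta (63.2% point) and
    shape beta by median-rank regression on the linearized CDF:
    ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)."""
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(np.log(t), y, 1)
    eta = np.exp(-intercept / beta)               # F(eta) = 1 - e**-1 = 63.2%
    return eta, beta
```

On simulated complete (uncensored) failure-time data the regression recovers both parameters; real PWSCC data with suspensions would need a censoring-aware method instead.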
Hofland, G.S.; Barton, C.C.
1990-10-01T23:59:59.000Z
The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig.
Statistical and risk analysis for the measured and predicted axial response of 100 piles
Perdomo, Dario
1986-01-01T23:59:59.000Z
...(Chair of Committee), Harry M. Coyle (Member), Jeffrey D. Hart (Member), Donald McDonald (Head of Department). May 1986. ABSTRACT: Statistical and Risk Analysis for the Measured and Predicted Axial Response of 100 Piles (December 1985), Dario Perdomo, B.S... encouragement and financial support. Sincere thanks are expressed to Dr. Jean-Louis Briaud and Mr. Larry Tucker for their guidance and advice throughout the course of this research. The assistance of Dr. Harry Coyle and Dr. Jeffrey Hart is also...
Applications of Minkowski Functionals to the Statistical Analysis of Dark Matter Models
Michael Platzoeder; Thomas Buchert
1995-09-04T23:59:59.000Z
A new method for the statistical analysis of 3D point processes, based on the family of Minkowski functionals, is explained and applied to modelled galaxy distributions generated by a toy model and by cosmological simulations of the large-scale structure in the Universe. These measures are sensitive to both geometrical and topological properties of spatial patterns and appear to be very effective in discriminating different point processes. Moreover, by means of conditional subsampling, different building blocks of large-scale structure, such as sheets, filaments, and clusters, can be detected and extracted from a given distribution.
In-Situ Statistical Analysis of Autotune Simulation Data using Graphical Processing Units
Ranjan, Niloo [ORNL; Sanyal, Jibonananda [ORNL; New, Joshua Ryan [ORNL
2013-08-01T23:59:59.000Z
Developing accurate building energy simulation models to assist energy efficiency at speed and scale is one of the research goals of the Whole-Building and Community Integration group, which is part of the Building Technologies Research and Integration Center (BTRIC) at Oak Ridge National Laboratory (ORNL). The aim of the Autotune project is to speed up the automated calibration of building energy models to match measured utility or sensor data. The workflow of this project takes input parameters and runs EnergyPlus simulations on Oak Ridge Leadership Computing Facility's (OLCF) computing resources such as Titan, the world's second fastest supercomputer. Multiple simulations run in parallel on nodes having 16 processors each and a Graphics Processing Unit (GPU). Each node produces a 5.7 GB output file comprising 256 files from 64 simulations. Four types of output data, covering monthly, daily, hourly, and 15-minute time steps, are produced for each annual simulation. A total of 270 TB+ of data has been produced. In this project, the simulation data is statistically analyzed in situ using GPUs while annual simulations are being computed on the traditional processors. Titan, with its recent addition of 18,688 Compute Unified Device Architecture (CUDA) capable NVIDIA GPUs, has greatly extended its capability for massively parallel data processing. CUDA is used along with C/MPI to calculate statistical metrics such as sum, mean, variance, and standard deviation, leveraging GPU acceleration. The workflow developed in this project produces statistical summaries of the data, which reduces by multiple orders of magnitude the time and amount of data that needs to be stored. These statistical capabilities are anticipated to be useful for sensitivity analysis of EnergyPlus simulations.
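The in-situ statistics described (sum, mean, variance, standard deviation computed while simulations run) hinge on per-chunk partial results that can be merged across nodes. Below is a Python sketch of such mergeable one-pass statistics, standing in for the CUDA/C/MPI code and assuming the pairwise-combination formula of Chan et al.; it is not the project's implementation.

```python
import numpy as np

def partial_stats(x):
    """Per-chunk partial result: (count, sum, sum of squared deviations)."""
    x = np.asarray(x, dtype=float)
    return len(x), x.sum(), ((x - x.mean()) ** 2).sum()

def merge(a, b):
    """Combine two partial results (pairwise update of Chan et al.),
    mimicking a reduction across compute nodes."""
    na, sa, m2a = a
    nb, sb, m2b = b
    n = na + nb
    delta = sb / nb - sa / na          # difference of chunk means
    m2 = m2a + m2b + delta ** 2 * na * nb / n
    return n, sa + sb, m2

def finalize(p):
    """Turn the merged partials into mean, variance, and std."""
    n, s, m2 = p
    mean = s / n
    var = m2 / n                        # population variance
    return mean, var, np.sqrt(var)
```

Because each chunk reduces to three numbers, the data that must be stored or transferred shrinks by orders of magnitude, which is the point the abstract makes.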
Statistical analysis of narrow-band signals at setilive.org
Nikitin, Igor
2015-01-01T23:59:59.000Z
SETILive is a web project forwarding radio signals from the SETI Institute's Allen Telescope Array (ATA) for analysis by volunteers. It contains a large archive with more than 1.5 million observations of more than 7.5 thousand observation targets, including directions to exoplanets discovered by the Kepler telescope and other sources. It also supports various tools for signal collection and classification. Until recently it supported live feeds of signals from the ATA together with a feedback loop: the possibility to interrupt the schedule and repeat the observation of an interesting signal registered by sufficiently many viewers. Unfortunately, since 12-Oct-2014 the live feeds have been discontinued. We hope that the project will persist, taking into account the importance of the search subject, the worldwide interest in the topic, and the value of the already collected data. In this paper we present the results of a statistical analysis of the data stored in the SETILive archive, using the Radon transform and specially constructed...
Interactive statistical-distribution-analysis program utilizing numerical and graphical methods
Glandon, S. R.; Fields, D. E.
1982-04-01T23:59:59.000Z
The TERPED/P program is designed to facilitate the quantitative analysis of experimental data, determine the distribution function that best describes the data, and provide graphical representations of the data. This code differs from its predecessors, TEDPED and TERPED, in that a printer-plotter has been added for graphical output flexibility. The addition of the printer-plotter provides TERPED/P with a method of generating graphs that is not dependent on DISSPLA, Integrated Software Systems Corporation's confidential proprietary graphics package. This makes it possible to use TERPED/P on systems not equipped with DISSPLA. In addition, the printer plot is usually produced more rapidly than a high-resolution plot can be generated. Graphical and numerical tests are performed on the data in accordance with the user's assumption of normality or lognormality. Statistical analysis options include computation of the chi-squared statistic and its significance level and the Kolmogorov-Smirnov one-sample test confidence level for data sets of more than 80 points. Plots can be produced on a Calcomp paper plotter, a FR80 film plotter, or a graphics terminal using the high-resolution, DISSPLA-dependent plotter or on a character-type output device by the printer-plotter. The plots are of cumulative probability (abscissa) versus user-defined units (ordinate). The program was developed on a Digital Equipment Corporation (DEC) PDP-10 and consists of 1500 statements. The language used is FORTRAN-10, DEC's extended version of FORTRAN-IV.
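The normality/lognormality checks that TERPED/P performs can be illustrated with a one-sample Kolmogorov-Smirnov statistic: the largest gap between the empirical CDF and the candidate model CDF. This is a generic Python sketch with assumed parameters, not the FORTRAN-10 code itself.

```python
import math
import numpy as np

def ks_statistic(data, cdf):
    """One-sample Kolmogorov-Smirnov statistic D = sup |F_n(x) - F(x)|."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    F = np.array([cdf(v) for v in x])
    d_plus = np.max(np.arange(1, n + 1) / n - F)    # F_n above model
    d_minus = np.max(F - np.arange(0, n) / n)       # model above F_n
    return max(d_plus, d_minus)

def normal_cdf(mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return lambda v: 0.5 * (1.0 + math.erf((v - mu) / (sigma * math.sqrt(2.0))))
```

For lognormal data, D is small when the log of the data is tested against the matching normal CDF and larger when the raw data is tested against a normal fit, which is exactly the normal-versus-lognormal discrimination the program supports.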
STATISTICAL METHODS
Delorme, Arnaud
STATISTICAL METHODS. Arnaud Delorme, Swartz Center for Computational Neuroscience, ...@salk.edu. Keywords: statistical methods, inference, models, clinical, software, bootstrap, resampling, PCA, ICA. Abstract: Statistics represents that body of methods by which characteristics of a population are inferred...
Stability and error analysis on partially implicit schemes Department of Mathematics and Statistics
Sun, Tong
Department of Mathematics and Statistics, Bowling Green State University, Bowling Green, OH 43403. Abstract. Subdomain techniques have been...
DECOMPOSITION OF MULTIVARIATE DATASETS WITH STRUCTURE/ORDERING
...analysis. However, contrary to Fourier decomposition, these new variables are located in frequency as well as in location (space, time, wavelength, etc.). 1 Introduction. The maximum autocorrelation factor (MAF) analysis ... DECOMPOSITION OF MULTIVARIATE DATASETS WITH STRUCTURE/ORDERING OF OBSERVATIONS OR VARIABLES USING ...
Towards Better Graphics for Multivariate Analysis
Thioulouse, Jean
Summary. Keywords: exploratory data analysis, interactive graphics, factor map, principal component analysis. Dynamic graphics ... user-friendly graphical interfaces. Their help in analysing numerical data sets has been widely recognized, and the book...
MULTIVARIATE CONTROL CHARTS WITH A BAYESIAN NETWORK. Sylvain VERRON, Teodor TIPLICA, Abdessamad KOBI
Université Paris-Sud XI
...of a multivariate process with a Bayesian network. As discriminant analysis is easily modeled with a Bayesian ... cases of the discriminant analysis. So, we give the structure of the Bayesian network, as well as the parameters of the network, in order to detect faults in the multivariate space in the same manner as if we...
Noise-Based Volume Rendering for the Visualization of Multivariate Volumetric Data
Tierny, Julien
...to meteorology and geology require the analysis of such multivariate data and are in need of a comprehensive ... Noise-Based Volume Rendering for the Visualization of Multivariate Volumetric Data. Rostislav ... -velocity upstream (cyan). This is impossible with normal direct volume rendering (d). Abstract--Analysis...
Statistical Model Analysis of (n,p) Cross Sections and Average Energy For Fission Neutron Spectrum
Odsuren, M.; Khuukhenkhuu, G. [Nuclear Research Center, National University of Mongolia, Ulaanbaatar (Mongolia)
2011-06-28T23:59:59.000Z
Investigation of charged particle emission reaction cross sections for fast neutrons is important both to nuclear reactor technology and to the understanding of nuclear reaction mechanisms. In particular, the study of (n,p) cross sections is necessary to estimate radiation damage due to hydrogen production, nuclear heating, and transmutations in the structural materials of fission and fusion reactors. On the other hand, it is often necessary in practice to evaluate the neutron cross sections of nuclides for which no experimental data are available. Because of this, we carried out a systematic analysis of known experimental (n,p) and (n,{alpha}) cross sections for fast neutrons and observed a systematic regularity over the wide energy interval of 6-20 MeV and for a broad mass range of target nuclei. Some formulae were deduced to explain this effect using the compound, pre-equilibrium, and direct reaction mechanisms. In this paper, known experimental (n,p) cross sections averaged over the thermal fission neutron spectrum of U-235 are analyzed in the framework of the statistical model. It was shown that the experimental data are satisfactorily described by the statistical model. Also, in the case of (n,p) cross sections, the effective average neutron energy for the fission spectrum of U-235 was found to be around 3 MeV.
A DETAILED STATISTICAL ANALYSIS OF THE MASS PROFILES OF GALAXY CLUSTERS
Host, Ole [Department of Physics and Astronomy, University College London, Gower Street, London, WC1E 6BT (United Kingdom); Hansen, Steen H. [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark)
2011-07-20T23:59:59.000Z
The distribution of mass in the halos of galaxies and galaxy clusters has been probed observationally, theoretically, and in numerical simulations, yet there is still confusion about which of several suggested parameterized models is the better representation, and whether these models are universal. We use the temperature and density profiles of the intracluster medium as measured by X-ray observations of 11 relaxed galaxy clusters to investigate mass models for the halo using a thorough Bayesian statistical analysis. We make careful comparisons between two- and three-parameter models, including the issue of a universal third parameter. We find that, of the two-parameter models, the Navarro-Frenk-White (NFW) is the best representation, but we also find moderate statistical evidence that a generalized three-parameter NFW model with a freely varying inner slope is preferred, despite penalizing against the extra degree of freedom. There is a strong indication that this inner slope needs to be determined for each cluster individually, i.e., some clusters have central cores and others have steep cusps. The mass-concentration relation of our sample is in reasonable agreement with predictions based on numerical simulations.
Schrull, Jeffrey Lee
1987-01-01T23:59:59.000Z
TWO-DIMENSIONAL SPECTRAL/STATISTICAL ANALYSIS OF MARINE MAGNETIC DATA: IMPLICATIONS FOR DEPTH-TO-MAGNETIC SOURCE. A Thesis by JEFFREY LEE SCHRULL. Submitted to the Graduate College of Texas A&M University in partial fulfillment... of the requirements for the degree of MASTER OF SCIENCE, May 1987. Major Subject: Geophysics. ... Approved as to style...
Time varying, multivariate volume data reduction
Ahrens, James P [Los Alamos National Laboratory; Fout, Nathaniel [UC DAVIS; Ma, Kwan - Liu [UC DAVIS
2010-01-01T23:59:59.000Z
Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. In this paper we present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs.
temporal compression is data dependent, although temporal compression has the advantage of offering smooth animations, while spatial compression can handle volumes of larger dimensions.
Symbolic Discriminant Analysis for Mining Gene Expression Patterns Jason Moore
Fernandez, Thomas
Symbolic Discriminant Analysis for Mining Gene Expression Patterns. Jason H. Moore, Joel S. Parker, Lance W. Hahn (Vanderbilt). ... Leukemia ... Linear discriminant analysis is a popular multivariate statistical approach for classification of observations into groups because the theory is well...
Wolfrum, E. J.; Sluiter, A. D.
2009-01-01T23:59:59.000Z
We have studied rapid calibration models to predict the composition of a variety of biomass feedstocks by correlating near-infrared (NIR) spectroscopic data to compositional data produced using traditional wet chemical analysis techniques. The rapid calibration models are developed using multivariate statistical analysis of the spectroscopic and wet chemical data. This work discusses the latest versions of the NIR calibration models for corn stover feedstock and dilute-acid pretreated corn stover. Measures of the calibration precision and uncertainty are presented. No statistically significant differences (p = 0.05) are seen between NIR calibration models built using different mathematical pretreatments. Finally, two common algorithms for building NIR calibration models are compared; no statistically significant differences (p = 0.05) are seen for the major constituents glucan, xylan, and lignin, but the algorithms did produce different predictions for total extractives. A single calibration model combining the corn stover feedstock and dilute-acid pretreated corn stover samples gave less satisfactory predictions than the separate models.
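As an illustration of this kind of spectra-to-composition multivariate calibration, the sketch below fits a principal component regression (a simpler stand-in for the PLS models discussed in the abstract) to synthetic "spectra"; the data shapes, noise level, and function names are assumptions for the example only.

```python
import numpy as np

def fit_pcr(X, y, k):
    """Principal component regression: project mean-centered spectra onto
    the top-k principal components, then do ordinary least squares."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k].T                       # (n_wavelengths, k) loadings
    T = Xc @ V                         # component scores
    b = np.linalg.lstsq(T, y - y_mean, rcond=None)[0]
    return x_mean, y_mean, V, b

def predict_pcr(model, X):
    """Predict composition for new spectra from a fitted PCR model."""
    x_mean, y_mean, V, b = model
    return (X - x_mean) @ V @ b + y_mean
```

Like PLS, this reduces many correlated wavelength channels to a few latent variables before regression; PLS differs in choosing components that also maximize covariance with the response.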
Maximum Likelihood Estimation of Mixture Densities for Binned and Truncated Multivariate
Smyth, Padhraic
Maximum Likelihood Estimation of Mixture Densities for Binned and Truncated Multivariate Data ... in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (1988...
Computing Large Sparse Multivariate Optimization Problems with an Application in Biophysics
Boppana, Rajendra V.
Computing Large Sparse Multivariate Optimization Problems with an Application in Biophysics. Abstract: We present a novel divide-and-conquer method for parallelizing a large-scale multivariate linear ... High-performance computational techniques for this final phase of the analysis are well-known and are not the focus of this paper...
Grunwald, Sabine
Comparison of multivariate methods for inferential modeling of soil carbon using visible/near-infrared diffuse reflectance spectroscopy. Keywords: visible/near-infrared spectroscopy, multivariate calibration, pre-processing transformations. In order to reduce costs and time in the analysis of soil properties, visible/near-infrared...
Paris-Sud XI, UniversitÃ© de
Derivation of statistical energy analysis from radiative exchanges. A. LE BOT, Laboratoire de ... This assumption is equivalent to the equirepartition of energy in the modal approach. This equivalence ... assumed to be uncorrelated, leading to the additivity of energy. Inside all subsystems, the energy density is the sum...
...-scan fusion, the statistical analysis step computes probability volumes of the fused data using a local ... 3D X-ray computed tomography (3DCT). 3DCT is a non-touching and non-destructive method, which allows a fast char... Figure: Photograph (a) and principle scheme (b) of an industrial 3D X-ray computed tomography system. General design...
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan, E-mail: druan@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)] [Department of Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)
2014-03-15T23:59:59.000Z
Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such a distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that potentially induces post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator of a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
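The mixture-Gaussian EM fit described above can be sketched in one dimension as follows; the initialization, iteration count, and simulated dose values are illustrative assumptions, not the study's data or implementation.

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """Minimal EM for a two-component 1D Gaussian mixture.

    Returns mixing weights, means, and standard deviations."""
    x = np.asarray(x, dtype=float)
    # crude initialization from the quartiles
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        pdf = (pi / (sigma * np.sqrt(2.0 * np.pi))
               * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: weighted updates of the parameters
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```

On a bimodal dose histogram, the fitted lower mean plays the role of the ~70% prescription-dose peak the study identifies; a well-conditioned implementation would also monitor the log-likelihood for convergence.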
Statistical Analysis of Baseline Load Models for Non-Residential Buildings
Coughlin, Katie; Piette, Mary Ann; Goldman, Charles; Kiliccote, Sila
2008-11-10T23:59:59.000Z
Policymakers are encouraging the development of standardized and consistent methods to quantify the electric load impacts of demand response programs. For load impacts, an essential part of the analysis is the estimation of the baseline load profile. In this paper, we present a statistical evaluation of the performance of several different models used to calculate baselines for commercial buildings participating in a demand response program in California. In our approach, we use the model to estimate baseline loads for a large set of proxy event days for which the actual load data are also available. Measures of the accuracy and bias of different models, the importance of weather effects, and the effect of applying morning adjustment factors (which use data from the day of the event to adjust the estimated baseline) are presented. Our results suggest that (1) the accuracy of baseline load models can be improved substantially by applying a morning adjustment, (2) the characterization of building loads by variability and weather sensitivity is a useful indicator of which types of baseline models will perform well, and (3) models that incorporate temperature either improve the accuracy of the model fit or do not change it.
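A minimal sketch of the kind of baseline model discussed, with a multiplicative morning adjustment, is given below. The averaging window, the adjustment hours, and the function name are illustrative assumptions; the actual baseline models evaluated in the paper differ in detail.

```python
import numpy as np

def baseline_with_morning_adjustment(prior_days, event_day, adj_hours=(8, 9, 10)):
    """Hourly baseline = mean load profile of prior non-event days, scaled by
    the ratio of actual to predicted load over pre-event morning hours."""
    prior_days = np.asarray(prior_days, dtype=float)   # shape (n_days, 24)
    event_day = np.asarray(event_day, dtype=float)     # shape (24,)
    base = prior_days.mean(axis=0)                     # unadjusted baseline
    idx = list(adj_hours)
    ratio = event_day[idx].mean() / base[idx].mean()   # morning adjustment
    return base * ratio
```

The adjustment captures day-of-event conditions (e.g., a hot morning raising loads), which is why the paper finds it substantially improves baseline accuracy.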
Bochanski, John J. [Astronomy and Astrophysics Department, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Hawley, Suzanne L. [Astronomy Department, University of Washington, Box 351580, Seattle, WA 98195 (United States); West, Andrew A., E-mail: jjb29@psu.edu [Department of Astronomy, Boston University, 725 Commonwealth Avenue, Boston, MA 02215 (United States)
2011-03-15T23:59:59.000Z
We present a statistical parallax analysis of low-mass dwarfs from the Sloan Digital Sky Survey. We calculate absolute r-band magnitudes (M{sub r} ) as a function of color and spectral type and investigate changes in M{sub r} with location in the Milky Way. We find that magnetically active M dwarfs are intrinsically brighter in M{sub r} than their inactive counterparts at the same color or spectral type. Metallicity, as traced by the proxy {zeta}, also affects M{sub r} , with metal-poor stars having fainter absolute magnitudes than higher metallicity M dwarfs at the same color or spectral type. Additionally, we measure the velocity ellipsoid and solar reflex motion for each subsample of M dwarfs. We find good agreement between our measured solar peculiar motion and previous results for similar populations, as well as some evidence for differing motions of early and late M-type populations in U and W velocities that cannot be attributed to asymmetric drift. The reflex solar motion and the velocity dispersions both show that younger populations, as traced by magnetic activity and location near the Galactic plane, have experienced less dynamical heating. We introduce a new parameter, the independent position altitude (IPA), to investigate populations as a function of vertical height from the Galactic plane. M dwarfs at all types exhibit an increase in velocity dispersion when analyzed in comparable IPA subgroups.
Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents
Wheatley, Spencer; Sornette, Didier
2015-01-01T23:59:59.000Z
We provide, and perform a risk-theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage in dollar losses. The annual rate of nuclear accidents with damage above 20 million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March 2011) is equal to 60 percent of the total damag...
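A Pareto tail index like the 0.55 reported can be estimated from exceedances over a threshold with the maximum-likelihood (Hill) estimator; the sketch below recovers an assumed index from simulated damage data, and the threshold and names are illustrative, not the paper's procedure.

```python
import numpy as np

def hill_tail_index(losses, xmin):
    """Maximum-likelihood (Hill) estimate of the Pareto tail index alpha
    for P(X > x) = (xmin / x)**alpha, using exceedances over xmin."""
    x = np.asarray(losses, dtype=float)
    tail = x[x > xmin]
    return len(tail) / np.log(tail / xmin).sum()
```

An index below 1 means the distribution has no finite mean, which is what makes the post-TMI damage tail described above so consequential for risk estimates.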
Li, Mohan
2012-10-19T23:59:59.000Z
of decorrelation in the RF data. In this thesis, a 3D elastography algorithm that estimates all three components of tissue displacement is implemented and tested statistically. In this research, displacement fields of mechanical models are simulated. RF signals...
Brown, Emery N.
Coherence analysis characterizes frequency-dependent covariance between signals, and is useful for multivariate oscillatory data often encountered in neuroscience. The global coherence provides a summary of coherent behavior ...
Weck, Peter J; Brown, Michael R; Wicks, Robert T
2014-01-01T23:59:59.000Z
The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD) and fully-developed turbulent magnetic fluctuations of the solar wind taken from the WIND spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge fluctuations. The CH ...
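The Bandt-Pompe construction used above reduces a time series to the frequencies of its ordinal patterns. A minimal sketch, assuming embedding order 3 and synthetic signals (white noise versus a monotone ramp) rather than the plasma data:

```python
import math
import random

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D sequence."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: argsort of the values inside the window
        pattern = tuple(sorted(range(order), key=lambda j: window[j]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    # Shannon entropy of the pattern distribution, normalized to [0, 1]
    h = sum((c / total) * math.log(total / c) for c in counts.values())
    return h / math.log(math.factorial(order))

random.seed(1)
noise = [random.random() for _ in range(10000)]  # white noise: patterns equiprobable
ramp = [0.001 * i for i in range(10000)]         # monotone ramp: a single pattern

print(permutation_entropy(noise))  # close to 1.0
print(permutation_entropy(ramp))   # → 0.0
```

High entropy with low complexity, as reported for the solar wind, corresponds to a pattern distribution close to uniform.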
Not Available
1986-10-01T23:59:59.000Z
Statistical analyses were performed on 4 years of fluoride emissions data from a primary aluminum reduction plant. These analyses were used to develop formulae and procedures for use by regulatory agencies in determining alternate sampling frequencies for secondary (roof monitor) emissions testing on a case-by-case basis. Monitoring procedures for ensuring compliance even with a reduced test frequency are also addressed.
Statistical Analysis of Microgravity Two-Phase Slug Flow via the Drift Flux Model
Larsen, Benjamin A
2014-05-01T23:59:59.000Z
The result was a statistically consistent microgravity slug flow database consisting of 220 data points from 8 different experiments and the associated values for the concentration parameter, Co, and drift velocity, u_(gj). A key component for this model...
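The drift flux model underlying this entry is the linear relation u_g = Co * j + u_gj, so the concentration parameter and drift velocity fall out of an ordinary least-squares fit. A minimal sketch on made-up velocities (illustrative numbers, not the 220-point database):

```python
def drift_flux_fit(j, ug):
    """Least-squares fit of the drift flux relation u_g = Co * j + u_gj:
    the slope is the concentration parameter Co and the intercept is
    the drift velocity u_gj."""
    n = len(j)
    mj, mu = sum(j) / n, sum(ug) / n
    c0 = (sum((x - mj) * (y - mu) for x, y in zip(j, ug))
          / sum((x - mj) ** 2 for x in j))
    return c0, mu - c0 * mj

# Hypothetical gas velocity vs. total volumetric flux data [m/s],
# generated from Co = 1.2 and u_gj = 0.1, so the fit should recover them.
j_data = [0.2, 0.5, 1.0, 1.5, 2.0]
ug_data = [0.34, 0.70, 1.30, 1.90, 2.50]
c0, ugj = drift_flux_fit(j_data, ug_data)
print(round(c0, 2), round(ugj, 2))  # → 1.2 0.1
```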
Czarnecki, Krzysztof
2012 Beaver Computing Challenge Results. (Overall per-task average-score statistics; table garbled in extraction.)
Garcia-Garcia, Adrian Luis, E-mail: agarciag@ipn.mx; Dominguez-Lopez, Ivan, E-mail: idominguezl@ipn.mx; Lopez-Jimenez, Luis, E-mail: llopez1002@ipn.mx; Barceinas-Sanchez, J.D. Oscar, E-mail: obarceinas@ipn.mx
2014-01-15T23:59:59.000Z
Quantification of nanometric precipitates in metallic alloys has been traditionally performed using transmission electron microscopy, which is nominally a low throughput technique. This work presents a comparative study of quantification of η′ and η precipitates in aluminum alloy AA7075-T651 using transmission electron microscopy (TEM) and non-contact atomic force microscopy (AFM). AFM quantification was compared with 2-D stereological results reported elsewhere. Also, a method was developed, using specialized software, to characterize nanometric size precipitates observed in dark-field TEM micrographs. Statistical analysis of the quantification results from both measurement techniques supports the use of AFM for precipitate characterization. Once the precipitate stoichiometry has been determined by appropriate analytical techniques like TEM, as is the case for η′ and η in AA7075-T651, the relative ease with which specimens are prepared for AFM analysis could be advantageous in product and process development, and quality control, where a large number of samples are expected for analysis on a regular basis. - Highlights: • Nanometric MgZn{sub 2} precipitates in AA7075-T651 were characterized using AFM and TEM. • Phase-contrast AFM was used to differentiate metal matrix from MgZn{sub 2} precipitates. • TEM and AFM micrographs were analyzed using commercially available software. • AFM image analysis and TEM 2-D stereology render statistically equivalent results.
Statistical analysis of aerosol species, trace gases, and meteorology in Chicago
O'Brien, Timothy E.
possible pollutant sources. Keywords: atmospheric aerosols, canonical correlation analysis, Chicago. Air pollution studies involve collection and analysis of atmospheric aerosols and concurrent meteorology; canonical correlation analysis and principal component analysis (PCA) were applied to atmospheric aerosol and trace gas concentrations
Statistical Analysis of Life Data with Masked Cause of Failure
Basu, Sanjib
, a detailed Failure Mode Effect Analysis (FMEA) can be carried out in a routine manner. In reliability, this has been pursued under the general heading of Failure Mode Effect Analysis (FMEA) when the exact causes
Causation-Based T2 Decomposition for Multivariate Process Monitoring and Diagnosis
Jin, Jionghua "Judy"
Multivariate SPC using the Hotelling T2 statistic is widely adopted for change detection. However, the T2 control chart alone is not capable of identifying the root causes of the change. Thus, decomposition of T2... Keywords: Bayesian network, causal model, SPC, T2 decomposition. Biography: Ms. Li is a research student in the Department
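The Hotelling T2 statistic at the center of this entry is the squared Mahalanobis distance of an observation from the in-control mean. A minimal sketch for the bivariate case, with an explicit 2x2 inverse and illustrative numbers (not from the paper):

```python
def hotelling_t2(x, mean, cov):
    """Hotelling T^2 = (x - mean)' S^{-1} (x - mean) for a 2-D observation."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # explicit 2x2 inverse
    y = [inv[0][0] * dx[0] + inv[0][1] * dx[1],
         inv[1][0] * dx[0] + inv[1][1] * dx[1]]
    return dx[0] * y[0] + dx[1] * y[1]

mean = [0.0, 0.0]
cov = [[1.0, 0.0], [0.0, 1.0]]              # identity covariance for simplicity
t2 = hotelling_t2([3.0, 4.0], mean, cov)
print(t2)  # → 25.0 (squared distance from the in-control mean)
```

A T2 control chart flags an observation when this value exceeds a control limit; the decomposition problem the abstract raises is deciding which of the variables drove the exceedance.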
Crow, Ben D
2006-01-01T23:59:59.000Z
of Globalization: Statistics. Weiss, L. (1997); Milanovic, B. (1999)... the focus of global statistics, particularly in relation to
Thioulouse, Jean
Analysis in SAR and Environmental Studies. Kluwer Academic Publishers. http://pbil.univ-lyon1.fr/R/articles
Dynamic Conditional Correlation - A Simple Class of Multivariate GARCH Models
Engle, Robert F
2000-01-01T23:59:59.000Z
"Multivariate Simultaneous GARCH," Econometric Theory 11; and Joseph Mezrich (1996), "GARCH for Groups," Risk, August. A SIMPLE CLASS OF MULTIVARIATE GARCH MODELS, BY ROBERT F. ENGLE
adaptable multivariate calibration: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Strong, Andrew W. OPTIMAL SOLUTIONS OF MULTIVARIATE COUPLING (Mathematics Websites). Summary: OPTIMAL SOLUTIONS OF MULTIVARIATE COUPLING...
Zeros in linear multivariable control systems
Ewing, Robert Fennell
1974-01-01T23:59:59.000Z
ZEROS IN LINEAR MULTIVARIABLE CONTROL SYSTEMS. A Thesis by ROBERT FENNELL EWING, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, August 1974. Major... Zeros in Linear Multivariable Control Systems (August 1974), Robert Fennell Ewing, B.S., Texas A&M University. Chairman of Advisory Committee: Dr. J. W. Howze. This thesis examines the problem of altering the transfer function matrix of a linear, time-invariant, multivariable system...
Guo, Genliang; George, S.A.; Lindsey, R.P.
1997-08-01T23:59:59.000Z
Thirty-six sets of surface lineaments and fractures mapped from satellite images and/or aerial photos from parts of the Mid-continent and Colorado Plateau regions were collected, digitized, and statistically analyzed in order to obtain the probability distribution functions of natural fractures for characterizing naturally fractured reservoirs. The orientations and lengths of the surface linear features were calculated using the digitized coordinates of the two end points of each individual linear feature. The spacing data of the surface linear features within an individual set were obtained using a new analytical sampling technique. Statistical analyses were then performed to find the best-fit probability distribution functions for the orientation, length, and spacing of each data set. Twenty-five hypothesized probability distribution functions were used to fit each data set. A chi-square goodness-of-fit test was used to rank the significance of each fit. The distribution providing the lowest chi-square goodness-of-fit value was considered the best-fit distribution. The orientations of surface linear features were best fitted by triangular, normal, or logistic distributions; the lengths were best fitted by Pearson VI, Pearson V, lognormal2, or extreme-value distributions; and the spacing data were best fitted by lognormal2, Pearson VI, or lognormal distributions. These probability functions can be used to stochastically characterize naturally fractured reservoirs.
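The ranking step described above (fit several candidate distributions, keep the one with the lowest chi-square value) can be sketched as follows. The synthetic "orientation" data, bin edges, and the two candidate distributions are illustrative assumptions, not the paper's twenty-five candidates:

```python
import math
import random

def chi_square(observed, expected):
    """Pearson chi-square statistic over matched bins."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

random.seed(2)
data = [random.gauss(90.0, 15.0) for _ in range(5000)]  # fake orientations, degrees
edges = [float(x) for x in range(30, 151, 20)]          # 6 bins spanning 30..150
inside = [x for x in data if edges[0] <= x < edges[-1]]
n = len(inside)
obs = [sum(edges[i] <= x < edges[i + 1] for x in inside)
       for i in range(len(edges) - 1)]

# Candidate 1: normal fitted by moments; candidate 2: uniform over the range
mu = sum(inside) / n
sd = (sum((x - mu) ** 2 for x in inside) / n) ** 0.5
exp_norm = [n * (normal_cdf(edges[i + 1], mu, sd) - normal_cdf(edges[i], mu, sd))
            for i in range(len(edges) - 1)]
exp_unif = [n / (len(edges) - 1)] * (len(edges) - 1)

scores = {"normal": chi_square(obs, exp_norm), "uniform": chi_square(obs, exp_unif)}
best = min(scores, key=scores.get)
print(best)  # → normal
```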
Paris-Sud XI, UniversitÃ© de
to the same EDR space is provided. Thus, the proposed multivariate SIR method can be used properly on each... Multivariate response regression analysis with a p-dimensional vector of regressors has been extensively studied... introduced sliced inverse regression (SIR), which is a well-known method to estimate the EDR space. The link
Reddy, T. A.; Claridge, D.; Wu, J.
analysis to identify these models. However, such models tend to suffer from physically unreasonable regression coefficients and instability due to the fact that the predictor variables (i.e., climatic parameters, building internal loads, etc...
Statistical static timing analysis considering the impact of power supply noise in VLSI circuits
Kim, Hyun Sung
2009-06-02T23:59:59.000Z
As semiconductor technology is scaled and voltage level is reduced, the impact of the variation in power supply has become very significant in predicting the realistic worst-case delays in integrated circuits. The analysis of power supply noise...
A novel statistical model for mandibular helical axis analysis
Reich, Brian J.
on a Cartesian coordinate system, the three angles (roll, pitch and yaw) are sequence-dependent and influenced... analysis system. These movements were compared with similar movements in the same group after treatment
Al-Nasir, Abdul Majid Hamza
1968-01-01T23:59:59.000Z
ORDER RELATIONS AND PRIOR DISTRIBUTIONS IN THE ESTIMATION OF MULTIVARIATE NORMAL PARAMETERS WITH PARTIAL DATA. A Thesis by ABDUL MAJID HAMZA AL-NASIR, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF SCIENCE, August 1968. Major Subject: Statistics. Approved...
Statistical Laboratory & Department of Statistics
Statistical Laboratory & Department of Statistics Annual Report, July 1, 2005 to December 31, 2006. Contents include the Statistical Computing Section, CSSM, and statistical methodology in the nutritional sciences. We were also very pleased to secure a permanent lecturer
Extracting bb Higgs Decay Signals using Multivariate Techniques
Smith, W Clarke; /George Washington U. /SLAC
2012-08-28T23:59:59.000Z
For low-mass Higgs boson production at ATLAS at {radical}s = 7 TeV, the hard subprocess gg {yields} h{sup 0} {yields} b{bar b} dominates but is in turn drowned out by background. We seek to exploit the intrinsic few-MeV mass width of the Higgs boson to observe it above the background in b{bar b}-dijet mass plots. The mass resolution of existing mass-reconstruction algorithms is insufficient for this purpose due to jet combinatorics; that is, the algorithms cannot identify every jet that results from b{bar b} Higgs decay. We combine these algorithms using the neural net (NN) and boosted regression tree (BDT) multivariate methods in an attempt to improve the mass resolution. Events involving gg {yields} h{sup 0} {yields} b{bar b} are generated using Monte Carlo methods with Pythia, and then the Toolkit for Multivariate Analysis (TMVA) is used to train and test NNs and BDTs. For a 120 GeV Standard Model Higgs boson, the m{sub h{sup 0}}-reconstruction width is reduced from 8.6 to 6.5 GeV. Most importantly, however, the methods used here allow for more advanced m{sub h{sup 0}}-reconstructions to be created in the future using multivariate methods.
Dr. Binh T. Pham; Grant L. Hawkes; Jeffrey J. Einerson
2012-10-01T23:59:59.000Z
As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in a model calibration effort to reduce the inherent uncertainty of simulation results. This paper is focused on fuel temperature predicted by the ABAQUS code's finite element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends of measured data observed from statistical analysis may be caused by either measuring instrument deterioration or physical mechanisms in capsules that may have shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5, revealed by the capsule metrology measurements in PIE, helps justify the reduction in TC readings instead of TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, which helps increase confidence and, in other words, reduce model uncertainties in the thermal simulation results of the AGR-1 test.
Statistical Analysis of Algorithms: A Case Study of Market-Clearing Mechanisms in the
Srinivasan, Aravind
they were visiting Los Alamos National Laboratory. National Infrastructure Simulation and Analysis Center and CCS3, P.O. Box 1663, MS M997, Los Alamos National Laboratory, Los Alamos NM 87545. Email: barrett... University of Maryland, College Park, MD 20742. Parts of this work were done while at Lucent
ibr: Iterative bias reduction multivariate smoothing
Hengartner, Nicholas W [Los Alamos National Laboratory; Cornillon, Pierre-andre [AGRO-SUP, FRANCE; Matzner - Lober, Eric [RENNES 2, FRANCE
2009-01-01T23:59:59.000Z
Regression is a fundamental data analysis tool for relating a univariate response variable Y to a multivariate predictor X {element_of} R{sup d} from the observations (X{sub i}, Y{sub i}), i = 1,...,n. Traditional nonparametric regression uses the assumption that the regression function varies smoothly in the independent variable x to locally estimate the conditional expectation m(x) = E[Y|X = x]. The resulting vector of predicted values {cflx Y}{sub i} at the observed covariates X{sub i} is called a regression smoother, or simply a smoother, because the predicted values {cflx Y}{sub i} are less variable than the original observations Y{sub i}. Linear smoothers are linear in the response variable Y and are operationally written as {cflx m} = S{sub {lambda}}Y, where S{sub {lambda}} is an n x n smoothing matrix. The smoothing matrix S{sub {lambda}} typically depends on a tuning parameter, which we denote by {lambda}, that governs the tradeoff between the smoothness of the estimate and the goodness-of-fit of the smoother to the data by controlling the effective size of the local neighborhood over which the responses are averaged. We parameterize the smoothing matrix such that large values of {lambda} are associated with smoothers that average over larger neighborhoods and produce very smooth curves, while small {lambda} are associated with smoothers that average over smaller neighborhoods to produce a more wiggly curve that tends to interpolate the data. The parameter {lambda} is the bandwidth for a kernel smoother, the span size for a running-mean smoother or bin smoother, and the penalty factor {lambda} for a spline smoother.
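The smoother notation in this abstract maps directly onto code: a running-mean smoother is a linear map of the responses through a smoothing matrix, with the window half-width playing the role of the tuning parameter lambda. A toy sketch with made-up data:

```python
def running_mean_smoother_matrix(n, lam):
    """S_lambda for a running-mean smoother: row i averages the responses
    whose index lies within lam of i (larger lam => smoother fit)."""
    S = []
    for i in range(n):
        width = sum(1 for j in range(n) if abs(i - j) <= lam)
        S.append([1.0 / width if abs(i - j) <= lam else 0.0 for j in range(n)])
    return S

def smooth(S, y):
    """Apply the linear smoother: m_hat = S_lambda * y."""
    return [sum(s * yj for s, yj in zip(row, y)) for row in S]

y = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0]           # toy response vector
S1 = running_mean_smoother_matrix(len(y), lam=1)
yhat = smooth(S1, y)
print([round(v, 2) for v in yhat])            # → [2.0, 2.0, 3.33, 3.67, 5.0, 5.0]

# Very large bandwidth: every fitted value collapses to the grand mean 3.5
Sbig = running_mean_smoother_matrix(len(y), lam=len(y))
flat = smooth(Sbig, y)
```

This makes the bias-variance tradeoff concrete: lam=1 tracks the data, while lam=n produces the flattest possible (constant) fit.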
Jackson, Philip JB
Statistical identification of critical, dependent and redundant articulators. Veena D Singampalli. Identification of articulatory roles; evaluation by exhaustive search and trajectory generation. Presented in Belgium.
Jackson, Philip JB
Statistical identification of critical, dependent and redundant articulators. Philip Jackson & Veena Singampalli. Acoustics'08, Paris.
32. Statistics
Masci, Frank
Revised September 2007 by G. Cowan (RHUL). This chapter gives an overview of statistical methods used in High Energy Physics. In statistics, we are interested in using data to assess a theory's validity or to determine the values of its parameters. There are two main approaches to statistical
Experimental Statistics
NBS Handbook 91: Experimental Statistics [1] was first published in 1963 as a series of five Army Ordnance Pamphlets, ORDP 20-110 through 20-114. The publication was prepared in the Statistical... 1. Basic Statistical Concepts and Analysis and Interpretation of Measurement Data; 2. Standard Techniques
Inoculating Multivariate Schemes Against Differential Attacks
Adding as few as 10 Plus polynomials to the Perturbed Matsumoto-Imai (PMI) cryptosystem when g = 1... We call the resulting scheme the Perturbed Matsumoto-Imai-Plus (PMI+) cryptosystem. Keywords: multivariate, public key... The scheme, called the perturbed Matsumoto-Imai cryptosystem (PMI), is slower, as one needs to go through a search
A diagnostic procedure for multivariate quality control
Keserla, Adhinarayan A.
1993-01-01T23:59:59.000Z
The proposed diagnostic procedure is triggered by off-target signals from the multivariate control chart. The performance of the procedure is investigated in conjunction with two control charts, the X2 chart and the MC1 chart. Two performance measures...
Multivariate Optical Computation for Predictive Spectroscopy
Myrick, Michael Lenn
Multivariate Optical Computation for Predictive Spectroscopy Matthew P. Nelson, Jeffrey F. Aust, J Research Council of Canada, Ottawa, Ontario, Canada K1A 0R6 A novel optical approach to predicting chemical into the structure of a set of paired optical filters. Light passing through the paired filters produces an analog
N. Panja; A. K. Chattopadhyay
2014-12-05T23:59:59.000Z
We report results of an experimental study, complemented by detailed statistical analysis of the experimental data, on the development of a more effective control method of drug delivery using a pH-sensitive acrylic polymer. New copolymers based on acrylic acid and fatty acid were constructed from dodecyl castor oil, and a tercopolymer based on methyl methacrylate, acrylic acid and acrylamide was prepared using this new approach. Water swelling characteristics of the fatty acid-acrylic acid copolymer and tercopolymer in acid and alkali solutions, respectively, have been studied by a step-change method. The antibiotic drug cephalosporin and paracetamol have also been incorporated into the polymer blend through dissolution, with the release of the antibiotic drug being evaluated in bacterial stain media and buffer solution. Our results show that the rate of release of paracetamol is affected by the pH factor and also by the nature of the polymer blend. Our experimental data have later been statistically analyzed to quantify the precise dependence of polymer decay rates on the pH of the relevant polymer solvents. The time evolution of the polymer decay rates indicates a marked transition from a linear to a strictly non-linear regime depending on whether the chosen sample is a general copolymer (linear) or a tercopolymer (non-linear). Non-linear data extrapolation techniques have been used to make probabilistic predictions about the variation in weight percentages of retained polymers at all future times, thereby quantifying the degree of efficacy of the new method of drug delivery.
Comnes, G.A.; Belden, T.N.; Kahn, E.P.
1995-02-01T23:59:59.000Z
The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy
Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.
2013-03-01T23:59:59.000Z
NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.
Détection de Fautes par Réseaux Bayésiens dans les Procédés Multivariés (Fault Detection by Bayesian Networks in Multivariate Processes)
Paris-Sud XI, UniversitÃ© de
between the usual detection methods that are the multivariate control charts (Hotelling's T2, MEWMA)... Keywords: SPC, multivariate, detection, Bayesian network, T2, MEWMA, discriminant analysis
Asymptotic Theory for Multivariate GARCH Processes
Comte, Fabienne
Asymptotic Theory for Multivariate GARCH Processes. F. Comte and O. Lieberman. Revised, July 2001. Abstract: We provide in this paper asymptotic theory for the multivariate GARCH(p, q) process. Strong... and ergodic solution to the multivariate GARCH(p, q) process. We prove asymptotic normality of the quasi-
Situvis: visualising multivariate context information to evaluate situation specifications
Dobson, Simon
Situvis: visualising multivariate context information to evaluate situation specifications. Adrian K... highly multivariate, and constantly being updated as new readings are recorded. Situations have been... The visualisation of large and complex multivariate data sets, such as those that context-aware system developers
Accelerated Articles Design and Testing of a Multivariate Optical
Myrick, Michael Lenn
Accelerated Articles: Design and Testing of a Multivariate Optical Element: The First Demonstration of Multivariate Optical Computing for Predictive Spectroscopy. O. Soyemi, D. Eastwood, L. Zhang, H... Street, Suite 102, Lincoln, Nebraska 68508. A demonstration of multivariate optical computing is presented
Statistical Convergence and Convergence in Statistics
Mark Burgin; Oktay Duman
2006-12-07T23:59:59.000Z
Statistical convergence was introduced in connection with problems of series summation. The main idea of the statistical convergence of a sequence l is that the majority of elements of l converge and we do not care what is going on with the other elements. We show (Section 2) that, when mathematically formalized, the concept of statistical convergence is directly connected to convergence of such statistical characteristics as the mean and standard deviation. At the same time, it is known that sequences that come from real-life sources, such as measurement and computation, do not, in the general case, allow one to test whether they converge or statistically converge in the strict mathematical sense. To overcome limitations induced by vagueness and uncertainty of real-life data, neoclassical analysis has been developed. It extends the scope and results of classical mathematical analysis by applying fuzzy logic to conventional mathematical objects, such as functions, sequences, and series. The goal of this work is the further development of neoclassical analysis. This allows us to reflect and model vagueness and uncertainty of our knowledge, which results from imprecision of measurement and inaccuracy of computation. In the context of the theory of fuzzy limits, we develop the structure of statistical fuzzy convergence and study its properties.
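The "majority of elements converge" idea can be checked numerically: statistical convergence to a limit means the density of indices where the sequence stays away from the limit tends to zero. A small sketch using the classic example of a 0-1 sequence supported on the perfect squares:

```python
import math

def deviation_density(seq, limit, eps, n):
    """Fraction of the first n indices where |seq(k) - limit| >= eps.
    Statistical convergence to `limit` means this density tends to 0."""
    bad = sum(1 for k in range(1, n + 1) if abs(seq(k) - limit) >= eps)
    return bad / n

# A sequence equal to 1 at perfect squares and 0 elsewhere: it does not
# converge, but it statistically converges to 0, since squares have density 0.
def seq(k):
    return 1.0 if math.isqrt(k) ** 2 == k else 0.0

for n in (100, 10000, 1000000):
    print(deviation_density(seq, 0.0, 0.5, n))  # → 0.1, 0.01, 0.001
```

The density of exceptional indices shrinks like 1/sqrt(n), so the sequence statistically converges to 0 even though it never settles.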
Independent Statistics & Analysis
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Hitchcock, Adam P.
Mathematics & Statistics Coop Program. Students from the Mathematics & Statistics Coop Program have worked in experimental design and data analysis, medical imaging, mathematical finance and statistical modeling. Coop work-term duties: performed data mapping and analysis activities; derived
Early Classification of Multivariate Time Series Using a Hybrid HMM/SVM model
Obradovic, Zoran
Early Classification of Multivariate Time Series Using a Hybrid HMM/SVM Model. Mohamed F. Ghalwash... The ability to use a shorter time interval for classification is often more favorable than having a slightly more... compared with other models that use full time series both in training and testing. Analysis of biomedical data has
Dynamic Graphics in a GIS: Exploring and Analyzing Multivariate Spatial Data Using Linked
Symanzik, JÃ¼rgen
Dynamic Graphics in a GIS: Exploring and Analyzing Multivariate Spatial Data Using Linked... of the data set. Two examples of data that are inherently spatial can be seen in the next section: one... information with the ground-based sample. Spatial data analysis is largely concerned with looking for trends
Term statistics and Zipf's law (text statistics)
Lu, Jianguo
Term statistics Zipf's law text statistics October 20, 2014 text statistics 1 / 19 #12;Term statistics Zipf's law Overview 1 Term statistics 2 Zipf's law text statistics 2 / 19 #12;Term statistics Zipf's law Outline 1 Term statistics 2 Zipf's law text statistics 3 / 19 #12;Term statistics Zipf's law Model
Essays on Multivariate Modeling in Financial Econometrics
Yoldas, Emre
2008-01-01T23:59:59.000Z
5.1 GARCH Testing in Multivariate GARCH Models... t- and J-Statistics for a GARCH(1,1) Model of NYSE Returns
BS in STATISTICS: Biostatistics Emphasis (695233) MAP Sheet Department of Statistics
Olsen Jr., Dan R.
Stat 201 Statistics for Engineers & Scientists; Stat 301 Statistics and Probability for Secondary Education; Analysis of Variance; Stat 240 Discrete Probability; Stat 290 Communication of Statistical Results; Stat 330... BS in Statistics: Biostatistics Emphasis (695233) MAP Sheet, Department of Statistics. For students
A foundation for reliable spatial proteomics data analysis
Gatto, Laurent; Breckels, Lisa M.; Burger, Thomas; Nightingale, Daniel J.H.; Groen, Arnoud J.; Campbell, Callum; Mulvey, Claire M.; Christoforou, Andy; Ferro, Myriam; Lilley, Kathryn S.
2014-05-20T23:59:59.000Z
to the same degree as using inappropriate training examples. An important factor to consider in one's choice of training examples, i.e. organelle markers, is how well they represent the multivariate data space, i.e. the distribution of proteins over which... and insightful biological interpretation, and no consistent and robust solutions have been offered to the community so far. Here, we introduce the requirements for rigorous spatial proteomics data analysis as well as the statistical machine learning methodologies...
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11T23:59:59.000Z
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
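The PCA step named in this method can be sketched with a bare-bones power iteration for the leading component. The two-channel data matrix below is a made-up stand-in for real GC/MS intensities, and the routine is a generic PCA sketch, not the patented pipeline:

```python
def first_principal_component(data, iters=200):
    """Leading PCA eigenvector of the covariance matrix via power iteration."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # Covariance matrix C = X'X / (n - 1)
    C = [[sum(X[k][i] * X[k][j] for k in range(n)) / (n - 1)
          for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical 2-channel intensities, perfectly correlated along (1, 2)/sqrt(5)
data = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [4.0, 8.0]]
v = first_principal_component(data)
print([round(x, 3) for x in v])  # → [0.447, 0.894]
```

With all the variance on one line, the first component captures it entirely; real GC/MS data would need several components plus the rotation and MCR-ALS refinement the abstract describes.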
Multivariable controller increased MTBE complex capacity
Robertson, D.; Peterson, T.J.; O'Connor, D. [DMC Corp., Houston, TX (United States)]; Payne, D.; Adams, V. [Valero Refining Co., Corpus Christi, TX (United States)]
1997-03-01T23:59:59.000Z
Capacity increased by more than 4.6% when one dynamic matrix multivariable controller began operating in Valero Refining Company's MTBE production complex in Corpus Christi, Texas. This was on a plant that was already running well above design capacity due to previously made process changes. A single controller was developed to cover an isobutane dehydrogenation (ID) unit and an MTBE reaction and fractionation plant with the intermediate isobutylene surge drum. The overall benefit is realized by a comprehensive constrained multivariable predictive controller that properly handles all sets of limits experienced by the complex, whether limited by the front-end ID or back-end MTBE units. The controller has 20 manipulated, 6 disturbance and 44 controlled variables, and covers widely varying dynamics with settling times ranging from twenty minutes to six hours. The controller executes each minute with a six hour time horizon. A unique achievement is intelligent surge drum level handling by the controller for higher average daily complex capacity as a whole. The ID unit often operates at simultaneous limits on reactor effluent compressor capacity, cold box temperature and hydrogen/hydrocarbon ratio, and the MTBE unit at impurity in butene column overhead as well as impurity in MTBE product. The paper discusses ether production, isobutane dehydrogenation, maximizing production, controller design, and controller performance.
Rank test for multivariate two sample data using projection pursuit.
Gunathilaka, Unawatuna Gamage
2007-01-01T23:59:59.000Z
Construction of an asymptotically distribution-free test for the hypothesis that two multivariate random samples are identically distributed has been a topic among many statisticians… (more)
Michele Arzano; Dario Benedetti
2008-09-04T23:59:59.000Z
Non-commutative quantum field theories and their global quantum group symmetries provide an intriguing attempt to go beyond the realm of standard local quantum field theory. A common feature of these models is that the quantum group symmetry of their Hilbert spaces induces additional structure in the multiparticle states which reflects a non-trivial momentum-dependent statistics. We investigate the properties of this "rainbow statistics" in the particular context of $\\kappa$-quantum fields and discuss the analogies/differences with models with twisted statistics.
Middleton, D.
1996-05-02T23:59:59.000Z
The purpose of this report is to motivate and outline a program of data analysis, for data obtained from radar returns from ocean surfaces perturbed by internal waves and wind-wave interactions. The ultimate aims of this analysis are to provide the appropriate statistics of the signals returned from these ocean surfaces for: (1) use in implementing and evaluating optimum and near-optimum signal processing procedures for detecting and evaluating (i.e., measuring) these internal wave effects and, (2) to provide quantitative physical insight into both the surface scatter and subsurface mechanisms which determine the received radar signals. Here the focus is initially on the needed statistics of the radar returns. These are primarily: (i) the (instantaneous) amplitude and envelope probability densities (pdf's) and distributions (PDFs) of the returns and, (ii) analogous statistics for the intensities (associated with the pixel data). Also required are: (iii) space-time covariance data of the returns, for further improvement of detection capabilities. Preliminary evidence and earlier experiments suggest that these data [(i), (ii)] are nongaussian and strongly so at times. This in turn, if not properly taken into account, can greatly degrade signal detection in the usual weak-signal regimes [1],[2].
Engineering Statistics
Kovintavewat, Piya
From "Engineering Statistics", Top & Wiley, Prapaisri & Pongchanun. Slide excerpts covering the basic vocabulary: a sample, obtained by sampling from a population, and a statistic, which estimates a population parameter.
Leendert A. Klerk A; Er Broersen B; Ian W. Fletcher C
2006-01-01T23:59:59.000Z
The large size of the hyperspectral datasets that are produced with modern mass spectrometric imaging techniques makes it difficult to analyze the results. Unsupervised statistical techniques are needed to extract relevant information from these datasets and reduce the data into a surveyable overview. Multivariate statistics are commonly used for this purpose. Computational power and computer memory limit the resolution at which the datasets can be analyzed with these techniques. We introduce the use of a data format capable of efficiently storing sparse datasets for multivariate analysis. This format is more memory-efficient and therefore it increases the possible resolution together with a decrease of computation time. Three multivariate techniques are compared for both sparse-type data and non-sparse data acquired in two different imaging ToF-SIMS experiments and one LDI-ToF imaging experiment. There is no significant qualitative difference in the use of different data formats for the same multivariate algorithms. All evaluated multivariate techniques could be applied on both SIMS and the LDI imaging datasets. Principal component analysis is shown to be the fastest choice; however a small increase of computation time using a VARIMAX optimization increases the decomposition quality significantly. PARAFAC analysis is shown to be very effective in separating different chemical components but the calculations take a significant amount of time, limiting its use as a routine technique. An effective visualization of the results of the multivariate analysis is as important for the analyst as the computational issues. For this reason, a new technique for visualization is presented, combining both spectral loadings and spatial distributions.
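The storage argument can be sketched in a few lines (the matrix sizes and 1% density are hypothetical, and this uses SciPy's CSR format rather than the authors' own data format):

```python
import numpy as np
from scipy import sparse
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(1)

# Mass-spectral imaging data are mostly zeros: each pixel's spectrum has
# counts in only a few channels. Simulate a 1000-pixel x 2000-channel map
# with roughly 1% non-zero entries.
dense = rng.random((1000, 2000)) * (rng.random((1000, 2000)) < 0.01)
csr = sparse.csr_matrix(dense)

# Sparse storage keeps only the non-zeros plus their indices.
dense_bytes = dense.nbytes
sparse_bytes = csr.data.nbytes + csr.indices.nbytes + csr.indptr.nbytes
print(sparse_bytes < dense_bytes / 10)   # True: well over 10x smaller here

# A multivariate decomposition can run directly on the sparse matrix.
svd = TruncatedSVD(n_components=5, random_state=0)
scores = svd.fit_transform(csr)
print(scores.shape)                      # (1000, 5)
```

The memory saving is what lets the analysis run at higher spatial resolution for the same hardware, which is the paper's central claim.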
FAULT DETECTION IN A MULTIVARIATE PROCESS WITH A BAYESIAN NETWORK
Boyer, Edmond
with simulations in order to analyze and compare its performance to other multivariate charts, Hotelling's T² and MEWMA. Keywords: SPC, multivariate, detection, Bayesian network, T², MEWMA. 1 Introduction. Control charts have been widely used in industry. The aim is to monitor the centering and the scattering
Robust Controller Design and PID Tuning for Multivariable Processes
Marquez, Horacio J.
Robust Controller Design and PID Tuning for Multivariable Processes. Wen Tan … approach, which gives a sub-optimal solution. A PID approximation method is then proposed to reduce a high-order controller, so it serves as a PID tuning method for multivariable processes. Examples show that the method is easy to use
A multivariate quadrature based moment method for supersonic combustion modeling
Raman, Venkat
A multivariate quadrature based moment method for supersonic combustion modeling. Pratik Donde … of thermochemical variables can be used for accurately computing the combustion source term. The direct quadrature method of moments (DQMOM) is well suited for multivariate problems like combustion. Numerical
Chang, Wen-Kuei; Hong, Tianzhen
2013-01-01T23:59:59.000Z
Occupancy profile is one of the driving factors behind discrepancies between the measured and simulated energy consumption of buildings. The frequencies of occupants leaving their offices and the corresponding durations of absences have significant impact on energy use and the operational controls of buildings. This study used statistical methods to analyze the occupancy status, based on measured lighting-switch data in five-minute intervals, for a total of 200 open-plan (cubicle) offices. Five typical occupancy patterns were identified based on the average daily 24-hour profiles of the presence of occupants in their cubicles. These statistical patterns were represented by a one-square curve, a one-valley curve, a two-valley curve, a variable curve, and a flat curve. The key parameters that define the occupancy model are the average occupancy profile together with probability distributions of absence duration, and the number of times an occupant is absent from the cubicle. The statistical results also reveal that the number of absence occurrences decreases as total daily presence hours decrease, and the duration of absence from the cubicle decreases as the frequency of absence increases. The developed occupancy model captures the stochastic nature of occupants moving in and out of cubicles, and can be used to generate a more realistic occupancy schedule. This is crucial for improving the evaluation of the energy saving potential of occupancy based technologies and controls using building simulations. Finally, to demonstrate the use of the occupancy model, weekday occupant schedules were generated and discussed.
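The core idea of the occupancy model — drawing a stochastic presence schedule from an average daily profile — can be sketched as follows (the profile values are hypothetical, not the paper's fitted patterns, and the absence-duration and absence-count distributions are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical average weekday presence profile: probability of being in
# the cubicle for each hour 0-23, shaped like a "one-valley" curve with a
# lunch dip.
profile = np.zeros(24)
profile[8:12] = 0.9    # morning
profile[12:13] = 0.5   # lunch valley
profile[13:18] = 0.9   # afternoon

# Draw one stochastic occupancy schedule: each hour is occupied with the
# profile's probability. Repeating this yields varied but statistically
# realistic schedules for building simulation.
schedule = rng.random(24) < profile

print(schedule.dtype)        # bool
print(schedule[:8].any())    # False: nobody present before 8am here
```

A full implementation of the paper's model would also sample how many absences occur and how long each lasts, rather than treating hours independently.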
Residential solar home resale analysis
Noll, S.A.
1980-01-01T23:59:59.000Z
One of the determinants of the market acceptance of solar technologies in the residential housing sector is the value placed upon the solar property at the time of resale. The resale factor is shown to be an important economic parameter when net benefits of the solar design are considered over a typical ownership cycle rather than the life cycle of the system. Although a study of solar resale in Davis, CA, indicates that those particular homes have been appreciating in value faster than nonsolar market comparables, no study has been made that would confirm this conclusion for markets in other geographical locations with supporting tests of statistical significance. The data to undertake such an analysis are available through numerous local sources; however, case-by-case data collection is prohibitively expensive. A recommended alternative approach is to make use of real estate market data firms who compile large databases and provide multivariate statistical analysis packages.
Jan de Leeuw
2011-01-01T23:59:59.000Z
Statistics and the Sciences. Jan de Leeuw, UCLA Statistics. … with Real Data? Annals of Statistics, 5:1055-1098, 1977. The Foundations of Statistics: Are There Any? Synthese.
1979 DOE statistical symposium
Gardiner, D.A.; Truett T. (comps. and eds.)
1980-09-01T23:59:59.000Z
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.
The University of Chicago Department of Statistics
Stephens, Matthew
…:00 PM, 133 Eckhart Hall, 5734 S. University Avenue. ABSTRACT: We consider the well-studied problem of … Euclidean space, or, more generally, an abstract manifold, with or without boundary. This problem arises in testing for the presence of signal in the brain. It also arises naturally in multivariate analysis.
Arrowood, L.F.; Tonn, B.E.
1992-02-01T23:59:59.000Z
This report presents recommendations relative to the use of expert systems and machine learning techniques by the Bureau of Labor Statistics (BLS) to substantially automate product substitution decisions associated with the Consumer Price Index (CPI). Thirteen commercially available, PC-based expert system shells have received in-depth evaluations. Various machine learning techniques were also reviewed. Two recommendations are given: (1) BLS should use the expert system shell LEVEL5 OBJECT and establish a software development methodology for expert systems; and (2) BLS should undertake a small study to evaluate the potential of machine learning techniques to create and maintain the approximately 350 ELI-specific knowledge bases to be used in CPI product substitution review.
acid sequence analysis: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
…sequences. The starting … Radicioni, Daniele. A Method for Sequence Analysis Using Multivariate Analysis (CiteSeer). Summary: We developed a computational sequence analysis…
Multi-variable optimization of pressurized oxy-coal combustion
Zebian, Hussam
2011-01-01T23:59:59.000Z
Simultaneous multi-variable gradient-based optimization with multi-start is performed on a 300 MWe wet-recycling pressurized oxy-coal combustion process with carbon capture and sequestration. The model accounts for realistic ...
Stochastic Estimation of Multi-Variable Human Ankle Mechanical Impedance
Rastgaar Aagaah, Mohammad
This article presents preliminary stochastic estimates of the multi-variable human ankle mechanical impedance. We employed Anklebot, a rehabilitation robot for the ankle, to provide torque perturbations. Time histories of ...
Multivariate Calibration Models for Sorghum Composition using...
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
M.A.; Agblevor, F.; Collins, M.; Johnson, D.K. "Compositional Analysis of Biomass Feedstocks by Near Infrared Reflectance Spectroscopy." Biomass and Bioenergy (11:5), 1996;...
J. Mark Heinzle; Claes Uggla
2012-12-21T23:59:59.000Z
In this paper we explore stochastic and statistical properties of so-called recurring spike induced Kasner sequences. Such sequences arise in recurring spike formation, which is needed together with the more familiar BKL scenario to yield a complete description of generic spacelike singularities. In particular we derive a probability distribution for recurring spike induced Kasner sequences, complementing similar available BKL results, which makes comparisons possible. As examples of applications, we derive results for so-called large and small curvature phases and the Hubble-normalized Weyl scalar.
NSTec Environmental Restoration
2009-04-20T23:59:59.000Z
A statistical analysis and geologic evaluation of recently acquired laboratory-derived physical property data are being performed to better understand and more precisely correlate physical properties with specific geologic parameters associated with non-zeolitized tuffs at the Nevada Test Site. Physical property data include wet and dry bulk density, grain density (i.e., specific gravity), total porosity, and effective porosity. Geologic parameters utilized include degree of welding, lithology, stratigraphy, geographic area, and matrix mineralogy (i.e., vitric versus devitrified). Initial results indicate a very good correlation between physical properties and geologic parameters such as degree of welding, lithology, and matrix mineralogy. However, physical properties appear to be independent of stratigraphy and geographic area, suggesting that the data are transferable with regard to these two geologic parameters. Statistical analyses also indicate that the assumed grain density of 2.65 grams per cubic centimeter used to calculate porosity in some samples is too high. This results in corresponding calculated porosity values approximately 5 percent too high (e.g., 45 percent versus 40 percent), which can be significant in the lower porosity rocks. Similar analyses and evaluations of zeolitic tuffs and carbonate rock physical properties data are ongoing as well as comparisons to geophysical log values.
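The porosity correction described above follows from the standard relation porosity = 1 - (dry bulk density / grain density); a quick check with hypothetical sample numbers (not values from the report) shows how an overstated grain density inflates porosity by a few percentage points:

```python
# Total porosity from dry bulk density and grain density, both in g/cc:
#   porosity = 1 - (dry bulk density / grain density)

def porosity(dry_bulk_density, grain_density):
    """Fractional total porosity from densities in consistent units."""
    return 1.0 - dry_bulk_density / grain_density

bulk = 1.46  # g/cc, hypothetical tuff sample
assumed = porosity(bulk, 2.65)   # with the assumed quartz-like grain density
measured = porosity(bulk, 2.45)  # with a lower, measured grain density

# The assumed grain density overstates porosity by a few percentage points
print(round((assumed - measured) * 100, 1))   # 4.5
```

This matches the report's observation that the assumed 2.65 g/cc can shift calculated porosity by roughly 5 percent (e.g., 45 percent versus 40 percent).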
Statistics applied to safeguards
Picard, R.R.
1993-05-01T23:59:59.000Z
Statistical methods are central to safeguards work. Measurements forming the basis of much materials accountancy are not perfect - "perfect" in the sense of being error free. Other sessions in this course address the destructive and nondestructive measurement of nuclear material, together with the inherent limitations in those measurements. The bottom line is that measurement errors are a fact of life and, since we can't eliminate them, we have to find a rational way to deal with them. Which leads to the world of statistics. Beyond dealing with measurement errors, another area of statistical application involves the sampling of items for verification. Inspectors from the IAEA and domestic regulatory agencies periodically visit operating facilities and make measurements of selected items. By comparing their own measured values to those declared by the facilities, increased confidence is obtained. If verification measurements were not expensive, time consuming, and disruptive to operations, perhaps verification of 100% of the inventories would be desirable. In reality, many constraints lead to inspection of only a portion of those inventories. Drawing inferences about a larger "population" of declared items in a facility based on verification information obtained from a sample of those items is a statistical problem. There are few texts on statistics in safeguards. The lengthy exposition "IAEA Safeguards: Statistical Concepts and Techniques" and the US NRC book edited by Bowen and Bennet are two good sources of general information. In the next section, the subject of measurement quality is addressed. The third section covers the evaluation of MUFs, and discusses the related subjects of error propagation and sequential analysis. The final section covers verification, inspection sample size calculations, and the D statistic. The text is written at an elementary level, with references to the safeguards literature for more detailed treatment.
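The error-propagation point can be sketched numerically (all figures below are illustrative, not real safeguards data): with independent measurement errors, the variance of MUF is the sum of the component variances.

```python
import numpy as np

# MUF (material unaccounted for) = beginning inventory + receipts
# - shipments - ending inventory. With independent measurement errors,
# the variance of MUF is the sum of the component variances.
values = np.array([102.0, 50.0, -49.5, -101.8])   # kg, signed contributions
sigmas = np.array([0.5, 0.3, 0.3, 0.5])           # kg, 1-sigma measurement errors

muf = values.sum()
sigma_muf = np.sqrt(np.sum(sigmas**2))

# A MUF within roughly 2 sigma of zero is consistent with measurement
# error alone; larger values would warrant investigation.
print(abs(muf) < 2 * sigma_muf)   # True
```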
A Multivariate Analysis of Freeway Speed and Headway Data
Zou, Yajie
2013-11-11T23:59:59.000Z
A key process is the generation of entry vehicle speeds and vehicle arrival times. It is helpful to find desirable mathematical distributions to model individual speed and headway values, because the individual vehicle speed and arrival time...
Parallel auto-correlative statistics with VTK.
Pebay, Philippe Pierre [Kitware, France; Bennett, Janine Camille
2013-08-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.
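An autocorrelative statistic at its simplest is the lag-k sample autocorrelation; a small sketch in plain NumPy (not the VTK engines described in the report):

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample autocorrelation of a 1-D series at the given positive lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(7)

# An AR(1)-like series is strongly correlated at lag 1 ...
ar = np.empty(5000)
ar[0] = 0.0
for t in range(1, 5000):
    ar[t] = 0.9 * ar[t - 1] + rng.normal()
print(autocorrelation(ar, 1) > 0.8)                           # True

# ... while white noise is not.
print(abs(autocorrelation(rng.normal(size=5000), 1)) < 0.1)   # True
```

The parallel engines in the report compute the same quantity, but distribute the sums across processes for scalability.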
Statistical Digest No. 70: Fishery Statistics of the United States
Fishery Statistics of the United States, 1976. Statistical Digest No. 70. Washington: National Marine Fisheries Service. These statistics include data on the volume and value of landed catches, employment, and quantity of gear operated.
McGuire, Jimmy A.
Trimorphodon biscutatus inhabits arid regions from the deserts of the southwestern United States southward.
Mueller, Amy V.
In this paper a second-order method for blind source separation of noisy instantaneous linear mixtures is presented for the case where the signal order k is unknown. Its performance advantages are illustrated by simulations ...
Thresholding Multivariate Regression and Generalized Principal Components
Sun, Ranye
2014-03-17T23:59:59.000Z
While methods for high-dimensional data matrices have been extensively researched for uncorrelated and independent situations, they are much less so for transposable data matrices. A generalization of principal component analysis and the related weighted least...
Williams, O. R.; Bennett, K.; Much, R. [Astrophysics Division, ESTEC, P.O. Box 299, NL-2200 AG Noordwijk (Netherlands); Schoenfelder, V. [Max-Planck Institut fuer Extraterrestrische Physik, P.O. Box 1603, 85740 Garching (Germany); Blom, J. J. [SRON-Utrecht, Sorbonnelaan 2, NL-3584 CA Utrecht (Netherlands); Ryan, J. [Space Science Center, Univ. of New Hampshire, Durham New Hampshire 03824 (United States)
1997-05-10T23:59:59.000Z
The maximum likelihood-ratio method is frequently used in COMPTEL analysis to determine the significance of a point source at a given location. In this paper we do not consider whether the likelihood-ratio at a particular location indicates a detection, but rather whether distributions of likelihood-ratios derived from many locations depart from that expected for source free data. We have constructed distributions of likelihood-ratios by reading values from standard COMPTEL maximum-likelihood ratio maps at positions corresponding to the locations of different categories of AGN. Distributions derived from the locations of Seyfert galaxies are indistinguishable, according to a Kolmogorov-Smirnov test, from those obtained from ''random'' locations, but differ slightly from those obtained from the locations of flat spectrum radio loud quasars, OVVs, and BL Lac objects. This difference is not due to known COMPTEL sources, since regions near these sources are excluded from the analysis. We suggest that it might arise from a number of sources with fluxes below the COMPTEL detection threshold.
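The comparison method — a two-sample Kolmogorov-Smirnov test between distributions of likelihood-ratios — can be sketched with synthetic values (the distributions below are placeholders, not COMPTEL data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Stand-ins for likelihood-ratio values read off a map at two sets of
# locations: "random" sky positions versus positions with a small excess.
background = rng.chisquare(df=1, size=400)
agn_like = rng.chisquare(df=1, size=400) + 0.3   # small systematic shift

# Two-sample Kolmogorov-Smirnov test of whether the samples share a
# parent distribution, as used in the paper.
stat, p = stats.ks_2samp(background, agn_like)
print(p < 0.05)   # True: the shifted sample is distinguishable
```

A null result (large p) would mean the distributions are indistinguishable, which is what the paper reports for the Seyfert galaxy locations.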
Generalized Enhanced Multivariance Product Representation for Data Partitioning: Constancy Level
Tunga, M. Alper [Bahcesehir University, Faculty of Engineering, Software Engineering Department, Besiktas, 34349, Istanbul (Turkey); Demiralp, Metin [Istanbul Technical University, Informatics Institute, Computational Science and Engineering Program, Maslak 34469, Istanbul (Turkey)
2011-09-14T23:59:59.000Z
The Enhanced Multivariance Product Representation (EMPR) method is used to represent multivariate functions in terms of less-variate structures. The EMPR method extends the HDMR expansion by inserting some additional support functions to increase the quality of the approximants obtained for dominantly or purely multiplicative analytical structures. This work aims to develop the generalized form of the EMPR method to be used in multivariate data partitioning approaches. For this purpose, the Generalized HDMR philosophy is taken into consideration to construct the details of the Generalized EMPR at constancy level as the introductory steps, and encouraging results are obtained in data partitioning problems by using our new method. In addition, to examine this performance, a number of numerical implementations with concluding remarks are given at the end of this paper.
Introduction Statistical Tests
Liu, Huan
Statistical Significance Testing. Surendra Singhi, Machine Learning Lab, ASU, April 29, 2005. Outline: 1. Introduction (Preliminary Stuff, Sources of Variation); Statistical Tests; Experiment; Summary.
18.441 Statistical Inference, Spring 2002
Hardy, Michael
Reviews probability and introduces statistical inference. Point and interval estimation. The maximum likelihood method. Hypothesis testing. Likelihood-ratio tests and Bayesian methods. Nonparametric methods. Analysis of ...
Essays on microeconomics and statistical decision making
Nieto Barthaburu, Augusto
2006-01-01T23:59:59.000Z
An Introduction to Econometric Theory. Princeton University … and S. Low (1989): An Econometric Analysis of the Bank … extensive statistical and econometric literature concerned
analysis-a potential functional: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
…the study) to IR. Functional Data Analysis (FDA) is an extension of traditional data analysis … the difference between multivariate analysis and FDA is that the latter manipulates the functional...
analysis structural integrity: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
…processes, rotation (see, e.g., the remarks in …); WHITE NOISE ANALYSIS; Multivariate Analysis and Geovisualization with an Integrated Geographic Knowledge...
David, Mathieu; Garde, Francois; Boyer, Harry
2014-01-01T23:59:59.000Z
In building studies dealing with energy efficiency and comfort, simulation software needs relevant weather files with optimal time steps. Few tools generate extreme and mean values of simultaneous hourly data including correlation between the climatic parameters. This paper presents the C++ Runeole software based on typical weather sequences analysis. It runs an analysis process of a stochastic continuous multivariable phenomenon with frequency properties applied to a climatic database. The database analysis associates basic statistics, PCA (Principal Component Analysis) and automatic classifications. Different ways of applying these methods will be presented. All the results are stored in the Runeole internal database that allows an easy selection of weather sequences. The extreme sequences are used for system and building sizing and the mean sequences are used for the determination of the annual cooling loads as proposed by Audrier-Cros (Audrier-Cros, 1984). This weather analysis was tested with the datab...
PENALIZED MULTIVARIATE LOGISTIC REGRESSION WITH A LARGE DATA SET
Liblit, Ben
-linear model to build a partly flexible model for multivariate Bernoulli data. The joint distribution … the association between outcome variables. A numerical scheme based on the block one-step SOR … (Kullback-Leibler) distance. It is used to adaptively select smoothing parameters in each block one-step SOR iteration
MULTIVARIATE SYMMETRY AND ASYMMETRY
Serfling, Robert
ess5011. Robert J. Serfling, University … by modern group theory. Here we focus on the notion of symmetry and … independently distributed as chi-square with m degrees of freedom.
SMITH NORMAL FORM OF A MULTIVARIATE MATRIX ASSOCIATED WITH PARTITIONS
CHRISTINE BESSENRODT … polynomials, and by determining not only the determinant but also the Smith normal form of these matrices. A priori the Smith form need not exist, but its existence follows from the explicit computation
Multivariate classification of infrared spectra of cell and tissue samples
Haaland, David M. (Albuquerque, NM); Jones, Howland D. T. (Albuquerque, NM); Thomas, Edward V. (Albuquerque, NM)
1997-01-01T23:59:59.000Z
Multivariate classification techniques are applied to spectra from cell and tissue samples irradiated with infrared radiation to determine if the samples are normal or abnormal (cancerous). Mid- and near-infrared radiation can be used for in vivo and in vitro classifications using at least … different wavelengths.
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States, 1973. Statistical Digest No. 67. Prepared by … a review of the fishery statistics for the year 1973. These statistics include data on the volume and value of landings of fishery products, employment in the fisheries, quantity of gear operated, number
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States, 1971. Statistical Digest No. 65. Prepared by … a review of the fishery statistics for the year 1971. These statistics include data on the volume and value of landings of fishery products, employment in the fisheries, quantity of gear operated, number of fishing craft…
Multivariate SPC for Total Inertial Tolerancing Maurice Pillet / Boukar Abdelhakim / Eric Pairel /
Paris-Sud XI, UniversitÃ© de
Maurice Pillet, Boukar Abdelhakim, Eric Pairel … by minimizing the inertia of the surfaces. The proposed method is based on multivariate SPC. For a given surface
Big-Data RHEED analysis for understanding epitaxial film growth processes
Vasudevan, Rama K [ORNL; Tselev, Alexander [ORNL; Baddorf, Arthur P [ORNL; Kalinin, Sergei V [ORNL
2014-01-01T23:59:59.000Z
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of RHEED image sequences. This approach is illustrated for growth of La{sub x}Ca{sub 1-x}MnO{sub 3} films grown on etched (001) SrTiO{sub 3} substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.
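The PCA-plus-k-means workflow the abstract describes can be sketched on a synthetic frame sequence (frame size, change point, and noise level are all assumptions, not experimental values):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Stand-in for a RHEED video: 300 frames of 32x32 pixels, with an abrupt
# change in the diffraction pattern halfway through (e.g., a growth-mode
# change). Frames are flattened so rows = time, columns = pixels.
pattern_a = rng.random((32, 32))
pattern_b = rng.random((32, 32))
frames = np.array(
    [(pattern_a if t < 150 else pattern_b).ravel()
     + 0.05 * rng.normal(size=1024)
     for t in range(300)]
)

# PCA compresses each frame to a few scores; k-means groups the frames.
scores = PCA(n_components=3, random_state=0).fit_transform(frames)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# The two clusters recover the regime change at frame 150.
print(len(set(labels[:150])) == 1 and len(set(labels[150:])) == 1)   # True
```

On real RHEED data the cluster boundaries mark when the growth mode changes, without anyone having to watch the specular spot by eye.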
Paul T. Baker; Sarah Caudill; Kari A. Hodge; Dipongkar Talukder; Collin Capano; Neil J. Cornish
2014-12-19T23:59:59.000Z
Searches for gravitational waves produced by coalescing black hole binaries with total masses $\gtrsim25\,$M$_\odot$ use matched filtering with templates of short duration. Non-Gaussian noise bursts in gravitational wave detector data can mimic short signals and limit the sensitivity of these searches. Previous searches have relied on empirically designed statistics incorporating signal-to-noise ratio and signal-based vetoes to separate gravitational wave candidates from noise candidates. We report on sensitivity improvements achieved using a multivariate candidate ranking statistic derived from a supervised machine learning algorithm. We apply the random forest of bagged decision trees technique to two separate searches in the high mass $\left( \gtrsim25\,\mathrm{M}_\odot \right)$ parameter space. For a search which is sensitive to gravitational waves from the inspiral, merger, and ringdown (IMR) of binary black holes with total mass between $25\,$M$_\odot$ and $100\,$M$_\odot$, we find sensitive volume improvements as high as $70_{\pm 13}-109_{\pm 11}$\% when compared to the previously used ranking statistic. For a ringdown-only search which is sensitive to gravitational waves from the resultant perturbed intermediate mass black hole with mass roughly between $10\,$M$_\odot$ and $600\,$M$_\odot$, we find sensitive volume improvements as high as $61_{\pm 4}-241_{\pm 12}$\% when compared to the previously used ranking statistic. We also report how sensitivity improvements can differ depending on mass regime, mass ratio, and available data quality information. Finally, we describe the techniques used to tune and train the random forest classifier, which can be generalized to its use in other searches for gravitational waves.
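The "random forest of bagged decision trees" ranking idea can be illustrated with a toy sketch. This is not the actual detection pipeline: the two features (an SNR-like and a chi-square-like quantity), the training values, and the stump depth are all invented for illustration; candidates are ranked by the fraction of bootstrap-trained classifiers voting "signal".

```python
import random

def train_stump(data):
    """Fit the best single-feature threshold rule (decision stump) on
    (features, label) pairs by minimizing training misclassifications."""
    best = None
    for j in range(len(data[0][0])):
        for thr in sorted({x[j] for x, _ in data}):
            for sign in (1, -1):
                err = sum((1 if sign * (x[j] - thr) > 0 else 0) != y
                          for x, y in data)
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    _, j, thr, sign = best
    return lambda x: 1 if sign * (x[j] - thr) > 0 else 0

def bagged_scores(train, candidates, n_trees=25, seed=1):
    """Rank candidates by the fraction of stumps, each trained on a
    bootstrap resample of the training set, that vote 'signal' (label 1)."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        boot = [rng.choice(train) for _ in train]  # bootstrap resample
        stumps.append(train_stump(boot))
    return [sum(s(x) for s in stumps) / n_trees for x in candidates]

# Hypothetical training set: (SNR-like feature, chi^2-like feature) -> label.
train = [((6.0 + i, 1.0), 1) for i in range(5)] + \
        [((float(i), 3.0), 0) for i in range(5)]
scores = bagged_scores(train, [(9.0, 1.0), (1.0, 3.0)])
```

A real search would use deep trees over many data-quality and signal-consistency features, but the ranking principle (vote fraction as a detection statistic) is the same.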
Kansas Statistical Abstract 2013, published exclusively online by the Institute for Policy & Social Research. Reproduction of chapters or of the full Kansas Statistical Abstract is permitted on the condition that sources are cited.
Statistical Laboratory, established 1933
Biennial Report, July 1, 1997 to June 30, 1999. Contents: 50 years of statistics; self study & external review; social sciences statistics; on the lighter side; publications 1997-...
Distribution Free Decomposition of Multivariate Data (SPR'98 invited submission)
Dorin
Abstract: We present a practical approach to nonparametric ... nearest neighbor method, kernel estimation [8, 16, 18, 19]. For higher-dimensional feature spaces, multivariate ...
Exploratory Clusters of Student Technology Participation with Multivariate Regression Trees
Skipper, Peter Tyrel
2014-01-01T23:59:59.000Z
technology use in the context of a high school statistics curriculum, and generates exploratory clusters of that usage
Multivariate Skew-t Distributions in Econometrics and Environmetrics
Marchenko, Yulia V.
2012-02-14T23:59:59.000Z
Contents: the syntax for fitting multivariate skew-t regression; the syntax for obtaining predictions; the syntax for producing goodness-of-fit plots. ... of the participants. In this case, Y* is subject to incidental truncation or sample selection (e.g., Greene 2008). The problem of sample selection or, more specifically, sample-selection bias arises when Y* and U* are correlated and, thus, must be modeled ...
Accuracy and reliability of China's energy statistics
Sinton, Jonathan E.
2001-09-14T23:59:59.000Z
Many observers have raised doubts about the accuracy and reliability of China's energy statistics, which show an unprecedented decline in recent years, while reported economic growth has remained strong. This paper explores the internal consistency of China's energy statistics from 1990 to 2000, coverage and reporting issues, and the state of the statistical reporting system. Available information suggests that, while energy statistics were probably relatively good in the early 1990s, their quality has declined since the mid-1990s. China's energy statistics should be treated as a starting point for analysis, and explicit judgments regarding ranges of uncertainty should accompany any conclusions.
Linda Stetzenbach; Lauren Nemnich; Davor Novosel
2009-08-31T23:59:59.000Z
Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 focused on public schools, and Task 0606 focused on high performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task it was determined that, because the indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison would be valid across zones for the 30 buildings, so the by-location (zone) analyses were limited to three sets of buildings, one within each task. In addition, different collection procedures for lighting were used in Task 0606 as compared to Tasks 01 & 13 to improve sample collection. Therefore, these data sets could not be merged and compared, so by-day analyses were run separately for Task 0606 and only Task 01 & 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no differences were found by day for Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday, and from most Wednesday measures, for all variables of Tasks 1 & 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m.
Although no Draft Rate measures within Task 0606 were found to significantly differ by-day, Temperature measurements for Tuesday and Thursday showed variation. Moreover, Wednesday measurements of Relative Humidity within Task 0606 varied significantly from either Tuesday or Thursday. The majority of differences in IEQ measurements by-zone were highly significant (p<0.001), with the exception of Relative Humidity in some buildings. When all task data were combined (30 buildings) neither the airborne culturable fungi nor the airborne non-culturable spore data differed in the concentrations found at any indoor location in terms of day of collection. However, the concentrations of surface-associated fungi varied among the day of collection. Specifically, there was a lower concentration of mold on Tuesday than on Wednesday, for all tasks combined. As expected, variation was found in the concentrations of both airborne culturable fungi and airborne non-culturable fungal spores between indoor zones (1-6) and the outdoor zone (zone 0). No variation was found among the indoor zones of office buildings for Task 1 in the concentrations of airborne culturable fungi. However, airborne non-culturable spores did vary among zones in one building in Task 1 and variation was noted between zones in surface-associated fungi. Due to the lack of multiple lighting measurements for Tasks 13 and 0606, by-day comparisons were only performed for Task 1. No statistical differences were observed in lighting with respect to the day of collection. There was a wide range of variability by-zone among seven of the office buildings. Although few differences were found for the brightest illumination of the worksurface (IllumWkSfcBrtst) and the darkest illumination of the worksurface (IllumWkSfcDrkst) in Task 1, there was considerable variation for these variables in Task 13 and Task 0606 (p < 0.001). Other variables that differed by-zone in Task 13 include CombCCT and AmbCCT1 for S03, S07, and S08. 
Additionally, AmbChromX1, CombChromY, and CombChromX varied by-zone for school buildings S02, S04, and S05, respectively. Although all tasks demonstrated significant differences in sound measurements by zone, some of the buil
Cella, Laura, E-mail: laura.cella@cnr.it [Institute of Biostructures and Bioimaging, National Council of Research, Naples (Italy); Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy); Liuzzi, Raffaele; Conson, Manuel [Institute of Biostructures and Bioimaging, National Council of Research, Naples (Italy); Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy); D’Avino, Vittoria [Institute of Biostructures and Bioimaging, National Council of Research, Naples (Italy); Salvatore, Marco [Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy); Pacelli, Roberto [Institute of Biostructures and Bioimaging, National Council of Research, Naples (Italy); Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy)
2013-10-01T23:59:59.000Z
Purpose: To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced asymptomatic heart valvular defects (RVD). Methods and Materials: Fifty-six patients treated with sequential chemoradiation therapy for Hodgkin lymphoma (HL) were retrospectively reviewed for RVD events. Clinical information along with whole heart, cardiac chambers, and lung dose distribution parameters was collected, and the correlations to RVD were analyzed by means of Spearman's rank correlation coefficient (Rs). For the selection of the model order and parameters for NTCP modeling, a multivariate logistic regression method using resampling techniques (bootstrapping) was applied. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC). Results: When we analyzed the whole heart, a 3-variable NTCP model including the maximum dose, whole heart volume, and lung volume was shown to be the optimal predictive model for RVD (Rs = 0.573, P<.001, AUC = 0.83). When we analyzed the cardiac chambers individually, for the left atrium and for the left ventricle, an NTCP model based on 3 variables including the percentage volume exceeding 30 Gy (V30), cardiac chamber volume, and lung volume was selected as the most predictive model (Rs = 0.539, P<.001, AUC = 0.83; and Rs = 0.557, P<.001, AUC = 0.82, respectively). The NTCP values increase as heart maximum dose or cardiac chambers V30 increase. They also increase with larger volumes of the heart or cardiac chambers and decrease when lung volume is larger. Conclusions: We propose logistic NTCP models for RVD considering not only heart irradiation dose but also the combined effects of lung and heart volumes. Our study establishes the statistical evidence of the indirect effect of lung size on radio-induced heart toxicity.
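The multivariate logistic NTCP form used in models like the ones described above can be sketched directly. The coefficient values below are invented for illustration and are not the fitted values from this study; the negative lung-volume coefficient mirrors the reported protective effect of larger lung volume.

```python
import math

def ntcp_logistic(intercept, coeffs, covariates):
    """Multivariate logistic NTCP: p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    s = intercept + sum(b * x for b, x in zip(coeffs, covariates))
    return 1.0 / (1.0 + math.exp(-s))

# Hypothetical coefficients for (heart max dose [Gy], heart volume [cm^3],
# lung volume [cm^3]); chosen only to show the direction of each effect.
b0, b = -6.0, (0.10, 0.004, -0.0008)
p_low  = ntcp_logistic(b0, b, (20.0, 600.0, 3000.0))
p_high = ntcp_logistic(b0, b, (40.0, 600.0, 3000.0))  # higher max dose
p_lung = ntcp_logistic(b0, b, (40.0, 600.0, 4000.0))  # larger lung volume
```

With coefficients of these signs, the predicted complication probability rises with heart dose and heart volume and falls as lung volume grows, matching the qualitative behavior the abstract reports.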
Department of Statistical Science
Keinan, Alon
Ge, Shuzhi Sam
, "Multivariate stochastic approximation using a simultaneous perturbation gradient approximation," IEEE Trans., vol. 19, no. 4, pp. 482-492, 1998. [18] , "Adaptive stochastic approximation by the simultaneous ..." and E. K. P. Chong, "A deterministic analysis of stochastic approximation with randomized directions"
Licensed under a Creative Commons Attribution 4.0 International License. (tsarfaty@weizmann.ac.il) Introduction: this is the first joint meeting on Statistical Parsing of Morphologically Rich Languages (SPMRL). The goal of the shared task is to allow training and testing ...
Krzysztof Radzicki; Stéphane Bonelli (2012-01-01)
Generalized Impulse Response Analysis in Linear Multivariate Models (CiteSeer). Summary: Building on Koop, Pesaran and ...
State Space Reconstruction for Multivariate Time Series Prediction
I. Vlachos; D. Kugiumtzis
2008-09-12T23:59:59.000Z
In the nonlinear prediction of scalar time series, the common practice is to reconstruct the state space using time-delay embedding and apply a local model on neighborhoods of the reconstructed space. The method of false nearest neighbors is often used to estimate the embedding dimension. For prediction purposes, the optimal embedding dimension can also be estimated by some prediction error minimization criterion. We investigate the proper state space reconstruction for multivariate time series and modify the two abovementioned criteria to search for optimal embedding in the set of the variables and their delays. We pinpoint the problems that can arise in each case and compare the state space reconstructions (suggested by each of the two methods) on the predictive ability of the local model that uses each of them. Results obtained from Monte Carlo simulations on known chaotic maps revealed the non-uniqueness of optimum reconstruction in the multivariate case and showed that prediction criteria perform better when the task is prediction.
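The standard construction the abstract builds on, time-delay embedding followed by a local model, can be sketched as follows. This is the generic textbook scheme, not the authors' modified multivariate criteria, and the sine series is a stand-in for a real time series.

```python
import math

def delay_embed(series, dim, tau):
    """Reconstruct state vectors (x_t, x_{t-tau}, ..., x_{t-(dim-1)tau})."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

def predict_next(series, dim=3, tau=1, k=3):
    """Zeroth-order local model: average the one-step successors of the
    k nearest neighbors of the last reconstructed state."""
    points = delay_embed(series, dim, tau)
    target = points[-1]
    d2 = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    # Candidates exclude the last point, since it has no observed successor.
    nearest = sorted(range(len(points) - 1),
                     key=lambda i: d2(points[i], target))[:k]
    offset = (dim - 1) * tau  # points[i] corresponds to time offset + i
    return sum(series[offset + i + 1] for i in nearest) / k

series = [math.sin(0.1 * t) for t in range(500)]
pred = predict_next(series)  # forecast of x_500
```

In the multivariate setting discussed in the paper, the embedding vector would mix components and delays from several observed variables rather than one scalar series.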
Mohaghegh, Shahab
Metropolitan Statistical Area Outlook: Morgantown. College of Business and Economics ... Bureau ... to be repeated over the next five years. The Morgantown Metropolitan Statistical Area (MSA) had an average annual ...
Interpreting Accident Statistics
Ferreira, Joseph Jr.
Accident statistics have often been used to support the argument that an abnormally small proportion of drivers account for a large proportion of the accidents. This paper compares statistics developed from six-year data ...
Statistics Canada / Statistique Canada
Sinnamon, Gordon J.
Statistics Canada / Statistique Canada; Human Resources and Skills Development Canada (Ressources humaines et Développement des compétences Canada); Culture, Tourism and the Centre for Education Statistics. For more information about this product or the wide range of services and data available from Statistics Canada, visit our ...
Statistical Parsing Inside Algorithm
Ageno, Alicia
Topics: parsing review · statistical parsing · SCFG · Inside Algorithm · Outside Algorithm · Viterbi Algorithm · learning models · SCFG extensions. Statistical parsing of language is often viewed as an important prerequisite for building ...
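The Inside Algorithm listed above computes the total probability of a sentence under a stochastic CFG by dynamic programming over spans. A minimal sketch for a grammar in Chomsky normal form follows; the toy grammar and sentence are invented for illustration.

```python
from collections import defaultdict

def inside(words, lexical, binary, start="S"):
    """Inside algorithm for a PCFG in Chomsky normal form.
    lexical: {(A, word): prob} for rules A -> word;
    binary:  {(A, B, C): prob} for rules A -> B C.
    Returns P(words | start), summed over all parses."""
    n = len(words)
    # beta[i][j][A] = inside probability of A spanning words[i..j] inclusive
    beta = [[defaultdict(float) for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        for (A, word), p in lexical.items():
            if word == w:
                beta[i][i][A] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # split point: B spans i..k, C spans k+1..j
                for (A, B, C), p in binary.items():
                    beta[i][j][A] += p * beta[i][k][B] * beta[k + 1][j][C]
    return beta[0][n - 1][start]

lexical = {("NP", "she"): 0.5, ("NP", "stars"): 0.5, ("V", "sees"): 1.0}
binary = {("S", "NP", "VP"): 1.0, ("VP", "V", "NP"): 1.0}
prob = inside("she sees stars".split(), lexical, binary)
```

Here the single parse has probability 1.0 x 0.5 (NP "she") x 1.0 x 1.0 (V "sees") x 0.5 (NP "stars") = 0.25; the outside algorithm mentioned alongside it complements these quantities for rule-probability re-estimation.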
STATISTICAL DESCRIPTION OF THE
Karney, Charles
STATISTICAL DESCRIPTION OF THE CHIRIKOV-TAYLOR MODEL IN THE PRESENCE OF NOISE, A. B. Rechester: ... that the presence of noise makes the statistical description of this system unique. The form of the diffusion, and statistical averaging, performed analytically with the path-integral method, are the same. Some ...
POSITION OPENING APPLIED STATISTICS
Shepp, Larry
Rank: Assistant or Associate Professor of Applied Statistics. Employment beginning: September 16, 2012. The Department of Decision Sciences, Charles H. Lundquist College at the University of Oregon, is seeking to fill one tenure-track faculty position in Applied Statistics.
Statistical Scientist / Senior Statistical Scientist Biomathematics & Statistics Scotland (BioSS)
Edinburgh, University of
Statistical Scientist / Senior Statistical Scientist, £26,610. Biomathematics & Statistics Scotland (BioSS) seeks a statistically minded individual to help address the range of statistical, bioinformatics and modelling problems ...
ME706: Acoustics & Aerodynamic Sound. Aerodynamic sound is the 'noise' produced by hydrodynamic ('turbulent') ...
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)
2012-02-01T23:59:59.000Z
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
Scalable k-means statistics with Titan.
Thompson, David C.; Bennett, Janine C.; Pebay, Philippe Pierre
2009-11-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
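The serial computation at the heart of a k-means engine is Lloyd's algorithm. The sketch below is a pure-Python illustration (the report's own snippets are C++, and this is not the VTK/Titan implementation); it uses a simple farthest-point seeding so the toy run is deterministic.

```python
import random

def kmeans(points, k, iters=50):
    """Lloyd's algorithm: alternate nearest-center assignment and
    centroid update, after farthest-point initialization."""
    d2 = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    centers = [points[0]]
    while len(centers) < k:  # farthest-point seeding spreads initial centers
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assignment step
            clusters[min(range(k), key=lambda j: d2(p, centers[j]))].append(p)
        # update step: move each center to its cluster centroid
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers

# Two well-separated 2-D blobs; centers should land near (0,0) and (10,10).
rng = random.Random(0)
pts = [(rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5)) for _ in range(20)] + \
      [(10 + rng.uniform(-0.5, 0.5), 10 + rng.uniform(-0.5, 0.5)) for _ in range(20)]
centers = kmeans(pts, 2)
```

The parallel engine the report describes distributes the assignment step across processors and reduces the per-cluster sums globally; the per-iteration arithmetic is unchanged.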
Pricing statistics sourcebook. 5. edition
NONE
1999-11-01T23:59:59.000Z
Thousands of historical and current prices for crude oil, NGL, petroleum products, natural gas and electric power are presented in easy to read tables. The book includes spot, posted and futures prices; prices by state and by country; and monthly and annual prices. Most monthly price series go back 25 years. This comprehensive source for energy industry prices is a must for anyone involved in planning and budgeting. The Pricing Statistics Sourcebook has all of the essential key energy price statistics needed for analysis of the US and international oil and gas industries. Also included: an appendix of IEA, OECD and OPEC member lists; conversion factors and heat content of fuels; and major events affecting the oil and gas industry since 1859. The book includes a summary analysis of significant changes in key data series written by Bob Beck, Economics Editor of the Oil and Gas Journal.
Babu, G. Jogesh
Their work in astrostatistics includes multivariate methods for satellite data on gamma-ray bursts and in determining the size of the universe. They analyzed the Third Catalog of Gamma-Ray Bursts data from BATSE on board the Compton Gamma Ray Observatory using multivariate analysis. Babu and Feigelson, along with a group ...
Statistics and the Modern Student
Robert Gould
2011-01-01T23:59:59.000Z
Technology Innovations in Statistics Education, 3(1). Wild,the "wider view" of statistics, The American Statistician,a History of Teaching Statistics, Edinburgh: John Bibby (
Derrien, H.; Harvey, J.A.; Larson, N.M.; Leal, L.C.; Wright, R.Q.
2000-05-01T23:59:59.000Z
The average {sup 235}U neutron total cross sections were obtained in the energy range 2 keV to 330 keV from high-resolution transmission measurements of a 0.033 atom/b sample [1]. The experimental data were corrected for the contribution of isotope impurities and for resonance self-shielding effects in the sample. The results are in very good agreement with the experimental data of Poenitz et al. [4] in the energy range 40 keV to 330 keV and are the only available accurate experimental data in the energy range 2 keV to 40 keV. ENDF/B-VI evaluated data are 1.7% larger. The SAMMY/FITACS code [2] was used for a statistical model analysis of the total cross section and selected fission cross section data in the energy range 2 keV to 200 keV. SAMMY/FITACS is an extended version of SAMMY which allows consistent analysis of the experimental data in the resolved and unresolved resonance regions. The Reich-Moore resonance parameters were obtained [3] from SAMMY Bayesian fits of high-resolution experimental neutron transmission and partial cross section data below 2.25 keV, and the corresponding average parameters and covariance data were used in the present work as input for the statistical model analysis of the high energy range of the experimental data. The result of the analysis shows that the average resonance parameters obtained from the analysis of the unresolved resonance region are consistent with those obtained in the resolved energy region. Another important result is that the ENDF/B-VI capture cross section could be too small by more than 10% in the energy range 10 keV to 200 keV.
BS in STATISTICS: Statistical Science Emphasis (695220) MAP Sheet, Department of Statistics
Olsen Jr., Dan R.
Choose from the following: Stat 121 Principles of Statistics; Stat 151 Introduction to Bayesian Statistics; Stat 201 Statistics for Engineers & Scientists; Stat 301 Statistics & Probability for Sec Ed. Note: Students who have ...
Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic
Albert Satorra; Peter M. Bentler
2011-01-01T23:59:59.000Z
... M1, and gives the ML statistic T1. Compute the SB robust ... kurtosis. Computational Statistics, 17, 117-122. Browne, M. ... for chi-square statistics in covariance structure analysis.
Fast Algorithms for Computing Statistics Under Interval and Fuzzy Uncertainty
Kreinovich, Vladik
In many engineering applications, we are interested in computing statistics. For example, in environmental analysis, a reasonable statistic for estimating the mean value of a probability distribution is the population average E.
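For the population average, the interval computation is the easy, monotone case, which can be sketched directly (an illustrative sketch, not the paper's algorithms):

```python
def interval_mean(intervals):
    """Range of the population average E = (x_1 + ... + x_n) / n when each
    measurement x_i is only known to lie in [lo_i, hi_i]. The average is
    monotone (non-decreasing) in every x_i, so its exact range is attained
    at the interval endpoints."""
    n = len(intervals)
    lo = sum(a for a, _ in intervals) / n
    hi = sum(b for _, b in intervals) / n
    return lo, hi

bounds = interval_mean([(1.0, 2.0), (3.0, 5.0)])  # -> (2.0, 3.5)
```

For statistics that are not monotone in each input, such as the sample variance, the endpoint trick no longer suffices, which is where fast specialized algorithms of the kind the title refers to become necessary.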
Statistics
Vertes, Akos
Statistics plays an important role throughout society, providing data ... They also explore how those skills can be applied to develop new initiatives. Statistics is one ... Undergraduate: Bachelor of Science with a major in statistics (http://bulletin.gwu.edu/arts-sciences/statistics).
Course information Department of Mathematics and Statistics
Hickman, Mark
... is allowed, check the FAQ. Mathematical software: in lectures, MATLAB and MAPLE will be used. Topics: sets, linear dependence and independence, basis, dimension, rank of a matrix; eigenvectors; types of solutions; multivariable differentiation; graphs of multivariable functions, tangent planes ...
Flexible Information Visualization of Multivariate Data from Biological Sequence Similarity Searches
Chi, Ed Huai-hsin
... of other variables. We present an enhanced system for interactive exploration of this multivariate data. We identify a larger set of useful variables in the information space. The new system involves more ...
Vine Copulas as a Way to Describe and Analyze Multi-Variate Dependence in Econometrics
Kreinovich, Vladik
Vine copulas are a way to describe and analyze multivariate dependence in econometrics; see, e.g., [1-3, 7, 9-11, 13, 14, 21]. In our experience with applied problems of econometrics, there is still a lot of confusion and misunderstanding related to vine copulas ...
Teaching Multivariable Control Using the Quadruple-Tank Process1 2
Johansson, Karl Henrik
The process is called the Quadruple-Tank Process and demonstrates multivariable level control; the character of its dynamics can be changed by simply changing a valve. This makes the Quadruple-Tank Process suitable for illustrating many concepts ...
Bayesian network for the characterization of faults in a multivariate process
Paris-Sud XI, Université de
Sylvain Verron: For the characterization of faults in multivariate processes, we propose an original network structure for deciding whether a fault has appeared in the process. Moreover, this structure permits the identification of the variables that are responsible (root ...
ISpace: Interactive Volume Data Classification Techniques Using Independent Component Analysis
Ma, Kwan-Liu
Keywords: multivariate data analysis, multimodality data, scientific visualization, segmentation, volume rendering. The system uses Independent Component Analysis (ICA) and a multi-dimensional histogram of the volume data ...
Part I STATISTICAL PHYSICS 1 Statistical Physics
unknown authors
2004-01-01T23:59:59.000Z
In this first part of the book we shall study aspects of classical statistical physics that every physicist should know, but are not usually treated in elementary thermodynamics courses. Our study will lay the microphysical (particle-scale) foundations for the continuum physics of Parts II—VI. As a central feature of our approach, we shall emphasize the intimate connections between the relativistic formulation of statistical physics and its nonrelativistic limit, and between quantum statistical physics and the classical theory. Throughout, we shall presume that the reader is familiar with elementary thermodynamics, but not with other aspects of statistical physics. In Chap. 2 we will study kinetic theory — the simplest of all formalisms for analyzing systems of huge numbers of particles (e.g., molecules of air, or neutrons diffusing through a nuclear reactor, or photons produced in the big-bang origin of the Universe). In kinetic theory the key concept is the “distribution function ” or “number density of particles in phase space”, N; i.e., the number of particles per unit 3-dimensional volume of ordinary space and per unit 3-dimensional volume of momentum space. Despite first appearances, N turns out to be a geometric, frame-independent entity. This N and the frame-independent laws it
Part I STATISTICAL PHYSICS 1 Statistical Physics
unknown authors
In this first part of the book we shall study aspects of classical statistical physics that every physicist should know but are not usually treated in elementary thermodynamics courses. This study will lay the microphysical (particle-scale) foundations for the continuum physics of Parts II—VI. Throughout, we shall presume that the reader is familiar with elementary thermodynamics, but not with other aspects of statistical physics. As a central feature of our approach, we shall emphasize the intimate connections between the relativistic formulation of statistical physics and its nonrelativistic limit, and between quantum statistical physics and the classical theory. Chapter 2 will deal with kinetic theory, which is the simplest of all formalisms for studying systems of huge numbers of particles (e.g., molecules of air, or neutrons diffusing through a nuclear reactor, or photons produced in the big-bang origin of the Universe). In kinetic theory the key concept is the “distribution function ” or “number density of particles in phase space”, N; i.e., the number of particles per unit 3-dimensional volume of ordinary space and per unit 3-dimensional volume of momentum space. Despite first appearances, N turns out to be a geometric, frame-independent entity. This N and the laws it obeys provide
Statistical Hadronization and Holography
Jacopo Bechi
2009-12-17T23:59:59.000Z
In this paper we consider some issues about the statistical model of hadronization in a holographic approach. We introduce a Rindler-like horizon in the bulk and interpret string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and obtain a thermal, and therefore statistical, shape for it.
Parallel contingency statistics with Titan.
Thompson, David C.; Pebay, Philippe Pierre
2009-09-01T23:59:59.000Z
This report summarizes existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines do. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors.
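The core computation of a contingency statistics engine, the Pearson chi-square statistic over a table of counts, can be sketched as follows (the generic formula in pure Python, not the Titan C++ code; the tables are made-up examples):

```python
def chi_square(table):
    """Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
    where expected counts come from the row/column marginals under the
    independence hypothesis."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = rows[i] * cols[j] / total  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

x2_indep = chi_square([[10, 20], [20, 40]])  # perfectly proportional table -> 0
x2_assoc = chi_square([[20, 10], [10, 20]])  # associated rows and columns
```

The parallel difficulty the report alludes to is visible here: unlike running sums for moments, the table itself must be gathered (its categories are data-dependent), which limits speed-up relative to the descriptive and correlative engines.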
Topology for statistical modeling of petascale data.
Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A& M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A& M University, College Station, TX)
2011-07-01T23:59:59.000Z
This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
Gunst, R. F.
2013-05-01T23:59:59.000Z
Phase 3 of the EPAct/V2/E-89 Program investigated the effects of 27 program fuels and 15 program vehicles on exhaust emissions and fuel economy. All vehicles were tested over the California Unified Driving Cycle (LA-92) at 75 degrees F. The program fuels differed on T50, T90, ethanol, Reid vapor pressure, and aromatics. The vehicles tested were new, low-mileage 2008 model year Tier 2 vehicles. A total of 956 test runs were made. Comprehensive statistical modeling and analyses were conducted on methane, carbon dioxide, carbon monoxide, fuel economy, non-methane hydrocarbons, non-methane organic gases, oxides of nitrogen, particulate matter, and total hydrocarbons. In general, model fits determined that emissions and fuel economy were complicated functions of the five fuel parameters. An extensive evaluation of alternative model fits produced a number of competing model fits. Many of these alternative fits produce similar estimates of mean emissions for the 27 program fuels but should be carefully evaluated for use with emerging fuels with combinations of fuel parameters not included here. The program includes detailed databases on each of the 27 program fuels on each of the 15 vehicles and on each of the vehicles on each of the program fuels.
The M.Sc. in Statistics is aimed at students who have an undergraduate degree in Statistics. Students apply statistical reasoning, techniques, and models in the analysis of real data and employ technical computing. Former M.Sc. in Statistics students have found employment in industry, government, IT, and economics.
SDI: Statistical dynamic interactions
Blann, M.; Mustafa, M.G. (Lawrence Livermore National Lab., CA (USA)); Peilert, G.; Stoecker, H.; Greiner, W. (Frankfurt Univ. (Germany, F.R.). Inst. fuer Theoretische Physik)
1991-04-01T23:59:59.000Z
We focus on the combined statistical and dynamical aspects of heavy ion induced reactions. The overall picture is illustrated by considering the reaction {sup 36}Ar + {sup 238}U at a projectile energy of 35 MeV/nucleon. We illustrate the time dependent bound excitation energy due to the fusion/relaxation dynamics as calculated with the Boltzmann master equation. An estimate of the mass, charge and excitation of an equilibrated nucleus surviving the fast (dynamic) fusion-relaxation process is used as input into an evaporation calculation which includes 20 heavy fragment exit channels. The distribution of excitations between residue and clusters is explicitly calculated, as is the further deexcitation of clusters to bound nuclei. These results are compared with the exclusive cluster multiplicity measurements of Kim et al., and are found to give excellent agreement. We consider also an equilibrated residue system at 25% lower initial excitation, which gives an unsatisfactory exclusive multiplicity distribution. This illustrates that exclusive fragment multiplicity may provide a thermometer for system excitation. This analysis of data involves successive binary decay with neither compressional effects nor phase transitions. Several examples of primary versus final (stable) cluster decay probabilities for an A = 100 nucleus at excitations of 100 to 800 MeV are presented. From these results a large change in multifragmentation patterns may be understood as a simple phase space consequence, invoking neither phase transitions nor equation of state information. These results are used to illustrate physical quantities which are ambiguous to deduce from experimental fragment measurements. 14 refs., 4 figs.
Statistics: Part 1 1. Why bother with statistics?
Francis, Paul
Statistics: Part 1. 1. Why bother with statistics? Why is statistics so necessary for observational ... But your data just don't seem to back up their claim. Statistics allows you to determine how confidently ... practical introduction to those bits of statistics most vital to observational astronomy. 2. What ...
Statistics 36-756: Advanced Statistics II Syllabus: Fall, 2006
Fienberg, Stephen E.
Statistics 36-756: Advanced Statistics II. Syllabus: Fall, 2006. Instructor: Stephen E. Fienberg, 132G ... To consider major topics from statistical theory and the foundations of inference not covered in Advanced Statistics I, such as exchangeability, the axiomatic foundation of subjective probability ...
Pearson's Goodness of Fit Statistic as a Score Test Statistic
Smyth, Gordon K.
Pearson's Goodness of Fit Statistic as a Score Test Statistic. Gordon K. Smyth. Abstract: For any generalized linear model, the Pearson goodness of fit statistic is the score test statistic for testing the current model against the saturated model. The relationship between the Pearson statistic ...
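For context, a minimal sketch of the statistic the abstract refers to, X^2 = sum_i (O_i - E_i)^2 / E_i, with made-up counts (the paper's contribution is its interpretation as a score test, not this computation):

```python
# Hedged sketch: Pearson's goodness-of-fit statistic for observed vs. expected
# counts. The counts below are illustrative only; for a generalized linear
# model, the abstract identifies this statistic with the score test of the
# current model against the saturated model.
def pearson_statistic(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [18, 22, 30, 30]
expected = [25, 25, 25, 25]   # a uniform null model
x2 = pearson_statistic(observed, expected)  # (49 + 9 + 25 + 25) / 25 = 4.32
```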
Thesis for the degree of licentiate of engineering: Contributions to Statistical Analysis of Gene Expression Data
Patriksson, Michael
Thesis for the degree of licentiate of engineering: Contributions to Statistical Analysis of Gene Expression Data. Anders Sjögren. © Anders Sjögren, 2005. Licentiate Thesis, ISSN ...
Simon, P; van Waerbeke, L; Hoekstra, H; Erben, T; Fu, L; Harnois-Déraps, J; Heymans, C; Hildebrandt, H; Kilbinger, M; Kitching, T D; Miller, L; Schrabback, T
2015-01-01T23:59:59.000Z
We study the correlations of the shear signal between triplets of sources in the Canada-France-Hawaii Lensing Survey (CFHTLenS) to probe cosmological parameters via the matter bispectrum. In contrast to previous studies, we adopted a non-Gaussian model of the data likelihood which is supported by our simulations of the survey. We find that for state-of-the-art surveys, similar to CFHTLenS, a Gaussian likelihood analysis is a reasonable approximation, although small differences in the parameter constraints are already visible. For future surveys we expect that a Gaussian model becomes inaccurate. Our algorithm for a refined non-Gaussian analysis and data compression is then of great utility, especially because it is not much more elaborate if simulated data are available. Applying this algorithm to the third-order correlations of shear alone in a blind analysis, we find good agreement with the standard cosmological model: $\\Sigma_8$=$\\sigma_8$ $(\\Omega_{\\rmm}/0.27)^{0.64}$=$0.79^{+0.08}_{-0.11}$ for a flat $\\La...
Nonstationary statistical theory for multipactor
Anza, S.; Vicente, C.; Gil, J. [Aurora Software and Testing S.L., Edificio de Desarrollo Empresarial 9B, Universidad Politecnica de Valencia, Camino de Vera s/n, 46022 Valencia (Spain); Boria, V. E. [Departamento de Comunicaciones-iTEAM, Universidad Politecnica de Valencia, Camino de Vera s/n, 46022 Valencia (Spain); Gimeno, B. [Departamento de Fisica Aplicada y Electromagnetismo-ICMUV, Universitat de Valencia, c/Dr. Moliner, 50, 46100 Valencia (Spain); Raboso, D. [Payloads Systems Division, European Space Agency, 2200-AG Noordwijk (Netherlands)
2010-06-15T23:59:59.000Z
This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.
Fishery Statistics of the United States, 1951
Statistical Digest 30: Fishery Statistics of the United States, 1951. By A. W. Anderson; Farley, Director. Fishery Statistics of the United States and Alaska are compiled and published annually to make available
Multi-variate joint PDF for non-Gaussianities: exact formulation and generic approximations
Verde, Licia; Jimenez, Raul [ICREA and ICC, Instituto de Ciencias del Cosmos, University of Barcelona (IEEC-UB), Marti i Franques 1, Barcelona 08028 (Spain); Alvarez-Gaume, Luis [Theory Group, Physics Department, CERN, CH-1211, Geneva 23 (Switzerland); Heavens, Alan F. [Imperial Centre for Inference and Cosmology, Imperial College, Blackett Laboratory, Prince Consort Road, London SW7 2AZ U.K. (United Kingdom); Matarrese, Sabino, E-mail: liciaverde@icc.ub.edu, E-mail: raul.jimenez@icc.ub.edu, E-mail: luis.alvarez-gaume@cern.ch, E-mail: a.heavens@imperial.ac.uk, E-mail: sabino.matarrese@pd.infn.it [Dipartimento di Fisica e Astronomia G. Galilei, Universitá degli Studi di Padova, I-35131 Padova (Italy)
2013-06-01T23:59:59.000Z
We provide an exact expression for the multi-variate joint probability distribution function of non-Gaussian fields primordially arising from local transformations of a Gaussian field. This kind of non-Gaussianity is generated in many models of inflation. We apply our expression to the non-Gaussianity estimation from Cosmic Microwave Background maps and the halo mass function, where we obtain analytical expressions. We also provide analytic approximations and their range of validity. For the Cosmic Microwave Background we give a fast way to compute the PDF, which is valid up to more than 7σ for f{sub NL} values (both true and sampled) not ruled out by current observations, and which consists of expressing the PDF as a combination of the bispectrum and trispectrum of the temperature maps. The resulting expression is valid for any kind of non-Gaussianity and is not limited to the local type. The above results may serve as the basis for a fully Bayesian analysis of the non-Gaussianity parameter.
Ba, Demba Elimane
2011-01-01T23:59:59.000Z
The formulation of multivariate point-process (MPP) models based on the Jacod likelihood does not allow for simultaneous occurrence of events at an arbitrarily small time resolution. In this thesis, we introduce two versatile ...
Nemati, Shamim, 1980-
2013-01-01T23:59:59.000Z
Physiological control systems involve multiple interacting variables operating in feedback loops that enhance an organism's ability to self-regulate and respond to internal and external disturbances. The resulting multivariate ...
Math 224 Multivariable Calculus and Analytic Geometry I Winter 2013 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Winter 2013. Instructor: Amites Sarkar. Text ... My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu. Course Objectives: The successful ...
Math 224 Multivariable Calculus and Analytic Geometry I Winter 2014 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Winter 2014. Instructor: Amites Sarkar. Text ... My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu. Course Objectives: The successful student ...
Math 224 Multivariable Calculus and Analytic Geometry I Spring 2014 Instructor Amites Sarkar
Sarkar, Amites
Math 224 Multivariable Calculus and Analytic Geometry I, Spring 2014. Instructor: Amites Sarkar. Text ... My e-mail is amites.sarkar@wwu.edu. Course Objectives: The successful student will demonstrate: 1. Understanding of ...
Math 225 Multivariable Calculus and Analytic Geometry II Spring 2012 Instructor Amites Sarkar
Sarkar, Amites
Math 225 Multivariable Calculus and Analytic Geometry II, Spring 2012. Instructor: Amites Sarkar. Text ... in 216 Bond Hall. My phone number is 650 7569 and my e-mail is amites.sarkar@wwu.edu. Course ...
Statistical Mechanics with focus on
Johannesson, Henrik
Statistical Mechanics with focus on Liquids, Solutions and Colloidal Systems. Course contents: A. Foundations of statistical mechanics. Classical dynamics: Hamilton's and Liouville's equations. The concept ... statistics. Ideal fermion or boson gases; Bose-Einstein condensation. The relationship between ...
School of Mathematics and Statistics
Du, Jie
School of Mathematics and Statistics, Faculties of Arts, Economics, Education, Engineering and Science. Intermediate Mathematics and Statistics 2012, The University of Sydney. Contents: 1. Introduction ... STAT2911 Probability and Statistical Models (Advanced) ... STAT2912 ...
Brown, Gregory G.
Guide to Inferential Statistics (Hypothesis Testing Statistics). Statistic: what does it measure ... of the variable from the 2 groups. If the p-value is less than .05, the mean values of the two groups are statistically different (reject the null); if p > .05, there is no statistical difference in the mean values of the groups (accept the null). t ...
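A hedged, stdlib-only sketch of the two-sample t statistic behind that reject/accept rule (the data here are invented for illustration; a real analysis would convert t to a p-value via the t distribution):

```python
import math
import statistics

# Welch-style two-sample t statistic comparing group means; with equal sample
# variances and sizes it coincides with the pooled version. Data are made up.
def t_statistic(g1, g2):
    n1, n2 = len(g1), len(g2)
    m1, m2 = statistics.mean(g1), statistics.mean(g2)
    v1, v2 = statistics.variance(g1), statistics.variance(g2)  # sample variances
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

t = t_statistic([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])  # means 3 and 4, variances 2.5
# t = (3 - 4) / sqrt(2.5/5 + 2.5/5) = -1.0
```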
The statistical investigation of type Ib/c and II supernovae and their host galaxies
A. A. Hakobyan
2008-04-03T23:59:59.000Z
This is a statistical study of the properties of type Ib/c and II supernovae and of the integral parameters of their spiral host galaxies. The methods of one-dimensional and multivariate statistics were applied to the data sample. It was found that the Ib/c supernovae are more concentrated radially toward the centers of the galaxies than those of type II. The distributions of the radial distances R(SN)/R(25) for the type Ib/c and II supernovae in active galaxies are more concentrated toward the center than in normal galaxies. This effect is stronger for type Ib/c than for type II supernovae.
Statistics and samples 1.1 What is statistics?
Irwin, Darren
1. Statistics and samples. 1.1 What is statistics? Biologists study the properties of living things ... who got sampled and who did not. Statistics is a technology that describes and measures aspects of nature from samples. Most importantly, statistics lets us quantify the uncertainty of these measures.
Statistics and Differential Geometry 18-466 Mathematical Statistics
Le Ny, Jerome
Statistics and Differential Geometry. 18-466 Mathematical Statistics. Jerome Le Ny, December 14, 2005. ... of statistical curvature [Efr75], that most of the main concepts and methods of differential geometry are of substantial interest in connection with the theory of statistical inference. This report describes in simple ...
Statistical Learning Theory of Protein Dynamics
Haas, Kevin
2013-01-01T23:59:59.000Z
... integrated statistical learning and simulation approach to molecular simulation, using statistical learning theory ... molecular simulation and statistical learning theory of ...
Indicators: Performance Statistics
Webb, Peter
Utilities: Performance Statistics. [Dashboard excerpt: % Quality Inspections Completed 66% (+12%); zone breakdowns for Electric and Steam/Chilled Water; performance measures with color-coded target values; trend status color indicators identify changes from the prior month.]
General Indicators: Performance Statistics
Webb, Peter
Maintenance: Performance Statistics. [Dashboard excerpt: Preventive Maintenance for Fire/Life/Safety (FLS) 87% (-4%); steam/chilled utility outages; budget data FYTD through March 2012; next participation reading June 2012.]
General Indicators: Performance Statistics
Webb, Peter
Maintenance: Performance Statistics. [Dashboard excerpt: Preventive Maintenance for Fire/Life/Safety (FLS) 85% (-12%); steam/chilled utility outages; budget data FYTD through May 2012; next participation reading August 2012.]
General Indicators: Performance Statistics
Webb, Peter
Maintenance: Performance Statistics. [Dashboard excerpt: Preventive Maintenance for Fire/Life/Safety (FLS) 97% (+10%); chilled utility outages; budget data FYTD through April 2012; current-month change indicators.]
Banaji,. Murad
Mathematics and Statistics, Canterbury: the UK's European university. Undergraduate study. Academic excellence and inspirational teaching: much of science is based upon the application of mathematics, as for computer science. New discoveries within mathematics affect not only science, but also our general ...
Topics on Regularization of Parameters in Multivariate Linear Regression
Chen, Lianfu
2012-02-14T23:59:59.000Z
are selected. Estimating the degrees of freedom when penalizing the entries of the matrices presents new computational challenges. A simulation study and real data analysis demonstrate that the MRCEII, which selects the tuning parameter of the precision matrix...
Computational and statistical tradeoffs via convex relaxation
(sent for review November 27, 2012) Modern massive datasets create fundamental problems in data analysis that blend computer science and statistics. That classical perspectives from these fields are not adequate to address emerging problems in "Big Data" is apparent from their sharply divergent nature
The University of Chicago Department of Statistics
Stephens, Matthew
Department of Statistics, The University of Chicago: The Weekend Effect of Return on Crude Oil Prices. Thursday, February 26, 2009 at 11:00 AM, 110 Eckhart Hall, 5734 S. University Avenue. ABSTRACT: Crude oil prices experienced ... movements of WTI crude oil prices can be observed during the period from 1986 to 2008. Spectral analysis ...
Analysis of Semantic Building Blocks via Gröbner Bases
Fernandez, Thomas
Analysis of Semantic Building Blocks via Gröbner Bases. Jerry Swan, Geoffrey K. Neumann. ... for the space of multivariate polynomials in n variables, capable of expressing arbitrary sums and products ... for greatest common divisor from univariate to multivariate polynomials. Since its invention in 1965, it has ...
High Performance Multivariate Visual Data Exploration for Extremely Large Data
Rubel, Oliver; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes; Prabhat,
2008-08-22T23:59:59.000Z
One of the central challenges in modern science is the need to quickly derive knowledge and understanding from large, complex collections of data. We present a new approach that deals with this challenge by combining and extending techniques from high performance visual data analysis and scientific data management. This approach is demonstrated within the context of gaining insight from complex, time-varying datasets produced by a laser wakefield accelerator simulation. Our approach leverages histogram-based parallel coordinates for both visual information display as well as a vehicle for guiding a data mining operation. Data extraction and subsetting are implemented with state-of-the-art index/query technology. This approach, while applied here to accelerator science, is generally applicable to a broad set of science applications, and is implemented in a production-quality visual data analysis infrastructure. We conduct a detailed performance analysis and demonstrate good scalability on a distributed memory Cray XT4 system.
DCO Operations Interesting Statistics
DCO Operations Interesting Statistics. [Chart excerpt: numeric axis residue removed; chart produced with Hands Down Software (www.handsdownsoftware.com) and annotated with American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) ranges.]
Homotopy in statistical physics
Ralph Kenna
2006-04-12T23:59:59.000Z
In condensed matter physics and related areas, topological defects play important roles in phase transitions and critical phenomena. Homotopy theory facilitates the classification of such topological defects. After a pedagogic introduction to the mathematical methods involved in topology and homotopy theory, the role of the latter in a number of mainly low-dimensional statistical-mechanical systems is outlined. Some recent activities in this area are reviewed and some possible future directions are discussed.
Design and performance of a scalable, parallel statistics toolkit.
Thompson, David C.; Bennett, Janine Camille; Pebay, Philippe Pierre
2010-11-01T23:59:59.000Z
Most statistical software packages implement a broad range of techniques but do so in an ad hoc fashion, leaving users who do not have a broad knowledge of statistics at a disadvantage, since they may not understand all the implications of a given analysis or how to test the validity of results. These packages are also largely serial in nature, or target multicore architectures instead of distributed-memory systems, or provide only a small number of statistics in parallel. This paper surveys a collection of parallel implementations of statistics algorithms developed as part of a common framework over the last 3 years. The framework strategically groups modeling techniques with associated verification and validation techniques to make the underlying assumptions of the statistics more clear. Furthermore, it employs a design pattern specifically targeted for distributed-memory parallelism, where architectural advances in large-scale high-performance computing have been focused. Moment-based statistics (which include descriptive, correlative, and multicorrelative statistics, principal component analysis (PCA), and k-means statistics) scale nearly linearly with the data set size and number of processes. Entropy-based statistics (which include order and contingency statistics) do not scale well when the data in question is continuous or quasi-diffuse but do scale well when the data is discrete and compact. We confirm and extend our earlier results by now establishing near-optimal scalability with up to 10,000 processes.
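The near-linear scaling reported for moment-based statistics comes from the fact that per-process summaries can be merged pairwise in a reduction tree. A hedged sketch of that merge (the standard pairwise update for count/mean/sum-of-squared-deviations; not the toolkit's actual code):

```python
# Merge two partial summaries (n, mean, M2), where M2 is the sum of squared
# deviations from the mean. Each process summarizes its shard independently;
# a reduction tree then combines the (n, mean, M2) triples.
def merge(a, b):
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2

def summarize(xs):
    n, mean, m2 = 0, 0.0, 0.0
    for x in xs:                 # Welford's one-pass update
        n += 1
        d = x - mean
        mean += d / n
        m2 += d * (x - mean)
    return n, mean, m2

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
left, right = summarize(data[:4]), summarize(data[4:])
n, mean, m2 = merge(left, right)
# mean = 5.0 and population variance m2/n = 4.0, matching a single-pass summary
```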
New likelihoods for shape analysis
Sylvain Fichet
2014-07-03T23:59:59.000Z
We introduce a new kind of likelihood function based on the sequence of moments of the data distribution. Both binned and unbinned data samples are discussed, and the multivariate case is also derived. Building on this approach we lay out the formalism of shape analysis for signal searches. In addition to moment-based likelihoods, standard likelihoods and approximate statistical tests are provided. Enough material is included to make the paper self-contained from the perspective of shape analysis. We argue that the moment-based likelihoods can advantageously replace unbinned standard likelihoods for the search of non-local signals, by avoiding the step of fitting Monte-Carlo generated distributions. This benefit increases with the number of variables simultaneously analyzed. The moment-based signal search is exemplified and tested in various 1D toy models mimicking typical high-energy signal--background configurations. Moment-based techniques should be particularly appropriate for the searches for effective operators at the LHC.
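A hedged sketch of the basic ingredient named in the abstract, the sequence of sample moments of a data distribution (the paper's contribution is building likelihoods on these; the data below are invented):

```python
# Raw sample moments m_k = (1/n) * sum(x**k), the building blocks of a
# moment-based likelihood. Which moments to keep, and how their sampling
# covariance enters the likelihood, is the substance of the paper; this
# snippet only computes the moment sequence itself.
def raw_moments(xs, kmax):
    n = len(xs)
    return [sum(x ** k for x in xs) / n for k in range(1, kmax + 1)]

m1, m2, m3 = raw_moments([1.0, 2.0, 3.0, 4.0], 3)
# m1 = 2.5, m2 = (1 + 4 + 9 + 16) / 4 = 7.5, m3 = (1 + 8 + 27 + 64) / 4 = 25.0
```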
Parameterization and Statistical Analysis of Hurricane Waves
Mclaughlin, Patrick William
2014-05-03T23:59:59.000Z
developed sea state cap (Young and Verhagan 1996) to form the open coast and bay methodologies. This approach yields root mean square errors (RMSE) ranging from 0.01-0.46 m, with the majority of points below 0.3 m. This approach yields small bias values...
STATISTICAL ANALYSIS OF A DETERMINISTIC STOCHASTIC ORBIT
Kaufman, Allan N.
2013-01-01T23:59:59.000Z
Statistical Analysis of a Deterministic Stochastic Orbit.* Allan N. Kaufman, Henry D. ...
Understanding Manufacturing Energy Use Through Statistical Analysis
Kissock, J. K.; Seryak, J.
2004-01-01T23:59:59.000Z
. The energy required to chill this glycol should be dependent on outdoor air temperature. The rest of the chilled glycol is sent to fan-coil units that recirculate plant air near heat-generating equipment. The energy required to chill this glycol is much... less dependent on outdoor air temperature. Thus, comparing the two breakdowns suggests that about (13% - 10%) / 13% = 23% of chiller electricity use is devoted to the fan coil units and the balance to the make-up air units. The biggest...
Statistical Energy Analysis
Berlin, Technische Universität
[Slide excerpt: reciprocity considerations; multi-modal systems; multi-modal coupling; coupling loss factor; variance; built-up structures; heat pump; structure-borne waves; liquid; single-degree-of-freedom system (SDOF), M, R, K=1/C, x; SEA, forced vibration. Prof. B.A.T. Petersson.]
A statistical analysis of tire tread wear
Sperberg, Ronald Leigh
1965-01-01T23:59:59.000Z
[OCR-damaged front matter and contents; recoverable headings include "Tire Measurement Test Equipment", "Treatment of the Data", "Analysis of Data", and "Tire Wear in Relation to Period, Position, Tire & Treatment".]
Independent Statistics & Analysis Drilling Productivity Report
Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]
Statistical pert analysis using integral operators
Ohlendorf, Curtis Gilbert
1969-01-01T23:59:59.000Z
... of Optimization Algorithm. IV. Conclusions and Summary. Bibliography. Vita. List of Figures: 2.1 Activities in Parallel; 2.2 Activities in Series; 2.3 Wheatstone Bridge; 2.4 Double Wheatstone Bridge; 2.5 ... extended Hartley and Wortham's method by developing integral operators for activities in double Wheatstone bridges and in different types of criss-crosses. Ringer [8] also developed integral operators for activities whose completion times ...
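The series/parallel reductions underlying those integral operators can be sketched by Monte Carlo (a hedged stand-in for the analytic operators; the uniform activity durations are invented for illustration):

```python
import random

# In a PERT network, activities in series add their durations, while
# activities in parallel complete when the slowest finishes (max). Integral
# operators give these completion-time distributions analytically; here we
# approximate a two-path network (each path = two activities in series,
# paths in parallel) by sampling uniform(1, 2) activity durations.
random.seed(0)

def series(*draws):
    return sum(d() for d in draws)

def parallel(*draws):
    return max(d() for d in draws)

draw = lambda: random.uniform(1.0, 2.0)
samples = [parallel(lambda: series(draw, draw),   # upper path
                    lambda: series(draw, draw))   # lower path
           for _ in range(10_000)]
mean_completion = sum(samples) / len(samples)
# each path takes between 2 and 4 time units, so the network's completion
# time lies in [2, 4], with a mean somewhat above the per-path mean of 3
```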
Independent Statistics & Analysis Drilling Productivity Report
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
A MULTIVARIATE FIT LUMINOSITY FUNCTION AND WORLD MODEL FOR LONG GAMMA-RAY BURSTS
Shahmoradi, Amir, E-mail: amir@physics.utexas.edu [Institute for Fusion Studies, The University of Texas at Austin, TX 78712 (United States)] [Institute for Fusion Studies, The University of Texas at Austin, TX 78712 (United States)
2013-04-01T23:59:59.000Z
It is proposed that the luminosity function, the rest-frame spectral correlations, and distributions of cosmological long-duration (Type-II) gamma-ray bursts (LGRBs) may be very well described as a multivariate log-normal distribution. This result is based on careful selection, analysis, and modeling of LGRBs' temporal and spectral variables in the largest catalog of GRBs available to date: 2130 BATSE GRBs, while taking into account the detection threshold and possible selection effects. Constraints on the joint rest-frame distribution of the isotropic peak luminosity (L{sub iso}), total isotropic emission (E{sub iso}), the time-integrated spectral peak energy (E{sub p,z}), and duration (T{sub 90,z}) of LGRBs are derived. The presented analysis provides evidence for a relatively large fraction of LGRBs that have been missed by the BATSE detector, with E{sub iso} extending down to {approx}10{sup 49} erg and observed spectral peak energies (E{sub p}) as low as {approx}5 keV. LGRBs with rest-frame duration T{sub 90,z} {approx}< 1 s or observer-frame duration T{sub 90} {approx}< 2 s appear to be rare events ({approx}< 0.1% chance of occurrence). The model predicts a fairly strong and highly significant correlation ({rho} = 0.58 {+-} 0.04) between E{sub iso} and E{sub p,z} of LGRBs. Also predicted are strong correlations of L{sub iso} and E{sub iso} with T{sub 90,z} and a moderate correlation between L{sub iso} and E{sub p,z}. The strength and significance of the correlations found encourage the search for underlying mechanisms, though they undermine their capabilities as probes of dark energy's equation of state at high redshifts. The presented analysis favors, but does not necessitate, a cosmic rate for BATSE LGRBs tracing metallicity evolution consistent with a cutoff Z/Z{sub Sun} {approx} 0.2-0.5, assuming no luminosity-redshift evolution.
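A hedged sketch of the paper's modeling primitive: drawing from a bivariate log-normal by exponentiating correlated Gaussians. This is a 2-D stand-in for the paper's 4-D model; only the 0.58 correlation value is taken from the abstract, and all other parameters are invented:

```python
import math
import random

# Correlated Gaussians via a hand-rolled 2x2 Cholesky factor, exponentiated
# to give a bivariate log-normal. The paper fits a 4-D analogue to
# (L_iso, E_iso, E_p,z, T_90,z); mu1, mu2 here are invented placeholders.
random.seed(42)
rho = 0.58                 # the quoted E_iso / E_p,z correlation
mu1, mu2 = 0.0, 0.0        # invented log-space location parameters

def sample_pair():
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    g1 = z1
    g2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2   # corr(g1, g2) = rho
    return math.exp(mu1 + g1), math.exp(mu2 + g2)

pairs = [sample_pair() for _ in range(20_000)]
logs1 = [math.log(x) for x, _ in pairs]
logs2 = [math.log(y) for _, y in pairs]

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / math.sqrt(va * vb)

# the log-variables are jointly normal, so their sample correlation is near rho
sample_rho = corr(logs1, logs2)
```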
Frederi G. Viens Professor of Statistics and Mathematics
Viens, Frederi G.
(history on page 13) 1992–1996: National Defense Science and Engineering Graduate Fellow, U.C. Irvine; 1996 ... Undergraduate lower division: calculus, Elementary Probability, Elementary Statistics. Undergraduate upper division: Discrete Mathematics, Linear Algebra, Intermediate Probability and Statistics, Real Analysis, Actuarial Models (life ...
Beyth, M.; Broxton, D.; McInteer, C.; Averett, W.R.; Stablein, N.K.
1980-06-01T23:59:59.000Z
Multivariate statistical analysis to support the National Uranium Resource Evaluation and to evaluate strategic and other commercially important mineral resources was carried out on Hydrogeochemical and Stream Sediment Reconnaissance data from the Montrose quadrangle, Colorado. The analysis suggests that: (1) the southern Colorado Mineral Belt is an area favorable for uranium mineral occurrences; (2) carnotite-type occurrences are likely in the nose of the Gunnison Uplift; (3) uranium mineral occurrences may be present along the western and northern margins of the West Elk crater; (4) a base-metal mineralized area is associated with the Uncompahgre Uplift; and (5) uranium and base metals are associated in some areas, and both are often controlled by faults trending west-northwest and north.
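As a hedged illustration of the multivariate reduction step used in studies like this (cf. the PCA of isotopic concentrations in the head matter), a two-variable principal component analysis can be done in closed form from the covariance matrix; the data below are invented:

```python
import math

# Closed-form PCA for two variables: the eigenvalues of the 2x2 covariance
# matrix give the variance captured by each principal component. Real
# applications (e.g. 9-dimensional isotopic data reduced to 3 components)
# use the same idea in higher dimension via a numerical eigensolver.
def cov2(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return sxx, sxy, syy

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.8, 5.0]                  # nearly collinear with xs
sxx, sxy, syy = cov2(xs, ys)
tr, det = sxx + syy, sxx * syy - sxy * sxy
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2   # eigenvalues, lam1 >= lam2
explained = lam1 / (lam1 + lam2)                # share of variance on PC1
# nearly collinear data: the first component explains almost all the variance
```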
anti-parasite cytokine responses: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
and Fasciola hepatica-rat were statistically coanalyzed. 1H NMR spectroscopy and multivariate statistical analysis were used to characterize the urine and plasma...
affect plasma cytokine: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
and Fasciola hepatica-rat were statistically coanalyzed. 1H NMR spectroscopy and multivariate statistical analysis were used to characterize the urine and plasma...
International petroleum statistics report
NONE
1995-10-01T23:59:59.000Z
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
STATISTICS ESSENTIAL SKILL OUTCOME STATEMENTS
Gering, Jon C.
SB #3205 STATISTICS ESSENTIAL SKILL OUTCOME STATEMENTS Newly-Revised Version: A liberally educated person is capable of being both a producer and a consumer of statistical information with some basic level of competency. He or she should be able to perform basic statistical analyses (producer
Statistical Ensembles with Volume Fluctuations
Mark I. Gorenstein
2008-06-17T23:59:59.000Z
The volume fluctuations in statistical mechanics are discussed. First, the volume fluctuations in ensembles with a fixed external pressure, the so-called pressure ensembles, are considered. Second, a generalization of the pressure ensembles is suggested. Namely, statistical ensembles with the volume fluctuating according to externally given distributions are considered. Several examples and possible applications in statistical models of hadron production are discussed.
Statistical Explanation WESLEY C. SALMON
Fitelson, Branden
Statistical Explanation, WESLEY C. SALMON, Indiana University. EVER SINCE HIS CLASSIC PAPER with Paul... This paper grew out of a discussion of statistical explanation presented at the meeting of the American... "Probabilities in Statistical Explanation," along with Henry E. Kyburg's comments and my rejoinder, were
STATISTICS OF PRECIPITATION EXTREMES: QUANTIFYING CONFIDENCE IN TRENDS
Katz, Richard
STATISTICS OF PRECIPITATION EXTREMES: QUANTIFYING CONFIDENCE IN TRENDS. Rick Katz, Institute... "...the validity of this analysis." -- Emil Gumbel. Outline: (1) Introduction (2) Extreme Value Analysis under Stationarity: Classical Approach (3) Extreme Value Analysis under Stationarity: Modern Approach (4) Extreme
International petroleum statistics report
NONE
1997-05-01T23:59:59.000Z
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
A MULTIVARIATE SKEW-GARCH Giovanni De Luca, Marc G. Genton and
Genton, Marc G.
A MULTIVARIATE SKEW-GARCH MODEL, Giovanni De Luca, Marc G. Genton and Nicola Loperfido. ABSTRACT: ...conditionally on the sign of the one-day lagged US return is skew-normal. The resulting model is denoted Skew-GARCH... and the Exponential GARCH model (Nelson, 1991), allowing for the inclusion of the asymmetric effect of volatility
Forecasting the Hourly Ontario Energy Price by Multivariate Adaptive Regression Splines
CaÃ±izares, Claudio A.
Forecasting the Hourly Ontario Energy Price by Multivariate Adaptive Regression Splines, H. ... In this paper, the MARS technique is applied to forecast the hourly Ontario energy price (HOEP). The MARS models values of the latest pre-dispatch price and demand information, made available by the Ontario
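MARS builds piecewise-linear models from hinge basis functions max(0, x - t) and max(0, t - x). A hand-rolled two-hinge fit on synthetic data (not the HOEP series, and far simpler than a real MARS forward/backward pass) illustrates the building block:

```python
import numpy as np

def hinge(x, t):
    # MARS basis function: max(0, x - t); hinge(t, x) gives the mirrored pair.
    return np.maximum(0.0, x - t)

# Synthetic piecewise-linear "price" curve with a kink at x = 4, plus noise.
rng = np.random.default_rng(7)
x = rng.uniform(0, 10, 300)
y = np.where(x < 4, 2.0 * x, 8.0 + 0.5 * (x - 4)) + rng.normal(0, 0.1, 300)

# Design matrix: intercept plus the two hinges at a fixed knot t = 4.
B = np.column_stack([np.ones_like(x), hinge(x, 4.0), hinge(4.0, x)])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
rmse = float(np.sqrt(np.mean((B @ coef - y) ** 2)))
print(round(rmse, 2))
```

Real MARS additionally searches over knots and variables (forward pass) and prunes terms (backward pass); this sketch fixes the knot to show how hinges capture a regime change.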
Al-Duwaish, Hussain N.
MODELLING OF A NONLINEAR MULTIVARIABLE BOILER PLANT USING HAMMERSTEIN MODEL, A NONPARAMETRIC... mathematically and practically tractable. Boilers are industrial units which are used for generating steam... of fuel. Boiler operation is a complex operation in which hot water must be delivered to a turbine
Design of angle-tolerant multivariate optical elements for chemical imaging
Myrick, Michael Lenn
Design of angle-tolerant multivariate optical elements for chemical imaging, Olusola O. Soyemi... in imaging applications. We report a method for the design of angle-insensitive MOEs based on modification... of Bismarck Brown and Crystal Violet, was designed and its performance simulated. For angles of incidence
Unsupervised Fuzzy Clustering of Multi-variate Image Data, Klaus Baggesen Hilger and Allan Aasbjerg Nielsen, IMM, Department of Mathematical Modelling, Technical University of Denmark, 2800 Lyngby, Denmark
Blei, Ron
, and Fabiana Cardetti. Mathematics Education REU Project, University of Connecticut, Summer 2012. BACKGROUND... and gains. Proceedings of the Thirty-First Annual Meeting of the North American Chapter of the International... supplementary application-based tutorials in the multivariable calculus course. International Journal
Approximation using scattered shifts of a multivariate function, Ronald DeVore and Amos Ron
Liblit, Ben
, of a fixed function φ, occurs in many applications such as data fitting, neural networks, and learning theory. We refer to the book [32] for more details on radial basis functions in general, and their use. Approximation using scattered shifts of a multivariate function, Ronald DeVore and Amos Ron
Approximation using scattered shifts of a multivariate function, Ronald DeVore and Amos Ron
Liblit, Ben
Approximation using scattered shifts of a multivariate function, Ronald DeVore and Amos Ron, February 17, 2008. Abstract: The approximation of a general d-variate function f by the shifts φ(· − ξ), ξ ∈ Ξ ⊂ R^d, of a fixed function φ occurs in many applications such as data fitting, neural networks
Multivariate Non-stationary Stochastic Streamflow Models for Two Urban , R. Vogel2
Vogel, Richard M.
as changes in the water use cycle. Urbanization leads to construction of water distribution systems and storm... Multivariate Non-stationary Stochastic Streamflow Models for Two Urban Watersheds, M. Ng, R. Vogel: urbanization and changes in climate and water use are examples of such influences. The evolution
A Multivariate Randomization Test of Association Applied to Cognitive Test Results
A Multivariate Randomization Test of Association Applied to Cognitive Test Results Albert Ahumada and Bettina Beard NASA Ames Research Center Abstract Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem
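The distribution-free randomization idea described above fits in a few lines; the paired "cognitive scores" below are synthetic, not the NASA results, and the statistic (absolute correlation) is one simple choice among many:

```python
import numpy as np

# Permutation (randomization) test of association between two paired vectors.
rng = np.random.default_rng(2)
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(size=100)         # built-in association

observed = abs(np.corrcoef(x, y)[0, 1])
n_perm = 2000
exceed = 0
for _ in range(n_perm):
    # Permuting y breaks the pairing, simulating the null of no association.
    if abs(np.corrcoef(x, rng.permutation(y))[0, 1]) >= observed:
        exceed += 1
p_value = (exceed + 1) / (n_perm + 1)      # add-one correction keeps p > 0
print(round(p_value, 4))
```

No distributional assumption enters: the reference distribution is generated by the permutations themselves, which is exactly what makes the method "conceptually simple, distribution-free."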
Detecting Climate Change in Multivariate Time Series Data by Novel Clustering and Cluster Tracing. RWTH Aachen University, Germany, {kremer, guennemann, seidl}@cs.rwth-aachen.de. Abstract: Climate change can... series, and trace the clusters over time. A climate pattern is categorized as a changing pattern
Human-Automation Collaboration in Complex Multivariate Resource Allocation Decision Support Systems
Cummings, Mary "Missy"
1 Human-Automation Collaboration in Complex Multivariate Resource Allocation Decision Support Systems M.L. Cummings* Sylvain Bruni Humans and Automation Laboratory 77 Massachusetts Ave, 33 uncertainty, typical of supervisory control environments, it is critical that some balance of human-automation
Stoffelen, Ad
forecast errors of the European Centre for Medium-Range Weather Forecasts (ECMWF) model. Tropical mass-wind... Impact Assessment of Simulated Doppler Wind Lidars with a Multivariate Variational Assimilation, De Bilt, Netherlands. CHRISTOPHE ACCADIA AND PETER SCHLÜSSEL, European Organisation
Shneiderman, Ben
on health care. The U.S. Department of Health and Human Services keeps track of a variety of health care... that enables users to visualize health care data in multivariate space as well as geospatially. It is designed as a comprehensible and powerful interface for policy makers to visualize health care quality, public health
Kimball, Sarah
making, as well as for taking prompt and effective actions to avoid/reduce the effects of droughts... to characterize agricultural and hydrological droughts (Hayes et al. 2011). The standardization concept... A Nonparametric Multivariate Multi-Index Drought Monitoring Framework, ZENGCHAO HAO AND AMIR
Note on Design Criteria for Rainbow-Type Multivariates Jintai Ding1
This was a short note that deals with the design of Rainbow or "stagewise unbalanced oil-and-vinegar" multivariate... parameters in current schemes. These can be ameliorated according to an updated list of security design, 2006: Second Draft, TWISC (Taiwan Information Security Center) tech report, September 5, 2006
Supersonic combustion studies using a multivariate quadrature based method for combustion modeling
Raman, Venkat
Supersonic combustion studies using a multivariate quadrature based method for combustion modeling function (PDF) of thermochemical variables can be used for accurately computing the combustion source term of predictive models for supersonic combustion is a critical step in design and development of scramjet engines
Ilchmann, Achim; Pahl, M. : Adaptive Multivariable pH Regulation of a Biogas Tower Reactor
Knobloch,JÃ¼rgen
Ilchmann, Achim; Pahl, M.: Adaptive Multivariable pH Regulation of a Biogas Tower Reactor. Zuerst... The adaptive controller was successfully tested over a period of two months at a biogas tower reactor in pilot... are not applicable to the biogas tower reactor, since a dominating feature of the new reactor principle is its
Mennis, Jeremy
explain the variation in hazardous facility location? Many environmental equity studies employ some form hazardous facility location (e.g., Ringquist 1997; Hird and Reese 1998; Sadd et al. 1999a). Multivariate sta mapping, can reveal this spatial nonstationarity and shed light on its form. We use GWR, in combination
Franklin, W. Randolph
5D-ODETLAP: A NOVEL FIVE-DIMENSIONAL COMPRESSION METHOD ON TIME-VARYING MULTIVARIABLE GEOSPATIAL dimensional (5D) geospatial dataset consists of several multivariable 4D datasets, which are sequences of time technique for 5D geospatial data as a whole, instead of applying 3D compression method on each 3D slice
Ferreira, MÃ¡rcia M. C.
Multivariate accelerated shelf-life testing: a novel approach for determining the shelf-lives... accelerated studies have to be conducted and a third parameter has to be estimated: the acceleration factor... approach for determining the shelf-life of industrialised food products, the Multivariate Accelerated Shelf
Johansson, Karl Henrik
456 IEEE TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, VOL. 8, NO. 3, MAY 2000. The Quadruple-Tank Process. Abstract: A novel multivariable laboratory process that consists of four interconnected water tanks... half-plane. In this way the quadruple-tank process is ideal for illustrating many concepts in multivariable control
MATH 482A/598A: Statistics Practicum Spring 2013 Applied Mathematics and Statistics, CSM Syllabus
will apply statistical principles to data analysis through advanced work, leading to a written report... must implement the method (or a modified version thereof) on real wind data. Secondly, all teams... lectures will be given over assigned readings in basic time series, forecasting, and general wind energy
Statistical Moments - Spring/Summer 2007
UCLA Department of Statistics
2007-01-01T23:59:59.000Z
Repeated Measures. Statistics in Medicine, 26, 1552-... with a minor in statistics by Summer 2007. These students... UCLA DEPARTMENT OF STATISTICS NEWSLETTER... status, more
Statistical Moments - Spring/Summer 2008
UCLA Department of Statistics
2008-01-01T23:59:59.000Z
nonignorable missing values. Statistics in Medicine, 13... species. Annals of Applied Statistics, 1: 36-65. 22. Dinov... Large Run Sizes. UCLA Statistics Electronic Publications,
Statistics Online Computational Resource for Education
Dinov, Ivo D; Christou, Nicolas
2009-01-01T23:59:59.000Z
and assessment of the Statistics Online Computational Resource... introductory probability and statistics courses. Computers &... Leslie, M. (2003). Statistics starter kit. Science, © 2009
Journal of Multivariate Analysis 92 (2005) 186204 Covariate selection for semiparametric hazard
McKeague, Ian
partial likelihood--henceforth PPL [16], a backwards elimination covariate selection method [9], Bayesian model averaging [14,15], Bayesian variable selection [8], the lasso method for PPL [17], and nonconcave PPL [7]. Large sample
Multivariable analysis of spectral measurements for the characterization of semiconductor processes
White, David A. (David Allan), 1966-
2001-01-01T23:59:59.000Z
The availability of affordable and reliable optical sensor technology and the abundance of data that these sensors now provide have created new opportunities to better characterize and control semiconductor processes in ...
Applied Multivariate Analysis, Notes originally for the course of Lent 2004,
Altham, Pat
-definite matrix, with eigen-values λ1, . . . , λp say (which are then > 0, since V is positive-definite). Let u1, . . . , up be the corresponding eigen-vectors of V, thus V ui = λi ui for 1 ≤ i ≤ p, and ui^T uj = 0 for i ≠ j. Also u^T V u ≥ 0 for any vector u, i.e. the matrix V is positive semi-definite.
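The spectral facts quoted in these notes are easy to check numerically; a small sketch with a random covariance-style matrix V = A Aᵀ:

```python
import numpy as np

# Numerical check of the notes' claims: V = A A^T is positive semi-definite,
# its eigenvalues are >= 0, and its eigenvectors satisfy V u_i = lambda_i u_i
# with u_i^T u_j = 0 for i != j.
rng = np.random.default_rng(3)
A = rng.normal(size=(5, 5))
V = A @ A.T                                # symmetric positive semi-definite

evals, U = np.linalg.eigh(V)               # eigh is the right tool: V symmetric
assert np.all(evals >= -1e-10)             # non-negative spectrum
for i in range(5):
    assert np.allclose(V @ U[:, i], evals[i] * U[:, i])   # V u_i = lambda_i u_i
assert np.allclose(U.T @ U, np.eye(5))     # orthonormal eigenvectors
print("spectral properties verified")
```

With probability one a random A here is invertible, so V is in fact positive-definite and all eigenvalues come out strictly positive, matching the notes' parenthetical remark.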
Applied Multivariate Analysis, Notes for course of Lent 2004, MPhil in
Lee, Stephen
) and V is a positive-definite matrix, with eigen-values λ1, . . ., λp say (which are then > 0, since V is positive-definite). u^T V u ≥ 0 for any vector u, i.e. the matrix V is positive semi-definite. Let u1, . . . , up be the corresponding eigen-vectors of V, thus V ui = λi ui, for 1 ≤ i ≤ p
Hierarchical Linear Discriminant Analysis for Beamforming
Park, Haesun
model of h-LDA by relating it to the two-way multivariate analysis of variance (MANOVA), which fits well... dimension reduction, hierarchical linear discriminant analysis (h-LDA), to a well-known spatial localization... Hierarchical Linear Discriminant Analysis for Beamforming, Jaegul Choo, Barry L. Drake
International petroleum statistics report
NONE
1995-11-01T23:59:59.000Z
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.
International petroleum statistics report
NONE
1996-05-01T23:59:59.000Z
The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
International petroleum statistics report
NONE
1997-07-01T23:59:59.000Z
The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.
International petroleum statistics report
NONE
1996-10-01T23:59:59.000Z
The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Statistics as a dynamical attractor
Michail Zak
2012-08-30T23:59:59.000Z
It is demonstrated that any statistics can be represented by an attractor of the solution to a corresponding system of ODEs coupled with its Liouville equation. Such a non-Newtonian representation allows one to reduce the foundations of statistics to the better-established foundations of ODEs. In addition, evolution toward the attractor reveals possible micro-mechanisms driving random events to the final distribution of the corresponding statistical law. Special attention is concentrated upon the power law and its dynamical interpretation: it is demonstrated that the underlying dynamics supports the "violent reputation" of power-law statistics.
Algebraic Statistics methods for DOE, Giovanni Pistone
Ceragioli, Francesca
Algebraic Statistics methods for DOE Giovanni Pistone Department of Mathematics Politecnico di Torino, Italy giovanni.pistone@polito.it Eva Riccomagno Department of Mathematics Politecnico di Torino
Small Business Goals and Statistics
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
The Idaho National Laboratory (INL) is committed to supporting the small business objectives of the U.S. Government and the Department of Energy (DOE) and recognizes...
Computer, Computational, and Statistical Sciences Division
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Computer, Computational, and Statistical Sciences (CCS) Division: computational physics, computer science, applied mathematics, statistics, and the integration of...
Li, Yang; Qin, Le; Zou, Shipeng; Long, Shijun [School of Information Engineering, Guangdong University of Technology, Guangzhou, 510006 (China)
2014-04-11T23:59:59.000Z
Many problems occur frequently when controlling the temperature of an enamelling machine oven in a real industrial process, such as multi-variable coupling. In this study, an experimental rig with triple inputs and triple outputs was devised, and a simulation model was established accordingly. A temperature control system based on a feedforward compensation algorithm was proposed. Experimental results have shown that the system offers high efficiency, good stability, and promising applicability.
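The feedforward compensation idea can be sketched on a toy single loop; the first-order dynamics, gains, and disturbance below are illustrative, not the triple-input oven model from the paper:

```python
import numpy as np

# Feedforward compensation sketch: a measured disturbance d is added to the
# control signal so the PI feedback loop only handles residual error.
def simulate(use_feedforward, steps=1000, dt=0.1):
    T, setpoint = 0.0, 100.0               # oven temperature and its target
    kp, ki, integ = 2.0, 0.5, 0.0          # PI gains and integrator state
    errs = []
    for k in range(steps):
        d = 10.0 * np.sin(0.05 * k)        # known, measurable disturbance
        err = setpoint - T
        integ += err * dt
        u = kp * err + ki * integ          # PI feedback term
        if use_feedforward:
            u += d                         # cancel d before it hits the plant
        T += dt * (-0.1 * T + 0.1 * (u - d))   # toy first-order oven dynamics
        errs.append(abs(err))
    return float(np.mean(errs[-300:]))     # steady-state tracking error

with_ff = simulate(True)
without_ff = simulate(False)
print(round(with_ff, 3), round(without_ff, 3))
```

With compensation the disturbance cancels exactly in this toy plant, so the residual steady-state error is essentially the decayed transient; without it, the PI loop is left chasing the sinusoid.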
Building statistical models by visualization
Minka,Tom
books: "The Elements of Graphing Data", William Cleveland, 2nd Ed.; "Visualizing Data", William Cleveland. Building statistical models by visualization, Tom Minka, CMU Statistics Dept. Outline: scatterplot for unpaired data; quantile of x = fraction of points
December 2000 A STATISTICAL TEST
December 2000. A STATISTICAL TEST SUITE FOR RANDOM AND PSEUDORANDOM NUMBER GENERATORS... challenges in authentication protocols. NIST Special Publication (SP) 800-22, A Statistical Test Suite... testing of random number and pseudorandom number generators (RNGs and PRNGs) that may be used for many
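The simplest test in SP 800-22, the "monobit" frequency test, fits in a few lines; this is a sketch of that single test, not the full suite:

```python
import math
import random

# SP 800-22 monobit frequency test: the +/-1 partial sum of the bit sequence,
# normalized by sqrt(n), is referred to a half-normal reference via erfc.
def monobit_p_value(bits):
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2.0))

random.seed(4)
bits = [random.getrandbits(1) for _ in range(100_000)]
p_random = monobit_p_value(bits)           # a decent PRNG should pass
p_biased = monobit_p_value([1] * 100_000)  # a constant stream fails decisively
print(round(p_biased, 6))
```

SP 800-22 uses 0.01 as its default significance level: a sequence fails the monobit test when the p-value falls below it, as the all-ones stream does by an astronomical margin.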
Key China Energy Statistics 2012
Levine, Mark
2013-01-01T23:59:59.000Z
Total Crude Oil Imports: 239 Mt; World's Oil Consumption... Appendix 3: Energy Balance/China 2010 (cont'd), Mtce: crude oil consumption, urban/other statistical difference... Appendix 3: Energy Balance/China 2010 (cont'd), Physical Quantity: Crude Oil
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
, science, and engineering students. Other than the basic probability theory, my goal was to include... Probability, Statistics, and Stochastic Processes, Peter Olofsson, A Wiley-Interscience Publication. ...had been teaching a course on calculus-based probability and statistics mainly for mathematics
Practical Statistics for the LHC
Cranmer, Kyle
2015-01-01T23:59:59.000Z
This document is a pedagogical introduction to statistics for particle physics. Emphasis is placed on the terminology, concepts, and methods being used at the Large Hadron Collider. The document addresses both the statistical tests applied to a model of the data and the modeling itself.
Statistics in Practice Forensic Science
Lucy, David
Statistics in Practice: Forensic Science. Dr. David Lucy, d.lucy@lancaster.ac.uk, Lancaster University. Criminal evidence is becoming increasingly "scientific"... Greater realisation that uncertainty is important has led to
Damage detection in mechanical structures using extreme value statistics.
Worden, K.; Allen, D. W. (David W.); Sohn, H. (Hoon); Farrar, C. R. (Charles R.)
2002-01-01T23:59:59.000Z
The first and most important objective of any damage identification algorithm is to ascertain with confidence whether damage is present or not. Many methods have been proposed for damage detection based on ideas of novelty detection founded in pattern recognition and multivariate statistics. The philosophy of novelty detection is simple. Features are first extracted from a baseline system to be monitored, and subsequent data are then compared to see if the new features are outliers that significantly depart from the rest of the population. In damage diagnosis problems, the assumption is that outliers are generated from a damaged condition of the monitored system. This damage classification necessitates the establishment of a decision boundary. Choosing this threshold value is often based on the assumption that the parent distribution of the data is Gaussian in nature. While the problem of novelty detection focuses attention on the outliers or extreme values of the data, i.e., those points in the tails of the distribution, threshold selection under the normality assumption weighs the central population of the data. Therefore, this normality assumption might impose potentially misleading behavior on damage classification and is likely to lead the damage diagnosis astray. In this paper, extreme value statistics is integrated with novelty detection to specifically model the tails of the distribution of interest. Finally, the proposed technique is demonstrated on simulated numerical data and time series data measured from an eight degree-of-freedom spring-mass system.
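The contrast the abstract draws, Gaussian-based thresholds versus tail modeling with extreme value statistics, can be sketched with a moment-based Gumbel fit to block maxima. The baseline feature below is synthetic, not the authors' spring-mass measurements, and a real analysis would fit the EV parameters more carefully:

```python
import numpy as np

# Baseline ("undamaged") feature samples; the tail is what matters for
# outlier thresholds, so we model block maxima with a Gumbel distribution.
rng = np.random.default_rng(5)
baseline = rng.standard_normal(100_000)

maxima = baseline.reshape(1000, 100).max(axis=1)   # block maxima, blocks of 100
beta = np.sqrt(6.0 * maxima.var()) / np.pi         # Gumbel scale, by moments
mu = maxima.mean() - 0.5772 * beta                 # location (Euler-Mascheroni)

gumbel_99 = mu - beta * np.log(-np.log(0.99))      # 99% quantile of the maxima
gauss_3sigma = baseline.mean() + 3.0 * baseline.std()
print(gumbel_99 > gauss_3sigma)
```

The EV-based threshold sits noticeably further out than the naive 3-sigma rule, which is exactly the abstract's point: a central-population Gaussian fit understates how far legitimate extremes can reach.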
Curvature independence of statistical entropy
Judy Kupferman
2014-07-20T23:59:59.000Z
We examine the statistical number of states, from which statistical entropy can be derived, and we show that it is an explicit function of the metric and thus observer dependent. We find a constraint on a transformation of the metric that preserves the number of states but does not preserve curvature. In showing exactly how curvature independence arises in the conventional definition of statistical entropy, we gain a precise understanding of the direction in which it needs to be redefined in the treatment of black hole entropy.
Statistical Mechanics of Amplifying Apparatus
Joseph Johnson
2005-02-08T23:59:59.000Z
We implement Feynman's suggestion that the only missing notion needed for the puzzle of Quantum Measurement is the statistical mechanics of amplifying apparatus. We define a thermodynamic limit of quantum amplifiers which is a classically describable system in the sense of Bohr, and define macroscopic pointer variables for the limit system. Then we derive the probabilities of Quantum Measurement from the deterministic Schroedinger equation by the usual techniques of Classical Statistical Mechanics.
ASTR 511/O'Connell Lec 6 1 STATISTICS OF OBSERVATIONS &
Peletier, Reynier
Reduction & Error Analysis for the Physical Sciences"; LLM: Appendix B. Warning: the introductory literature on statistics of measurement is remarkably uneven, and nomenclature is not consistent. Is error analysis... ASTR 511/O'Connell Lec 6: STATISTICS OF OBSERVATIONS & SAMPLING THEORY. References: Bevington, "Data
Statistics in Linguistics Tutorial Just a sip. . .
Fong, Sandiway
Statistics in Linguistics Tutorial: Just a sip... Mike Hammond, Linguistics, U. of Arizona. Overview: Are our data categorical? Typological claims...
Functional Data Analysis With Multi Layer Perceptrons
Fleuret, François
Analysis. We introduce a computation model for functional input data and we show that this model is a well... Functional Data Analysis With Multi Layer Perceptrons, Fabrice Rossi, Brieuc Conan-Guez and François Fleuret. ...as classical multivariate data, because they are in general described by a finite set of input
The fixed structurally robust internal model principle for linear multivariable regulators
McGrath, John Thomas
2012-06-07T23:59:59.000Z
)] = 0, where all the poles must be within C^-. The property described by (6) is defined as loop stability. Note that loop stability does not guarantee internal stability, since unstable cancellation may occur in the cascade of T(s, ·)F(s). W(s... for the degree of MASTER OF SCIENCE, May 1980. Major Subject: Electrical Engineering. THE FIXED STRUCTURALLY ROBUST INTERNAL MODEL PRINCIPLE FOR LINEAR MULTIVARIABLE REGULATORS. A Thesis by JOHN THOMAS MCGRATH. Approved as to style and content by...
Statistically Equivalent Representative Volume Elements for Unidirectional
Ghosh, Somnath
Statistically Equivalent Representative Volume Elements for Unidirectional Composite... the statistically equivalent representative volume element (SERVE)... proposed for fiber-reinforced microstructures using a bilinear cohesive zone law. As introduced in the first article, a combination of statistical
Statistical Moments - Winter/Spring 2006
UCLA Department of Statistics
2006-01-01T23:59:59.000Z
in Los Angeles County, UCLA Statistics Preprint No. 441... Point Processes, UCLA Statistics Preprint No. 377; Katherine... Point Patterns, UCLA Statistics Preprint No. 406; Kim, T. H,
Zhang, Li-Xin
§1.0 What is Statistics? §1.1 What is Mathematical Statistics? Random Data; Basic Ideas in Statistics; §1.2 Fundamental Con... Mathematical Statistics, Zhang, Lixin and Dai, Jialing. Course Website: www.math.zju.edu.cn/zlx/teaching.htm
Gary Christopher Vezzoli
2001-04-04T23:59:59.000Z
This work presents physical consequences of our theory of induced gravity (Ref.1) regarding: 1) the requirement to consider shape and materials properties when calculating graviton cross section collision area; 2) use of Special Relativity; 3) implications regarding the shape of cosmos; 4) comparison to explanations using General Relativity; 5) properties of black holes; 6) relationship to the strong force and the theorized Higgs boson; 7) the possible origin of magnetic attraction; 8) new measurements showing variation from gravitational inverse square behavior at length scales of 0.1 mm and relationship to the Cosmological constant, and proof of the statistical time properties of the gravitational interaction.
Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik; Sun, Xin; Lin, Guang
2014-05-16T23:59:59.000Z
Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. The computational cost of the simulations at high resolution are often expensive and become impractical for parametric studies at different input values. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and the prior distributions facilitates the different Markov chain Monte Carlo (MCMC) movements. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and BTMGP to model the multiphase flow in a full scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
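The Gaussian-process machinery underlying the BTMGP above can be illustrated at its simplest: noise-free GP regression with a squared-exponential covariance. This is a minimal sketch, not the paper's treed multivariate model; the stand-in simulator, length scale, and jitter value are illustrative assumptions.

```python
import numpy as np

def sq_exp_kernel(a, b, length=0.5, var=1.0):
    """Squared-exponential covariance matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Cheap stand-in "simulator" (illustrative assumption, not the paper's code).
def simulator(x):
    return np.sin(3 * x)

x_train = np.linspace(0.0, 1.0, 8)
y_train = simulator(x_train)
x_test = np.array([0.25, 0.75])

jitter = 1e-8  # numerical stabilizer for the noise-free Gram matrix
K = sq_exp_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
K_s = sq_exp_kernel(x_test, x_train)

# GP posterior mean at the test inputs: K_s K^{-1} y.
mean = K_s @ np.linalg.solve(K, y_train)
```

In the treed extension, separate GPs of this form are fitted within regions found by recursive partitioning of the input space, which is what makes the method tractable for distinct output regimes.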
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai, Alex; Banta, Larry; Tucker, D.A.; Gemmen, R.S.
2008-06-01T23:59:59.000Z
This paper presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The facility provides for the testing and simulation of different fuel cell models that in turn help identify the key issues encountered in the transient operation of such systems. An empirical model of the facility consisting of a simulated fuel cell cathode volume and balance of plant components is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in Transfer Function format. A multivariate system is then built from individual transfer functions, creating a matrix that serves as the nominal plant in an H-Infinity robust control algorithm. The controller’s main objective is to track and maintain hybrid operational constraints in the fuel cell’s cathode airflow, and the turbo machinery states of temperature and speed, under transient disturbances. This algorithm is then tested on a Simulink/MatLab platform for various perturbations of load and fuel cell heat effluence.
DATA MONITORING AND ANALYSIS PROGRAM MANUAL
Gravois, Melanie
2007-01-01T23:59:59.000Z
charts and analysis guidelines can be found in common textbooks and guidebooks on Statistical Process Control (SPC).
Journal of Statistical Planning and Inference
Lin, Danyu
-dependent covariates, D.Y. Lin and Zhiliang Ying, Journal of Statistical Planning and Inference 44 (1995) 47-63, where T0 refers to the condition ... attention (e.g., Ritov, 1990; Tsiatis, 1990; Wei et al., 1990; Lai and Ying, 1991; Ying, 1993). To provide
STATISTICS IN MEDICINE, Statist. Med. 2009; 28:3454-3466
McLachlan, Geoff
of Western Australia, Perth, Australia; Centre for Statistics, University of Queensland, Brisbane, Australia; Department of Management, Curtin University of Technology, GPO Box U 1987, Perth, WA 6845, Australia; ... Research Institute, Curtin University of Technology, GPO Box U 1987, Perth, WA 6845, Australia.
Some statistics of Galactic SNRs
D. A. Green
2005-05-20T23:59:59.000Z
The selection effects applicable to the identification of Galactic supernova remnants (SNRs) at radio wavelengths are discussed. Low surface brightness remnants are missing, as are those with small angular sizes (including young but distant SNRs). Several statistical properties of Galactic SNRs are discussed, including the surface-brightness/diameter (Sigma-D) relation. The wide range of intrinsic properties of Galactic remnants with known distances, and the observational selection effects, means that the Sigma-D relation is of limited use to derive diameters and hence distances for individual SNRs, or for statistical studies.
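The Sigma-D relation mentioned above is a power law, Sigma proportional to D^(-beta), so fitting it amounts to a straight-line fit in log-log space. A minimal sketch (the amplitude, slope, and "remnant" values are synthetic, not taken from any catalogue):

```python
import numpy as np

# Synthetic remnants drawn exactly from an assumed power law
# Sigma = A * D**(-beta); A and beta are illustrative, not catalogue values.
A_true, beta_true = 1e-15, 3.0
D = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # diameters (pc)
Sigma = A_true * D ** (-beta_true)            # surface brightnesses

# Least-squares line in log-log space: log10(Sigma) = log10(A) - beta*log10(D).
slope, intercept = np.polyfit(np.log10(D), np.log10(Sigma), 1)
beta_fit = -slope
```

On real catalogues, the wide intrinsic scatter noted in the abstract means that inverting such a fit gives poor diameter (and hence distance) estimates for individual remnants, which is exactly the paper's caution.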
On Statistical Aspects of Qjets
Stephen D. Ellis; Andrew Hornig; David Krohn; Tuhin S. Roy
2014-12-05T23:59:59.000Z
The process by which jet algorithms construct jets and subjets is inherently ambiguous and equally well motivated algorithms often return very different answers. The Qjets procedure was introduced by the authors to account for this ambiguity by considering many reconstructions of a jet at once, allowing one to assign a weight to each interpretation of the jet. Employing these weighted interpretations leads to an improvement in the statistical stability of many measurements. Here we explore in detail the statistical properties of these sets of weighted measurements and demonstrate how they can be used to improve the reach of jet-based studies.
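Once each jet interpretation carries a weight, observables become weighted averages over the ensemble of interpretations. A minimal sketch of the weighted mean and variance of one observable (the masses and weights are invented for illustration; this is not the Qjets weighting scheme itself):

```python
def weighted_mean_var(values, weights):
    """Weighted mean and variance of one observable over jet interpretations."""
    total = sum(weights)
    mean = sum(w * v for v, w in zip(values, weights)) / total
    var = sum(w * (v - mean) ** 2 for v, w in zip(values, weights)) / total
    return mean, var

# Hypothetical jet-mass interpretations (GeV) and their weights.
masses = [80.0, 82.0, 85.0, 79.0]
weights = [0.4, 0.3, 0.2, 0.1]
m, v = weighted_mean_var(masses, weights)
```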
Statistics Colloquium Dr. Noel Cadigan
Oyet, Alwell
:00 a.m., HH-3026. Statistical problems to address for some NL fish stocks when deriving Maximum Sustainable Yield reference points ... prescribed actions should occur when stock size or fishing mortality rates (F) exceed the reference points. This is the biomass that should result in the long term when fishing at Fmsy, the harvest rate that maximizes long-term yield.
Purdue Agriculture Annual Statistical Report
Purdue Agriculture Research Works: Annual Statistical Report 2005-2006, Purdue Agriculture. Read the full report on the Web: www.ag.purdue.edu/arp/stat_report_05-06. Purdue Agriculture Research Works. Here's why: We are riding the wave of revolutionary changes brought about ...
STATISTICS AND PHILOSOPHY OF PROBABILITY
Burdzy, Krzysztof "Chris"
STATISTICS AND PHILOSOPHY OF PROBABILITY -- SIX DEGREES OF SEPARATION. Krzysztof Burdzy, University of Washington. Philosophy of probability. The search for certainty: On the clash of science and philosophy of probability. Preface, Table of Contents and Introduction.
Non-statistical Weak Measurements
Jeff Tollaksen; Yakir Aharonov
2006-07-28T23:59:59.000Z
Non-statistical weak measurements yield weak values that are outside the range of eigenvalues and are not rare, suggesting that weak values are a property of every pre-and-post-selected ensemble. They also extend the applicability and valid regime of weak values.
Statistical Assistant, Research Data Centre, Queen's University
Graham, Nick
JOB OFFER: Statistical Assistant, Research Data Centre, Queen's University. Position: Statistical Assistant at the RDC, Queen's University. Classification: Statistics Canada term part-time CR-04. Salary: $44,... (etc.) - Provide basic assistance on the use of the computer network and statistical software
Statistical Seismology, David Vere-Jones
Ben-Zion, Yehuda
Statistical Seismology. DAVID VERE-JONES, YEHUDA BEN-ZION, and RAMÓN ZÚÑIGA. Introduction ... the last two decades. The subject of statistical seismology aims to bridge the gap between physics-based models without statistics, and statistics-based models without physics. This volume, which is based ...
Experimental particle physics in Finland: Statistics and Overview
Eerola, Paula
Experimental particle physics in Finland: Statistics and Overview. P. Eerola, University of Helsinki, Finland. RECFA meeting, Helsinki, Finland, September 5, 1997. History: high-energy physics research ... analysis of bubble chamber experiment data; 1968: CERN ... by the Particle Physics Committee
Statistical Mechanics of Money, Income, Debt, and Energy Consumption
Hill, Wendell T.
Statistical Mechanics of Money, Income, Debt, and Energy Consumption. Physics Colloquium. Presented ... in financial markets. Globally, data analysis of energy consumption per capita around the world shows ... @american.edu ... Similarly to the probability distribution of energy in physics, the probability distribution of money among ...
Statistical inference for density dependent Markovian forestry models
Paris-Sud XI, UniversitÃ© de
Statistical inference for density dependent Markovian forestry models. Abstract: A stochastic forestry model with a density-dependence structure is studied. The population evolves in discrete time ... roughly speaking, becomes large. From the perspective of the analysis of forestry data and predict...
Application for SAS Certificate Applied Statistics and SAS Programming
Dahl, David B.
Name ________  BYU ID ________  Year/Term Taken ________  Grade ________
Stat 124 (1.0): SAS Certification 1
Stat 125 (1.0): SAS Certification 2
Stat 224 (2.0): Statistical Computing 1
Stat 230 (3.0): Analysis of Variance
Stat 330 (3.0): Introduction to Regression
Stat 424 (3...
Application for SAS Certificate Applied Statistics and SAS Programming
Dahl, David B.
Name ________  BYU ID ________  Year/Term Taken ________  Grade ________
Stat 124 (1.5): SAS Base Programming Skills
Stat 224 (1.5): Applied SAS Programming
Stat 230 (3.0): Analysis of Variance
Stat 330 (3.0): Introduction to Regression
Stat 424 (3.0): Statistical...
Scalable Statistical Monitoring of Fleet, Dimitry Gorinevsky
LLC, Palo Alto, CA; e-mail: dimitry@mitekan.com. Abstract: This paper considers the problem of ... monitoring of data from a fleet (population) of similar units. A fleet-wide extension of the multivariable ... historical cruise flight data. 1. INTRODUCTION. 1.1 Population monitoring problems. This paper considers ...
Introduction to statistical models and non-extensive statistics
T. S. Biro
2005-07-27T23:59:59.000Z
Quark matter is expected to be found in heavy-ion collisions on the basis of calculations in the framework of traditional, extensive thermodynamics. Recently a non-extensive generalization of thermodynamics has been emerging in theoretical research. We review here some basic concepts in statistics, kinetic theory, and thermodynamics, in particular those encountered in non-extensive thermodynamics. This offers an introduction to the theoretical basis of considering non-extensive parton kinetics for describing the hadronization of quark matter.
Unified univariate and multivariate random field theory, Keith J. Worsley
Worsley, Keith
vector deformations to warp an MRI image to an atlas standard, diffusion in several different directions ... of the F-statistic, such as Wilks's Lambda and the Lawley-Hotelling trace. There are as yet no known random field ...
Decomposition of a Nonlinear Multivariate Function using the Heaviside Step Function
Eisuke Chikayama
2014-05-21T23:59:59.000Z
Whereas the Dirac delta function introduced by P. A. M. Dirac in 1930 in his famous quantum mechanics text has been well studied, a lesser-known formula relating the delta function to the Heaviside step function in single-variable form, also given in Dirac's text, has been little studied. We demonstrate the decomposition of a nonlinear multivariate function into a sum of integrals in which each integrand is composed of a derivative of the function and a direct product of Heaviside step functions. It is an extension of Dirac's single-variable form to multiple variables. Moreover, it remains mathematically equivalent to the definition of the Dirac delta function with multiple variables, and offers a mathematically unified expression.
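For orientation, the single-variable relation in Dirac's text that the abstract alludes to can be written as follows (standard textbook form, stated here as background rather than quoted from the paper):

```latex
% Delta as the derivative of the Heaviside step H:
\delta(x) = \frac{d}{dx} H(x),
% hence, for a function f with f(-\infty) = 0,
\qquad
f(x) = \int_{-\infty}^{\infty} f'(a)\, H(x-a)\, \mathrm{d}a .
```

The paper's contribution is the multivariate analogue, with a direct product of step functions in each integrand.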
Application of multi-variable control for automatic frequency controller of HVDC transmission system
Sanpei, Masatoshi (Electric Power Development Co., Tokyo (Japan)); Kakehi, Atsuyuki; Takeda, Hideo (Toshiba Corp., Tokyo (Japan))
1994-04-01T23:59:59.000Z
In an HVDC transmission system that links two ac power systems, the automatic frequency controller (AFC) calculates power to be interchanged between the two ac systems according to their frequencies thereby improving the frequency characteristics of the two power systems. This paper introduces a newly developed dc AFC system, which applies a multi-variable control to the dc system-based frequency control. It is capable of controlling the frequencies of the two ac systems optimally while maintaining their stability. This system was developed for one of Japan's HVDC transmission facilities and produced good results in a combined test using a power system simulator. The field installation will be completed in March 1993, when the AFC system will enter service.
Multivariable Robust Control of a Simulated Hybrid Solid Oxide Fuel Cell Gas Turbine Plant
Tsai A, Banta L, Tucker D
2010-08-01T23:59:59.000Z
This work presents a systematic approach to the multivariable robust control of a hybrid fuel cell gas turbine plant. The hybrid configuration under investigation built by the National Energy Technology Laboratory comprises a physical simulation of a 300kW fuel cell coupled to a 120kW auxiliary power unit single spool gas turbine. The public facility provides for the testing and simulation of different fuel cell models that in turn help identify the key difficulties encountered in the transient operation of such systems. An empirical model of the built facility comprising a simulated fuel cell cathode volume and balance of plant components is derived via frequency response data. Through the modulation of various airflow bypass valves within the hybrid configuration, Bode plots are used to derive key input/output interactions in transfer function format. A multivariate system is then built from individual transfer functions, creating a matrix that serves as the nominal plant in an H{sub {infinity}} robust control algorithm. The controller’s main objective is to track and maintain hybrid operational constraints in the fuel cell’s cathode airflow, and the turbo machinery states of temperature and speed, under transient disturbances. This algorithm is then tested on a Simulink/MatLab platform for various perturbations of load and fuel cell heat effluence. As a complementary tool to the aforementioned empirical plant, a nonlinear analytical model faithful to the existing process and instrumentation arrangement is evaluated and designed in the Simulink environment. This parallel task intends to serve as a building block to scalable hybrid configurations that might require a more detailed nonlinear representation for a wide variety of controller schemes and hardware implementations.
Statistical Complexity of Sampled Chaotic Attractors
Luciana De Micco; Juana Graciela Fernández; Hilda Angela Larrondo; Angelo Plastino; Osvaldo Anibal Rosso
2011-05-19T23:59:59.000Z
We analyze the statistical complexity vs. entropy plane representation of sampled chaotic attractors as a function of the sampling period τ. It is shown that if the Bandt and Pompe procedure is used to assign a probability distribution function (PDF) to the pertinent time series, the statistical complexity measure (SCM) attains a definite maximum for a specific sampling period t_M. If the usual histogram approach is used instead to assign the PDF to the time series, the SCM remains almost constant at any sampling period τ. The significance of t_M is further investigated by comparing it with typical times given in the literature for the two main reconstruction processes: the Takens one in a delay-time embedding, and the exact Nyquist-Shannon reconstruction. It is shown that t_M is compatible with those times recommended as adequate delay times in Takens' reconstruction. The reported results correspond to three representative chaotic systems having correlation dimension 2 < D_2 < 3. One recent experiment confirms the analysis presented here.
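The Bandt and Pompe procedure mentioned above assigns a PDF to a time series by replacing each length-d window with the ordinal pattern (the permutation that sorts it) and counting pattern frequencies. A minimal sketch of the PDF and the normalized permutation entropy (the embedding dimension and toy series are illustrative choices, not the paper's settings):

```python
import math
from collections import Counter

def bandt_pompe_pdf(series, d=3):
    """Relative frequencies of the ordinal patterns of length d in the series."""
    counts = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        # The permutation of indices that sorts the window (ties broken by index).
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

def permutation_entropy(series, d=3):
    """Shannon entropy of the ordinal-pattern PDF, normalized to [0, 1]."""
    pdf = bandt_pompe_pdf(series, d)
    h = -sum(p * math.log(p) for p in pdf.values())
    return h / math.log(math.factorial(d))

# A monotone series realizes a single ordinal pattern, so its entropy is zero.
h_mono = permutation_entropy(list(range(50)))
```

The statistical complexity measure in the abstract is then built from this same ordinal-pattern PDF, combining the entropy with a disequilibrium term.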
Chapter 11. Community analysis-based methods
Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.
2010-05-01T23:59:59.000Z
Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
Visual servoing using statistical pressure snakes.
Schaub, Hanspeter (ORION International Technologies, Albuquerque, NM)
2004-05-01T23:59:59.000Z
A nonlinear visual servoing steering law is presented which is used to align a camera view with a visual target. A full color version of statistical pressure snakes is used to identify and track the target with a series of video frames. The nonlinear steering law provides camera-frame centric speed commands to a velocity based servo sub-system. To avoid saturating the subsystem, the commanded speeds are smoothly limited to remain within a finite range. Analytical error analysis is also provided illustrating how the two control gains contribute to the stiffness of the control. The algorithm is demonstrated on a pan and tilt camera system. The control law is able to smoothly realign the camera to point at the target.
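The smooth speed limiting described above can be realized with any saturating map; a tanh-based limiter is one common choice. The sketch below is illustrative only; the paper does not specify this particular function.

```python
import math

def smooth_limit(cmd, v_max):
    """Smoothly map an unbounded commanded speed into (-v_max, v_max).

    Near zero the map has unit slope, so small commands pass through
    almost unchanged; large commands saturate gradually rather than
    being clipped abruptly, which avoids saturating the velocity-based
    servo subsystem.
    """
    return v_max * math.tanh(cmd / v_max)

small = smooth_limit(0.1, 2.0)  # essentially unchanged
large = smooth_limit(5.0, 2.0)  # pushed close to, but below, the 2.0 limit
```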
Statistics and Discoveries at the LHC (2/4)
None
2011-10-06T23:59:59.000Z
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
None
2011-10-06T23:59:59.000Z
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (3/4)
None
2011-10-06T23:59:59.000Z
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (1/4)
None
2011-10-06T23:59:59.000Z
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and treatment of systematic uncertainties.
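As an example of the p-value machinery these lectures cover, the significance of a counting excess over a known background reduces to a Poisson tail probability converted to a Gaussian Z-value. This is a standard textbook calculation; the background and observed counts below are invented for illustration.

```python
import math
from statistics import NormalDist

def poisson_tail(n, mu):
    """p-value P(X >= n) for X ~ Poisson(mu), via the complement of the CDF."""
    cdf = sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n))
    return 1.0 - cdf

b = 3.2      # expected background events (illustrative)
n_obs = 10   # observed events (illustrative)

p = poisson_tail(n_obs, b)         # probability of at least so large an excess
z = NormalDist().inv_cdf(1.0 - p)  # one-sided Gaussian significance
```

The look-elsewhere effect mentioned in the abstract inflates such a local p-value when the search scans many signal hypotheses.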
Kenkel, Norm
12-14 June 2000. International Journal of Remote Sensing, ISSN 0143-1161 print / ISSN 1366-5901 online, 2002, vol. 23, no. 21, 4761-4776. A multivariate approach to vegetation ... the likelihood of errors in classification caused by overlap between classes. 1. Introduction. Remotely sensed data
Barrash, Warren
of groundwater contamination, and understanding petrophysical relations or multivariate associations. We examine ... and remediation of groundwater contamination. Also, knowledge of the distribution of K along with other physical ... (on reconstructed samples) and empirical estimates based on samples from quarry and outcrop exposures [e.g., Jussel ...]
adult epileptic patients: Topics by E-print Network
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Chung, Moo K. Testing statistical significance of multivariate time series analysis techniques for epileptic seizure prediction. Physics Websites. Summary: Epilepsy...
Paris-Sud XI, UniversitÃ© de
Hydrology and Earth System Sciences, 6(4), 641-654 (2002) © EGS. Multivariate synthetic ... associated with hydrological processes, making it valuable as a practical tool for synthetic generation ... backpropagation, hydrological scenario generation, multivariate time-series. Introduction: It has been almost four
The Role of Technology in Improving Student Learning of Statistics
Chance, Beth; Ben-Zvi, Dani; Garfield, Joan; Medina, Elsa
2007-01-01T23:59:59.000Z
Voorburg, The Netherlands: International Statistical Institute (pp. 1-14).
Biomarker Detection in Whole Slide Imaging based on Statistical Color Models
Aickelin, Uwe
Biomarker Detection in Whole Slide Imaging based on Statistical Color Models. Jie Shu, Guoping Qiu ... slides. We treat immunostaining detection as a color image analysis problem and build statistical color ... slides of oesophagitis and colorectal biopsies. We present experimental results and show ...
Statistical fracture modeling: crack path and fracture criteria with application to homogeneous and
Ritchie, Robert
Statistical fracture modeling: crack path and fracture criteria with application to homogeneous ...; accepted 23 January 2002. Abstract: Analysis has been performed on fracture initiation near a crack in a brittle material with strength described by Weibull statistics. This nonlocal fracture model allows ...
STATISTICS OF EXTREMES IN CLIMATOLOGY AND HYDROLOGY PART I: BACKGROUND AND TRADITIONAL APPROACHES
Katz, Richard
STATISTICS OF EXTREMES IN CLIMATOLOGY AND HYDROLOGY. PART I: BACKGROUND AND TRADITIONAL APPROACHES. Outline: (1) Historical Perspective; (2) Basic Characteristics of Climate/Hydrologic Extremes; (3) Traditional Statistical Analysis of Climate/Hydrologic Extremes; (4) Spatial/Temporal Dependence of Climate
A new development cycle of the Statistical Toolkit
Batic, M; Pfeiffer, A; Pia, M G; Ribon, A
2012-01-01T23:59:59.000Z
The Statistical Toolkit is an open source system specialized in the statistical comparison of distributions. It addresses requirements common to different experimental domains, such as simulation validation (e.g. comparison of experimental and simulated distributions), regression testing in the course of the software development process, and detector performance monitoring. Various sets of statistical tests have been added to the existing collection to deal with the one sample problem (i.e. the comparison of a data distribution to a function, including tests for normality, categorical analysis and the estimate of randomness). Improved algorithms and software design contribute to the robustness of the results. A simple user layer dealing with primitive data types facilitates the use of the toolkit both in standalone analyses and in large scale experiments.
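A core operation in such a toolkit, the two-sample comparison of distributions, can be sketched via the Kolmogorov-Smirnov statistic: the maximum distance between the two empirical CDFs. This is a minimal pure-Python sketch of the statistic only, not the Toolkit's implementation, and it omits the p-value computation.

```python
def ks_statistic(sample1, sample2):
    """Maximum vertical distance between the two empirical CDFs."""
    n1, n2 = len(sample1), len(sample2)
    s1, s2 = sorted(sample1), sorted(sample2)
    d = 0.0
    for x in sorted(set(s1) | set(s2)):
        cdf1 = sum(1 for v in s1 if v <= x) / n1
        cdf2 = sum(1 for v in s2 if v <= x) / n2
        d = max(d, abs(cdf1 - cdf2))
    return d

d_same = ks_statistic([1, 2, 3, 4], [1, 2, 3, 4])  # identical samples
d_far = ks_statistic([1, 2, 3], [10, 11, 12])      # fully separated samples
```

In simulation-validation use, the statistic would be fed into the appropriate KS distribution to obtain a compatibility p-value between experimental and simulated distributions.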
Kansas Statistical Abstract 2005 (40th Edition)
Policy Research Institute
2008-10-21T23:59:59.000Z
The Kansas Statistical Abstract contains state, county, and city-level data for Kansas on population, vital statistics and health, housing, elections, education, business and manufacturing, exports, employment, income, ...
Kansas Statistical Abstract 2001 (36th Edition)
2002-12-01T23:59:59.000Z
The Kansas Statistical Abstract contains state, county, and city-level data for Kansas on population, vital statistics and health, housing, elections, education, business and manufacturing, exports, employment, income, ...
Kansas Statistical Abstract 2004 (39th Edition)
Policy Research Institute
2006-01-31T23:59:59.000Z
The Kansas Statistical Abstract contains state, county, and city-level data for Kansas on population, vital statistics and health, housing, elections, education, business and manufacturing, exports, employment, income, ...
Kansas Statistical Abstract 2003 (38th Edition)
Policy Research Institute
2004-09-01T23:59:59.000Z
The Kansas Statistical Abstract contains state, county, and city-level data for Kansas on population, vital statistics and health, housing, elections, education, business and manufacturing, exports, employment, income, ...
Kansas Statistical Abstract 2012 (47th Edition)
Institute for Policy & Social Research
2014-05-27T23:59:59.000Z
The Kansas Statistical Abstract 2012, contains the latest available state, county, and city-level data for Kansas on population, vital statistics and health, housing, education, business and manufacturing, exports, employment, ...
Kansas Statistical Abstract 2002 (37th Edition)
2003-09-01T23:59:59.000Z
The Kansas Statistical Abstract contains state, county, and city-level data for Kansas on population, vital statistics and health, housing, elections, education, business and manufacturing, exports, employment, income, ...
Research Focus: Statistical decision theory and evolution
Maloney, Laurence T.
Research Focus: Statistical decision theory and evolution. Laurence T. Maloney, Department ... Recent articles by Geisler and Diehl use Bayesian statistical decision theory to model the co... an advantage that ultimately translates into 'reproductive success'. The balance between predator and prey ...
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01T23:59:59.000Z
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns?
How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?
The Statistics of Crumpled Paper
Eric Sultan; Arezki Boudaoud
2005-09-06T23:59:59.000Z
A statistical study of crumpled paper is enabled by a minimal 1D model: a self-avoiding line bent at sharp angles -- in which resides the elastic energy -- placed in a confining potential. Many independent equilibrium configurations are generated numerically and their properties are investigated. At small confinement, the distribution of segment lengths is log-normal, in agreement with previous predictions and experiments. At high confinement, the system approaches a jammed state with critical behavior, whereas the length distribution follows a Gamma law whose parameter is predicted as a function of the number of layers in the system.
On statistics of molecular chaos
Yuriy Kuzovlev
2009-11-03T23:59:59.000Z
It is shown that the BBGKY equations for a particle interacting with an ideal gas imply exact relations between the probability distribution of the path of the particle, its derivatives with respect to the gas density, and irreducible many-particle correlations of gas atoms with the path. These relations make visible that correlations of any order always contribute significantly to the evolution of the path distribution, so that the exact statistical-mechanics theory does not reduce to classical kinetics even in the low-density (or Boltzmann-Grad) limit.
Does interactivity improve exploratory data analysis of animated trend visualization?
Shaw, Chris
In particular, animation has become a popular method for visualizing trends in multivariate information spaces. ... a technique for exploratory data analysis of large data sets. We compared interactive animations with non-interactive (passive) ...
Functional Data Analysis With Multi Layer Perceptrons
Rossi, Fabrice
We introduce a computation model for functional input data and we show that this model is well ... Functional data cannot be treated as classical multivariate data, because they are in general described by a finite set of input ...
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Swan II, J. Edward
Chad A. Steed (Mississippi State University, Stennis Space Center, MS 39529; Department of Computer Science and Engineering ...). The current work features an expanded version of MDX that builds on recent ... multivariate visual analysis.
Statistical Estimation of Two-Body Hydrodynamic Properties Using System Identification
Xie, Chen
2010-01-14T23:59:59.000Z
... information concerning the response characteristics of such systems. The current study demonstrates that the analysis of these data using a combination of statistical tools and system identification techniques can efficiently recover the main hydrodynamic...
An estimator for statistical anisotropy from the CMB bispectrum
Bartolo, N.; Dimastrogiovanni, E.; Matarrese, S. [Dipartimento di Fisica ''G. Galilei'', Università degli Studi di Padova, via Marzolo 8, 35131 Padova (Italy); Liguori, M. [Institut d'Astrophysique de Paris, UMR-7095 du CNRS, Université Pierre et Marie Curie, 98 bis bd Arago, 75014 Paris (France); Riotto, A., E-mail: nicola.bartolo@pd.infn.it, E-mail: dimastro@pd.infn.it, E-mail: liguori@iap.fr, E-mail: sabino.matarrese@pd.infn.it, E-mail: riotto@mail.cern.ch [INFN — Sezione di Padova, via Marzolo 8, 35131 Padova (Italy)
2012-01-01T23:59:59.000Z
Various data analyses of the Cosmic Microwave Background (CMB) provide observational hints of statistical isotropy breaking. Some of these features can be studied within the framework of primordial vector fields in inflationary theories, which generally display some level of statistical anisotropy both in the power spectrum and in higher-order correlation functions. Motivated by these observations and the recent theoretical developments in the study of primordial vector fields, we develop the formalism necessary to extract statistical anisotropy information from the three-point function of the CMB temperature anisotropy. We employ a simplified vector field model and parametrize the bispectrum of curvature fluctuations in such a way that all the information about statistical anisotropy is encoded in some parameters λ{sub LM} (which measure the ratio of the anisotropic to the isotropic bispectrum amplitudes). For such a template bispectrum, we compute an optimal estimator for λ{sub LM} and the expected signal-to-noise ratio. We estimate that, for f{sub NL} ∼ 30, an experiment like Planck can be sensitive to a ratio of the anisotropic to the isotropic amplitudes of the bispectrum as small as 10%. Our results are complementary to the information coming from a power spectrum analysis and particularly relevant for those models where statistical anisotropy turns out to be suppressed in the power spectrum but not negligible in the bispectrum.
Electric field statistics in MHD turbulence
Low, Robert
Slides by Bernard Knaepen, Nicolas Denewet & Daniele Carati (ULB). Outline: electric fields in MHD; particle acceleration; statistics of the electric and magnetic fields.
Writing statistics project reports
Painter, Kevin
S. Zachary, October 8, 2001. This is a draft version ... The need for such additional guidance is apparent from the thousands of reports we have marked. ... is scientifically and statistically literate; therefore you do not need to describe standard statistical procedures.
Statistical Physics of RNA folding Ralf Bundschuh
Bundschuh, Ralf
Lecture slides by Ralf Bundschuh (Ohio State University), August 15, 2007, 119 slides. Part I: Introduction to RNA biology.
Evaluating statistic appropriateness for Bayesian model choice
Université Paris-Sud XI
Jean-Michel Marin (I3M, UMR CNRS 5149, Université Montpellier 2, France); Natesh S. Pillai (Department of Statistics, Harvard University, Cambridge); ... Rousseau (ENSAE and CREST, Paris, France). Summary: The choice of the summary statistics in Bayesian ...
Statistical mechanics and ocean circulation Rick Salmon
Salmon, Rick
Rick Salmon (Scripps Institution of Oceanography, UCSD). ... equilibrium statistical mechanics based upon the conservation of energy and potential enstrophy ... The equilibrium state resembles the buoyancy structure actually observed. Key words: statistical mechanics, ocean ...
STATISTICS COLLOQUIUM MONDAY, SEPTEMBER 12, 2011
Schrag, Daniel
Talk: 4:00 PM, Science Center Rm. 309; reception ... Speaker: ... Dasgupta, Statistics Department, Harvard University. Abstract: A framework for causal inference from two ... experiments. The framework allows for statistical inference from a finite population, permits definition ...
Linearity - statistics 1.1B training
Slide fragments: IPAT 1.1B training vs. 300M training; D0 resolution is evaluated using 100k. Black is a bank with 3.5 times as much statistics; there may be a hint of slight improvement. Narrow beam + high statistics; IPAT default FTK constants vs. narrow-beam constants; reconstruction performance.
Statistics and Causal Inference PAUL W. HOLLAND*
Fitelson, Branden
Problems involving causal inference have dogged ... conclusions drawn from a carefully designed experiment are often valid. What can a statistical model say about ... in the most unexpected places, for example: "If the statistics cannot relate cause and effect, they can ..."
Statistics Department University of California, Berkeley
California at Santa Cruz, University of
John Rice (Statistics Department, University of California, Berkeley), joint work with Peter Bickel. ... no matter how rich the dictionary from which you adaptively compose a detection statistic, no matter how ... to be a [closet] Bayesian and choose directions a priori. Reference: Lehmann & Romano, Testing Statistical Hypotheses.
STATISTICS COLLOQUIUM MONDAY, MARCH 24, 2014
Schrag, Daniel
Talk: 4:15 PM, Science Center Rm. 705; reception: 3:... Speaker from the Department of Statistics, University of California, Berkeley. Abstract: In a 2009 PNAS article, based on work ... at appropriate speeds. We discuss these results and argue that while they are hard to interpret statistically ...
Statistics beyond Physics - Misused in Public?
Kobe, Sigismund
S. Kobe, Institut für Theoretische Physik, Technische ... "There are three kinds of lies: lies, damned lies, and statistics," attributed by Mark Twain to the 19th century ... "... gefälscht habe. (Do not trust any statistics you did not fake yourself.)" attributed by ... to the 20th ...
Department of Statistics University of California, Berkeley
El Karoui, Noureddine
367A Evans Hall; 510-642-1430; 510-643-6131; cco@stat.berkeley.edu. Undergraduate major website: http://www.stat.berkeley.edu/?id ... (evaluated by assist.org or the Math Department) are acceptable. Core Statistics Courses (2 courses): Stat 134 (or 101) Concepts of Probability; Stat 135 (or 102) Concepts of Statistics. Statistics Electives (3 ...
FISHERY STATISTICS OF THE UNITED STATES
Fishery Statistics of the United States, 1946, by A. W. Anderson and E. A. Power, United States Government. Contents (excerpt): ... City; Section 4 - Chesapeake Fisheries; Sectional Summaries; Maryland; Virginia.
Use of multivariate calibration for plutonium quantitation by the Pu(III) spectrophotometric method
Wangen, L.E.; Phillips, M.V.; Walker, L.F.
1988-05-01T23:59:59.000Z
Two new multivariate calibration methods that use all of the relevant spectral information are applied to the determination of plutonium. The analyte response signal originates from the absorbance spectrum of Pu(III) from 500 to 900 nm. Partial least squares (PLS) regression gives an average absolute error of 0.114 ± 0.108 mg when predicting the plutonium content of standards containing 65 to 90 mg total plutonium. PLS uses all of the signal in the spectrum and is a more robust calibration procedure than a method based on absorbances at five wavelengths. Another calibration procedure, least squares curve fitting (LSCF), fits either the entire spectrum or individual spectral intervals derived from standards to the spectra of unknowns; in addition, an arbitrary linear base line can be included. The best LSCF option for the same calibration and test set as used for PLS was the full spectrum (522 to 900 nm) with the linear base-line option. The average absolute error when predicting with LSCF was 0.130 ± 0.092 mg plutonium. LSCF has an advantage over PLS in that the linear base line can account for certain types of interferences that have been observed for this plutonium assay procedure; an example is given. 6 refs., 3 figs., 5 tabs.
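The full-spectrum-plus-linear-baseline idea behind LSCF can be sketched in a few lines. The pure-component band shape, baseline, and noise level below are synthetic assumptions, not the paper's Pu(III) data; the point is only that fitting the whole spectrum with baseline terms recovers the analyte amount by ordinary least squares.

```python
import numpy as np

# Hypothetical full-spectrum calibration with a linear base line (LSCF-style).
rng = np.random.default_rng(0)
wl = np.linspace(500.0, 900.0, 401)            # wavelength grid, nm
pure = np.exp(-((wl - 700.0) / 60.0) ** 2)     # assumed unit-amount band shape

c_true = 80.0                                  # mg of analyte in the "unknown"
spectrum = (c_true * pure + 0.002 * (wl - 500.0) + 0.5
            + rng.normal(0.0, 0.05, wl.size))  # band + linear baseline + noise

# Design matrix: analyte spectrum plus intercept and slope of the base line
design = np.column_stack([pure, np.ones_like(wl), wl])
coef, *_ = np.linalg.lstsq(design, spectrum, rcond=None)
c_hat = coef[0]                                # recovered analyte amount
```

Because every wavelength contributes to the fit, the estimate is far less sensitive to noise at any single wavelength than a five-point calibration would be.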
One multivariable controller increased capacity of an Oleflex{trademark}/MTBE complex
Robertson, D.; Peterson, T.J.; O`Connor, D. [Dynamic Matrix Control Corp., Houston, TX (United States); Adams, V.; Payne, D. [Valero Refining Co., Corpus Christi, TX (United States)
1996-12-01T23:59:59.000Z
Capacity increased by more than 4.6% when one dynamic matrix controller began operating in Valero Refining Company's MTBE production complex in Corpus Christi, Texas, on a plant already running well above design capacity thanks to earlier process changes. A single controller was developed to cover an Oleflex{trademark} isobutane dehydrogenation unit and an MTBE reaction and fractionation plant, together with the intermediate isobutylene surge drum. The overall benefit is realized by a comprehensive constrained multivariable predictive controller that properly handles all sets of limits experienced by the complex, whether it is limited by the front-end Oleflex{trademark} unit or the back-end MTBE unit. The controller has 20 manipulated, 6 disturbance, and 44 controlled variables, and covers widely varying dynamics with settling times ranging from twenty minutes to six hours. It executes each minute with a six-hour time horizon. A unique achievement is the controller's intelligent handling of the surge drum level, which raises the average daily capacity of the complex as a whole. The Oleflex{trademark} unit often operates at simultaneous limits on reactor effluent compressor capacity, cold-box temperature, and hydrogen/hydrocarbon ratio, and the MTBE unit at limits on impurity in the butene column overhead as well as impurity in the MTBE product.
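The step-response machinery behind dynamic matrix control can be sketched for a single loop. The first-order plant model, horizons, and unit setpoint below are illustrative assumptions; the actual Valero controller is multivariable and constrained, which this toy does not attempt to reproduce.

```python
import numpy as np

# Minimal single-loop dynamic-matrix-control sketch (assumed plant model).
a = np.array([1.0 - np.exp(-t / 5.0) for t in range(1, 31)])  # step response

P, M = 30, 5                       # prediction horizon and number of moves
A = np.zeros((P, M))               # dynamic matrix built from the step response
for i in range(P):
    for j in range(M):
        if i >= j:
            A[i, j] = a[i - j]

setpoint = np.ones(P)              # unit setpoint change
free_response = np.zeros(P)        # plant initially at rest, no disturbance
error = setpoint - free_response
moves, *_ = np.linalg.lstsq(A, error, rcond=None)  # least-squares move plan

# In a real DMC loop only moves[0] is implemented each execution, then the
# plan is recomputed with fresh measurements (receding horizon).
predicted = free_response + A @ moves
```

The least-squares move plan is the unconstrained core of DMC; an industrial controller adds constraints on inputs, outputs, and move sizes on top of this calculation.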
STATISTICS and PROBABILITY Statistics is the science and practice of developing
Bolch, Tobias
Definition: Statistics is the science and practice of developing human knowledge through the use of empirical data expressed in quantitative form. It is based on statistical theory, which is generally regarded as a branch of applied mathematics. Within statistical theory, randomness ...
Statistics Department Code, revised January 2003
Code of the Department of Statistics, Colorado ... and primary appointment outside of Statistics [see (q)]; "regular faculty" comprises those faculty holding ... in Statistics (as defined by the Academic Faculty and Administrative Professional Manual). A quorum for any ...
Ravikumar, B.
Sonoma State University 2012-2013 Catalog. Department of Mathematics and Statistics, Darwin Hall 114; phone: (707) 664-2368; fax: (707) 664-3535; www.sonoma.edu/math. Department Chair: Sam Brannen. Statistics Program Advisors: Susan Herring, Elaine McDonald-Newman, Scott Nickleach. Administrative ...
Ravikumar, B.
Sonoma State University 2014-2015 Catalog. Department of Mathematics and Statistics, Darwin Hall 114; phone: (707) 664-2368; fax: (707) 664-3535; www.sonoma.edu/math. Department Chair: Brigitte Lahme. Statistics Program Advisors: Susan Herring, Elaine Newman. Administrative Coordinator: Marybeth ...
A Flexible Approach for the Statistical Visualization of Ensemble Data
Potter, K; Wilson, A; Bremer, P; Williams, D; Pascucci, V; Johnson, C
2009-09-29T23:59:59.000Z
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
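The overview and uncertainty displays described above ultimately rest on simple per-grid-point summaries across ensemble members. The sketch below computes those summaries on synthetic data; the member count, field shape, and noise level are assumptions, not the paper's climate or meteorology datasets.

```python
import numpy as np

# Per-grid-point ensemble summaries of the kind statistical displays build on.
rng = np.random.default_rng(5)
members, ny, nx = 20, 8, 8
truth = np.sin(np.linspace(0.0, np.pi, ny))[:, None] * np.ones(nx)
ensemble = truth + rng.normal(0.0, 0.1, (members, ny, nx))  # perturbed members

mean_field = ensemble.mean(axis=0)  # overview display: central tendency
std_field = ensemble.std(axis=0)    # uncertainty display: member spread
```

Linked views would then render `mean_field` and `std_field` side by side, with interaction (brushing a region, selecting members) driving recomputation of both.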
Statistical constraints on binary black hole inspiral dynamics
Chad R. Galley; Frank Herrmann; John Silberholz; Manuel Tiglio; Gustavo Guerberoff
2010-05-30T23:59:59.000Z
We perform a statistical analysis of the binary black hole problem in the post-Newtonian approximation by systematically sampling and evolving the parameter space of initial configurations for quasi-circular inspirals. Through a principal component analysis of spin and orbital angular momentum variables we systematically look for uncorrelated quantities, and we find three that are highly conserved in a statistical sense, both as functions of time and with respect to variations in initial spin orientations. We also identify the variables that account for the largest variations in the problem. We present binary black hole simulations of the full Einstein equations, analyzing to what extent these results might carry over to the full theory in the inspiral and merger regimes. Among other applications, these results should be useful in both semi-analytical and numerical construction of gravitational-wave templates for gravitational-wave detectors.
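The PCA step can be illustrated directly: principal components with near-zero variance pick out combinations of variables that are conserved across the sample. The data below are synthetic (not post-Newtonian trajectories); one variable is constructed to be almost determined by the other two, so one conserved combination exists by design.

```python
import numpy as np

# PCA via SVD: the smallest principal variance flags a conserved combination.
rng = np.random.default_rng(3)
x = rng.normal(size=(1000, 2))
# Third variable almost fixed by the first two -> one conserved combination
data = np.column_stack([x, x[:, 0] + x[:, 1] + rng.normal(0.0, 1e-3, 1000)])

centered = data - data.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
variances = s**2 / (len(data) - 1)   # principal variances, descending
conserved_direction = vt[-1]         # combination with the least variation
```

Here `conserved_direction` recovers (up to sign) the combination x1 + x2 - x3, whose variance is tiny; the same diagnostic applied to spin and angular momentum samples identifies statistically conserved quantities.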
Statistical Performance Modeling of SRAMs
Zhao, Chang
2011-02-22T23:59:59.000Z
Yield analysis is a critical step in memory design, given the variety of performance constraints. Traditional circuit-level Monte-Carlo simulations for yield estimation of Static Random Access Memory (SRAM) cells are quite time-consuming due...
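The Monte-Carlo yield estimate the abstract refers to amounts to sampling device variation and counting the fraction of trials that meet spec. The read-margin expression and variation sigmas below are assumptions for illustration, not an actual SRAM cell model.

```python
import random

# Illustrative circuit-level Monte-Carlo yield estimate (toy margin model).
random.seed(7)
N = 100_000
spec = 0.30                    # hypothetical minimum read margin, volts

def read_margin():
    # Toy model: nominal 0.40 V margin shifted by the mismatch between two
    # transistor threshold-voltage variations (sigma = 30 mV each).
    dvt1 = random.gauss(0.0, 0.03)
    dvt2 = random.gauss(0.0, 0.03)
    return 0.40 + dvt1 - dvt2

failures = sum(read_margin() < spec for _ in range(N))
yield_est = 1.0 - failures / N
```

The cost the abstract highlights is visible even here: resolving rare failures to tight confidence requires very many samples, which motivates the statistical performance-modeling alternatives the thesis studies.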
Statistical Mechanics of Resource Allocation
Inoue, Jun-ichi
2014-01-01T23:59:59.000Z
We provide a mathematical model to investigate the resource allocation problem for agents, say, university graduates looking for positions in labor markets. The basic model is described by the so-called Potts spin glass, which is well known in statistical physics. In the model, each Potts spin (a tiny magnet at the atomic length scale) represents the action of a student, and it takes a discrete value corresponding to the company he or she applies to. We construct the energy to include three distinct effects on the students' behavior: a collective effect, market history, and the international ranking of companies. In this model system, the correlations between students (the adjacency matrix) are taken into account through pairwise spin-spin interactions. We carry out computer simulations to examine the efficiency of the model. We also show that a chiral representation of the Potts spin enables us to obtain analytical insights into our labor markets.
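A toy version of the Potts-spin picture can be simulated in a few lines. The sizes, the ring geometry, the single pairwise "collective" term, and the greedy dynamics below are all illustrative assumptions, not the paper's energy function or its analysis: each of N students picks one of Q companies, adjacent students applying to the same company pay a unit penalty, and greedy Monte-Carlo updates drive the total energy down.

```python
import random

# Greedy single-spin dynamics on a toy antiferromagnetic Potts ring.
random.seed(11)
N, Q = 60, 4
spins = [random.randrange(Q) for _ in range(N)]

def energy(s):
    # Unit penalty for every adjacent pair applying to the same company.
    return sum(s[i] == s[(i + 1) % N] for i in range(N))

e0 = energy(spins)            # energy of the random initial assignment
e = e0
for _ in range(20_000):
    i = random.randrange(N)
    old = spins[i]
    spins[i] = random.randrange(Q)
    e_new = energy(spins)
    if e_new <= e:
        e = e_new             # accept moves that do not raise the energy
    else:
        spins[i] = old        # reject uphill moves

final_energy = energy(spins)
```

With Q = 4 colors on a ring, every conflict can be resolved locally, so the greedy dynamics reaches a zero-energy (fully "diversified") assignment; the paper's replica-style analysis studies far richer energies than this sketch.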